Audio/Video Development: Streaming Media Transport with RTP (Part 3)


Articles in this iOS audio/video development series:

Audio/Video Development: Overview (Part 1)

Audio/Video Development: Streaming Media Transport with RTSP (Part 2)

Audio/Video Development: Streaming Media Transport with RTP (Part 3)

Audio/Video Development: Decoding with FFmpeg (Part 4)


The Real-time Transport Protocol (RTP) is a network transport protocol for delivering audio and video over IP networks.

For a detailed introduction to the RTP protocol, see this article: http://www.cnblogs.com/qingquan/archive/2011/07/28/2120440.html

RTP packets are carried over UDP. As in the previous part, this example uses the CocoaAsyncSocket library to handle the UDP side.
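Before looking at the receiver code, it helps to see the fixed 12-byte RTP header that every packet starts with. The sketch below is a minimal C parser for it; the struct and function names (`RTPHeader`, `parse_rtp_header`) are illustrative choices of mine, not identifiers from this article's project.

```c
#include <stdint.h>

/* Fields of the fixed 12-byte RTP header (RFC 3550, section 5.1).
 * All multi-byte fields are big-endian on the wire. */
typedef struct {
    uint8_t  version;         /* 2 bits: always 2 for current RTP      */
    uint8_t  padding;         /* 1 bit                                  */
    uint8_t  extension;       /* 1 bit                                  */
    uint8_t  csrcCount;       /* 4 bits: number of CSRC entries         */
    uint8_t  marker;          /* 1 bit                                  */
    uint8_t  payloadType;     /* 7 bits: 96 is a common dynamic H.264 PT */
    uint16_t sequenceNumber;
    uint32_t timestamp;
    uint32_t ssrc;
} RTPHeader;

static RTPHeader parse_rtp_header(const uint8_t *p) {
    RTPHeader h;
    h.version        = (p[0] & 0xc0) >> 6;
    h.padding        = (p[0] & 0x20) >> 5;
    h.extension      = (p[0] & 0x10) >> 4;
    h.csrcCount      =  p[0] & 0x0f;
    h.marker         = (p[1] & 0x80) >> 7;
    h.payloadType    =  p[1] & 0x7f;
    h.sequenceNumber = (uint16_t)((p[2] << 8) | p[3]);
    h.timestamp      = ((uint32_t)p[4] << 24) | ((uint32_t)p[5] << 16)
                     | ((uint32_t)p[6] << 8)  |  (uint32_t)p[7];
    h.ssrc           = ((uint32_t)p[8] << 24) | ((uint32_t)p[9] << 16)
                     | ((uint32_t)p[10] << 8) |  (uint32_t)p[11];
    return h;
}
```

The payload (for H.264, a NAL unit or a fragment of one) begins right after this header plus any CSRC entries.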

1. Setting up the UDP listener

#import "RTPReceiver.h"
#import "CocoaAsyncSocket/GCD/GCDAsyncUdpSocket.h"
#import "RTPPacket.h"

@interface RTPReceiver() <GCDAsyncUdpSocketDelegate, RTPPacketDelegate> {
    int                 _rtpPort;
    dispatch_queue_t    _rtpQueue;
    GCDAsyncUdpSocket  *_rtpSocket;
    RTPPacket          *_rtpPacket;
}
@end

@implementation RTPReceiver

- (instancetype)initWithPort:(int)port {
    if (self = [super init]) {   // assignment, not comparison
        _rtpPort = port;

        _rtpQueue  = dispatch_queue_create("rtpSocketQueue", NULL);
        _rtpSocket = [[GCDAsyncUdpSocket alloc] initWithDelegate:self delegateQueue:_rtpQueue];

        NSError *error;
        BOOL bound = [_rtpSocket bindToPort:_rtpPort error:&error];
        if (!bound) {
            NSLog(@"ERROR!!! bind UDP port: %@", error.localizedDescription);
        }

        _rtpPacket = [[RTPPacket alloc] init];
        _rtpPacket.delegate = self;
    }
    return self;
}

- (void)startReceive {
    NSError *error;
    [_rtpSocket beginReceiving:&error];
    if (error) {
        NSLog(@"ERROR!!! receive RTP: %@", error.localizedDescription);
    }
}

#pragma mark GCDAsyncUdpSocket Delegate

- (void)udpSocket:(GCDAsyncUdpSocket *)sock didReceiveData:(NSData *)data fromAddress:(NSData *)address withFilterContext:(id)filterContext {
    [_rtpPacket addNalu:data];
}

#pragma mark RTP Packet Delegate

- (void)DidPacketFrame:(uint8_t *)frame size:(int)size sequence:(int)sequ {
    // Copy the frame before hopping queues: the caller frees the buffer
    // as soon as this method returns.
    NSData *frameData = [NSData dataWithBytes:frame length:size];
    dispatch_async(dispatch_get_main_queue(), ^{
        NSDictionary *dict = @{@"data": frameData,
                               @"size": @(size)};
        [[NSNotificationCenter defaultCenter] postNotificationName:@"client" object:dict];
    });
}

@end

2. Parsing received RTP packets and reassembling them into frames

- (void)addNalu:(NSData *)rtpData {
    bzero(&rtpHeader, sizeof(rtpHeader));

    uint8_t *dataByte = (uint8_t *)[rtpData bytes];

    rtpHeader.version   = (dataByte[0] & 0xc0) >> 6;
    rtpHeader.padding   = ((dataByte[0] & 0x20) >> 5) == 1;   // parenthesize: >> binds tighter than &
    rtpHeader.extension = ((dataByte[0] & 0x10) >> 4) == 1;
    rtpHeader.payloadType    = dataByte[1] & 0x7f;
    rtpHeader.sequenceNumber = twoByte(dataByte + 2);
    rtpHeader.timeStamp      = fourByte(dataByte + 4);

    [self loadNalu:rtpData];
}

- (void)loadNalu:(NSData *)rtpData {
    char naluHeader[2];
    [rtpData getBytes:naluHeader range:NSMakeRange(RTPHeaderSize, 2)];
    int naluType = naluHeader[0] & 0x1f;   // NAL unit type; 28 marks an FU-A fragment
    switch (naluType) {
        case 7:   // SPS: a new sequence starts, drop any stale slices
            [sliceArray removeAllObjects];
            // fall through: SPS and PPS are both stored as whole NAL units
        case 8:   // PPS
        {
            NSData *subData = [rtpData subdataWithRange:NSMakeRange(RTPHeaderSize, rtpData.length - RTPHeaderSize)];
            NalUnit *unit = [[NalUnit alloc] initWithData:subData size:rtpData.length - RTPHeaderSize sequence:rtpHeader.sequenceNumber];
            [sliceArray addObject:unit];
        }
            break;
        case 28:  // FU-A: one fragment of a larger NAL unit
        {
            int frameType = naluHeader[1] & 0x1f;    // original NAL type from the FU header
            int ser = (naluHeader[1] & 0xe0) >> 5;   // S(start) / E(end) / R(reserved) bits
            int frameLength = (int)rtpData.length - RTPHeaderSize - 2;
            if (frameType == 5) {        // IDR (I-frame) slice
                NSData *subData = [rtpData subdataWithRange:NSMakeRange(RTPHeaderSize + 2, frameLength)];
                NalUnit *unit = [[NalUnit alloc] initWithData:subData size:frameLength sequence:rtpHeader.sequenceNumber];
                [sliceArray addObject:unit];
                if (ser == 2) {   // 010: last fragment
                    // assemble the collected fragments into a frame and call back
                    [self packetAndSendIFrame];
                }
            } else if (frameType == 1) { // non-IDR (P-frame) slice
                if (ser == 4) {   // 100: first fragment
                    [sliceArray removeAllObjects];
                }
                NSData *subData = [rtpData subdataWithRange:NSMakeRange(RTPHeaderSize + 2, frameLength)];
                NalUnit *unit = [[NalUnit alloc] initWithData:subData size:frameLength sequence:rtpHeader.sequenceNumber];
                [sliceArray addObject:unit];
                if (ser == 2) {   // 010: last fragment
                    [self packetAndSendPFrame];
                }
            }
        }
            break;
        case 1:   // non-IDR NAL unit carried whole in a single packet
            if (self.delegate && [self.delegate respondsToSelector:@selector(DidPacketFrame:size:sequence:)]) {
                int frameLength = (int)rtpData.length - RTPHeaderSize + 4;
                uint8_t *buf = (uint8_t *)malloc(frameLength);
                memcpy(buf, startCode, 4);   // prepend the 00 00 00 01 Annex B start code
                NSData *fData = [rtpData subdataWithRange:NSMakeRange(RTPHeaderSize, frameLength - 4)];
                memcpy(buf + 4, [fData bytes], frameLength - 4);
                [self.delegate DidPacketFrame:buf size:frameLength sequence:rtpHeader.sequenceNumber];
                free(buf);
                buf = NULL;
            }
            break;
    }
}
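A note on the FU-A branch above: the two bytes following the RTP header are the FU indicator and the FU header, and the packetizer strips the original one-byte NAL header from the fragments. When a reassembled frame is written out (presumably inside `packetAndSendIFrame` / `packetAndSendPFrame`, which the article does not show), that byte has to be rebuilt from the FU indicator's F/NRI bits plus the FU header's type bits (RFC 6184, section 5.8). A minimal C sketch of the bit manipulation, with helper names of my own:

```c
#include <stdint.h>

/* FU indicator:  F(1) | NRI(2) | type(5)   -- type 28 means FU-A
 * FU header:     S(1) | E(1)  | R(1) | type(5) -- type of the original NAL unit */
enum { FU_A = 28 };

static int fu_is_start(uint8_t fuHeader) { return (fuHeader & 0x80) != 0; }
static int fu_is_end(uint8_t fuHeader)   { return (fuHeader & 0x40) != 0; }
static int fu_nal_type(uint8_t fuHeader) { return fuHeader & 0x1f; }

/* Rebuild the one-byte NAL header the packetizer stripped:
 * F and NRI come from the FU indicator, the type from the FU header. */
static uint8_t fu_rebuild_nal_header(uint8_t fuIndicator, uint8_t fuHeader) {
    return (uint8_t)((fuIndicator & 0xe0) | (fuHeader & 0x1f));
}
```

For example, an FU indicator of 0x7C (NRI 3, type 28) with an FU header of 0x85 (start bit set, type 5) yields the NAL header 0x65, i.e. an IDR slice.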

With the steps above, each assembled frame can be handed off to the decoder for playback.
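One caveat: UDP neither orders nor guarantees delivery, so the assembly step implied here (inside `packetAndSendIFrame` and `packetAndSendPFrame`, which the article does not show) should sort the collected fragments by RTP sequence number before concatenating them, remembering that the 16-bit counter wraps around. A hedged sketch of a wrap-safe comparison in C:

```c
#include <stdint.h>

/* Returns nonzero if sequence number a comes before b, treating the
 * 16-bit RTP sequence space as circular: 65535 is followed by 0.
 * The signed difference trick works as long as the two numbers are
 * less than half the sequence space (32768) apart. */
static int seq_before(uint16_t a, uint16_t b) {
    return (int16_t)(a - b) < 0;
}
```

Sorting fragments with this comparator keeps an I-frame intact even when its fragments straddle the 65535 → 0 wrap.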


Everyone is welcome to join the iOS audio/video development QQ group: 331753091
