Audio/Video Development: Streaming Media Transport with RTP (Part 3)
Related articles in this iOS audio/video development series:
Audio/Video Development: Overview (Part 1)
Audio/Video Development: Streaming Media Transport with RTSP (Part 2)
Audio/Video Development: Streaming Media Transport with RTP (Part 3)
Audio/Video Development: Decoding with ffmpeg (Part 4)
The Real-time Transport Protocol (RTP) is a network transport protocol for delivering audio and video data.
For a detailed introduction to the RTP protocol, see this article: http://www.cnblogs.com/qingquan/archive/2011/07/28/2120440.html
RTP is carried over UDP. In this example we again use the CocoaAsyncSocket library to handle the UDP layer.
1. Register the UDP listener
#import "RTPReceiver.h"
#import "CocoaAsyncSocket/GCD/GCDAsyncUdpSocket.h"
#import "RTPPacket.h"

@interface RTPReceiver () <GCDAsyncUdpSocketDelegate, RTPPacketDelegate> {
    int _rtpPort;
    dispatch_queue_t _rtpQueue;
    GCDAsyncUdpSocket *_rtpSocket;
    RTPPacket *_rtpPacket;
}
@end

@implementation RTPReceiver

- (instancetype)initWithPort:(int)port {
    if (self = [super init]) {
        _rtpPort = port;
        _rtpQueue = dispatch_queue_create("rtpSocketQueue", NULL);
        _rtpSocket = [[GCDAsyncUdpSocket alloc] initWithDelegate:self delegateQueue:_rtpQueue];
        NSError *error;
        BOOL bindResult = [_rtpSocket bindToPort:_rtpPort error:&error];
        if (!bindResult) {
            NSLog(@"ERROR!!! bind udp port: %@", error.localizedDescription);
        }
        _rtpPacket = [[RTPPacket alloc] init];
        _rtpPacket.delegate = self;
    }
    return self;
}

- (void)startReceive {
    NSError *error;
    [_rtpSocket beginReceiving:&error];
    if (error) {
        NSLog(@"ERROR!!! receive RTP: %@", error.localizedDescription);
    }
}

#pragma mark GCDAsyncUdpSocket Delegate

- (void)udpSocket:(GCDAsyncUdpSocket *)sock didReceiveData:(NSData *)data
      fromAddress:(NSData *)address withFilterContext:(id)filterContext {
    [_rtpPacket addNalu:data];
}

#pragma mark RTP Packet Delegate

- (void)DidPacketFrame:(uint8_t *)frame size:(int)size sequence:(int)sequ {
    dispatch_async(dispatch_get_main_queue(), ^{
        NSDictionary *dict = @{@"data": [NSData dataWithBytes:frame length:size],
                               @"size": @(size)};
        [[NSNotificationCenter defaultCenter] postNotificationName:@"client" object:dict];
    });
}

@end
2. Parse the received RTP data and assemble it into frames
- (void)addNalu:(NSData *)rtpData {
    bzero(&rtpHeader, sizeof(rtpHeader));
    uint8_t *dataByte = (uint8_t *)[rtpData bytes];
    rtpHeader.version        = (dataByte[0] & 0xc0) >> 6;
    rtpHeader.padding        = ((dataByte[0] & 0x20) >> 5) == 1;
    rtpHeader.extension      = ((dataByte[0] & 0x10) >> 4) == 1;
    rtpHeader.payloadType    = dataByte[1] & 0x7f;
    rtpHeader.sequenceNumber = twoByte(dataByte + 2);
    rtpHeader.timeStamp      = fourByte(dataByte + 4);
    [self loadNalu:rtpData];
}

- (void)loadNalu:(NSData *)rtpData {
    char naluHeader[2];
    [rtpData getBytes:naluHeader range:NSMakeRange(RTPHeaderSize, 2)];
    int naluType = naluHeader[0] & 0x1f;
    switch (naluType) {
        case 7: // SPS: a new I-frame group begins, discard any stale slices
            [sliceArray removeAllObjects];
            // fall through: SPS is buffered the same way as PPS
        case 8: { // PPS
            NSUInteger size = rtpData.length - RTPHeaderSize;
            NSData *subData = [rtpData subdataWithRange:NSMakeRange(RTPHeaderSize, size)];
            NalUnit *unit = [[NalUnit alloc] initWithData:subData
                                                     size:(int)size
                                                 sequence:rtpHeader.sequenceNumber];
            [sliceArray addObject:unit];
        }
            break;
        case 28: { // FU-A: one NAL unit fragmented across several RTP packets
            int frameType = naluHeader[1] & 0x1f;
            int ser = (naluHeader[1] & 0xe0) >> 5; // S(tart), E(nd), R(eserved) bits
            if (frameType == 5) { // IDR slice
                int frameLength = (int)rtpData.length - RTPHeaderSize - 2;
                NSData *subData = [rtpData subdataWithRange:NSMakeRange(RTPHeaderSize + 2, frameLength)];
                NalUnit *unit = [[NalUnit alloc] initWithData:subData
                                                         size:frameLength
                                                     sequence:rtpHeader.sequenceNumber];
                [sliceArray addObject:unit];
                if (ser == 2) { // 010: last fragment, assemble the frame and call back
                    [self packetAndSendIFrame];
                }
            } else if (frameType == 1) { // non-IDR slice
                if (ser == 4) { // 100: first fragment, start a new frame
                    [sliceArray removeAllObjects];
                }
                int frameLength = (int)rtpData.length - RTPHeaderSize - 2;
                NSData *subData = [rtpData subdataWithRange:NSMakeRange(RTPHeaderSize + 2, frameLength)];
                NalUnit *unit = [[NalUnit alloc] initWithData:subData
                                                         size:frameLength
                                                     sequence:rtpHeader.sequenceNumber];
                [sliceArray addObject:unit];
                if (ser == 2) { // 010: last fragment
                    [self packetAndSendPFrame];
                }
            }
        }
            break;
        case 1: // single non-IDR NAL unit: prepend the start code and deliver directly
            if (self.delegate && [self.delegate respondsToSelector:@selector(DidPacketFrame:size:sequence:)]) {
                int frameLength = (int)rtpData.length - RTPHeaderSize + 4;
                uint8_t *buf = (uint8_t *)malloc(frameLength);
                memcpy(buf, startCode, 4);
                NSData *fData = [rtpData subdataWithRange:NSMakeRange(RTPHeaderSize, frameLength - 4)];
                memcpy(buf + 4, [fData bytes], frameLength - 4);
                [self.delegate DidPacketFrame:buf size:frameLength sequence:rtpHeader.sequenceNumber];
                free(buf);
                buf = NULL;
            }
            break;
    }
}
With the steps above, each assembled frame is ready to be handed to the decoder for playback.
Everyone is welcome to join the iOS audio/video development QQ group: 331753091