live555: Parsing the PLAY Command


Preface

The previous post covered the DESCRIBE command; this one looks at PLAY. Once PLAY has been handled and the response returned, the server has to start sending video, so here we walk through that flow at a high level. I hope it is of some help. The best existing blog post on PLAY parsing that I know of is the live555PLAY讲解 write-up, but some of the code shown there appears to be old and has since been replaced upstream; I correct parts of it here, and its formatting was also hard to bear. Interested readers may still want to consult it.

Main Text

The basic handling flow here is the same as in the previous post: the request is received in RTSPServer::RTSPClientSession and handled directly. The rest should be easy to find; the core code is:

```cpp
void RTSPServer::RTSPClientSession::handleCmd_PLAY(RTSPServer::RTSPClientConnection* ourClientConnection,
                                                   ServerMediaSubsession* subsession, char const* fullRequestStr) {
  ......
  fStreamStates[i].subsession->startStream(fOurSessionId,
  ......
}

void OnDemandServerMediaSubsession::startStream(unsigned clientSessionId,
                                                void* streamToken,
                                                TaskFunc* rtcpRRHandler,
                                                void* rtcpRRHandlerClientData,
                                                unsigned short& rtpSeqNum,
                                                unsigned& rtpTimestamp,
                                                ServerRequestAlternativeByteHandler* serverRequestAlternativeByteHandler,
                                                void* serverRequestAlternativeByteHandlerClientData) {
  StreamState* streamState = (StreamState*)streamToken;
  Destinations* destinations
    = (Destinations*)(fDestinationsHashTable->Lookup((char const*)clientSessionId));
  if (streamState != NULL) {
    streamState->startPlaying(destinations, clientSessionId,
                              rtcpRRHandler, rtcpRRHandlerClientData,
                              serverRequestAlternativeByteHandler, serverRequestAlternativeByteHandlerClientData);
    ......
  }
}

void StreamState::startPlaying(Destinations* dests, unsigned clientSessionId,
                               TaskFunc* rtcpRRHandler, void* rtcpRRHandlerClientData,
                               ServerRequestAlternativeByteHandler* serverRequestAlternativeByteHandler,
                               void* serverRequestAlternativeByteHandlerClientData) {
  ......
  fRTCPInstance = fMaster.createRTCP(fRTCPgs, fTotalBW, (unsigned char*)fMaster.fCNAME, fRTPSink);
  ......
  if (dests->isTCP) {
    // Change RTP and RTCP to use the TCP socket instead of UDP:
    if (fRTPSink != NULL) {
      fRTPSink->addStreamSocket(dests->tcpSocketNum, dests->rtpChannelId);
      RTPInterface
        ::setServerRequestAlternativeByteHandler(fRTPSink->envir(), dests->tcpSocketNum,
                                                 serverRequestAlternativeByteHandler, serverRequestAlternativeByteHandlerClientData);
        // So that we continue to handle RTSP commands from the client
    }
    if (fRTCPInstance != NULL) {
      fRTCPInstance->addStreamSocket(dests->tcpSocketNum, dests->rtcpChannelId);
      fRTCPInstance->setSpecificRRHandler(dests->tcpSocketNum, dests->rtcpChannelId,
                                          rtcpRRHandler, rtcpRRHandlerClientData);
    }
  }
  ......
  fRTPSink->startPlaying(*fMediaSource, afterPlayingStreamState, this);
}

Boolean MediaSink::startPlaying(MediaSource& source,
                                afterPlayingFunc* afterFunc,
                                void* afterClientData) {
  ......
  return continuePlaying();
}

Boolean H264or5VideoRTPSink::continuePlaying() {
  ......
  return MultiFramedRTPSink::continuePlaying();
}

Boolean MultiFramedRTPSink::continuePlaying() {
  // Send the first packet.
  // (This will also schedule any future sends.)
  buildAndSendPacket(True);
  return True;
}

void MultiFramedRTPSink::buildAndSendPacket(Boolean isFirstPacket) {
  ......
  packFrame();
}

void MultiFramedRTPSink::packFrame() {
  ......
  fSource->getNextFrame(fOutBuf->curPtr(), fOutBuf->totalBytesAvailable(),
                        afterGettingFrame, this, ourHandleClosure, this);
}
```

All of this is much like before, so we will not re-examine in detail how each frame of data is fetched. Note the afterGettingFrame argument being passed in: after a frame has been read from the H.264 file, it is the callback through which that data gets sent on to the client, and we will take a brief look at it. The concrete call path is the same as in the previous post, ending up at:

```cpp
void ByteStreamFileSource::doReadFromFile() {
  // Read from the file:
  fFrameSize = fread(fTo, 1, fMaxSize, fFid);
  ......
  // (TaskFunc*)FramedSource::afterGetting invokes our callback pointer,
  // which then kicks off the actual work:
  nextTask() = envir().taskScheduler().scheduleDelayedTask(0,
                (TaskFunc*)FramedSource::afterGetting, this);
}
```

Now we get to today's topic, the actual sending:

```cpp
void MultiFramedRTPSink::afterGettingFrame(void* clientData, unsigned numBytesRead,
                                           unsigned numTruncatedBytes,
                                           struct timeval presentationTime,
                                           unsigned durationInMicroseconds) {
  MultiFramedRTPSink* sink = (MultiFramedRTPSink*)clientData;
  sink->afterGettingFrame1(numBytesRead, numTruncatedBytes,
                           presentationTime, durationInMicroseconds);
}

void MultiFramedRTPSink::afterGettingFrame1(unsigned frameSize, unsigned numTruncatedBytes,
                                            struct timeval presentationTime,
                                            unsigned durationInMicroseconds) {
  ......
  sendPacketIfNecessary();
}

void MultiFramedRTPSink::sendPacketIfNecessary() {
  ......
  if (!fRTPInterface.sendPacket(fOutBuf->packet(), fOutBuf->curPacketSize())) {
    ......
  }
  ......
  // Schedule reading/sending of the next frame; nothing new to worry
  // about here -- it is the same cycle repeating from step one:
  nextTask() = envir().taskScheduler().scheduleDelayedTask(uSecondsToGo,
                (TaskFunc*)sendNext, this);
}

Boolean RTPInterface::sendPacket(unsigned char* packet, unsigned packetSize) {
  ......
  // First account for the packet header:
  statsOutgoing.countPacket(bufferSize);
  // Then the packet contents:
  statsGroupOutgoing.countPacket(bufferSize);
  ......
}
```

That is roughly the whole flow. I don't feel like redrawing the class diagram; anyone interested can study it further, but the overall flow is all here.
Two fairly complex pieces remain, among them the parsing of the video file itself, which I will not examine in detail here.

Postscript

With that, I have finally more or less worked through the live555 source code. There are still some regrets, though: I never organized the code properly. If I need it again, I will revisit it more carefully.
