Android Multimedia Framework, Streaming Media Flow, Part 1 -- based on Jelly Bean (11)


http://blog.csdn.net/tjy1985/article/details/8123515

I've been sick for two weeks, and the medicine the doctor prescribed only made things worse -- I'm never going back to that clinic. I'm mostly recovered now, but I've gotten lazy and haven't updated the blog in a while. Is my plan going to run aground like this? No! As long as there's life, the writing goes on -- a bit grandiose, maybe. Enough rambling; on to the main topic.

In the previous posts we walked through the streaming framework and its message mechanism; now we get into the concrete flows, starting with RTSP, the main streaming protocol Android supports.

If you're not familiar with the RTSP protocol, you can look back at: http://blog.csdn.net/tjy1985/article/details/7996121

We know that whether we play local media or streaming media, the calls made at the application layer are the same:

1: create a MediaPlayer

2: setDataSource

3: prepare

4: start

5: pause

6: stop
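The call sequence above can be pictured as a tiny state machine. The sketch below is my own illustration, not framework code -- the real MediaPlayer state machine has more states, error paths, and an asynchronous prepare -- but it records which state each call moves the player into:

```cpp
#include <string>

// Illustrative only: a minimal state machine mirroring the call sequence
// above. Class and enum names are mine, not the framework's.
enum class State { Idle, Initialized, Prepared, Started, Paused, Stopped };

class PlayerSketch {
public:
    void setDataSource(const std::string& url) {
        mUrl = url;
        mState = State::Initialized;   // data source chosen, not yet prepared
    }
    void prepare() { mState = State::Prepared; }  // real prepare() may block
    void start()   { mState = State::Started; }
    void pause()   { mState = State::Paused; }
    void stop()    { mState = State::Stopped; }
    State state() const { return mState; }
private:
    State mState = State::Idle;
    std::string mUrl;
};
```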

The essential difference lies in the framework layer: local playback is implemented with StagefrightPlayer + AwesomePlayer, while streaming uses NuPlayer.
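The routing decision itself is just a check on the data source. Here is a hedged sketch of that split; the function and enum names are illustrative, not the framework's own (the real dispatch lives in MediaPlayerService's player-type selection):

```cpp
#include <strings.h>  // strncasecmp (POSIX)

// Illustrative sketch: rtsp:// URLs are routed to NuPlayer, local files to
// StagefrightPlayer/AwesomePlayer. Names here are mine.
enum PlayerKind { STAGEFRIGHT, NUPLAYER };

PlayerKind pickPlayer(const char* url) {
    if (!strncasecmp(url, "rtsp://", 7)) {
        return NUPLAYER;               // streaming: NuPlayer
    }
    // (the real check also routes HTTP Live Streaming URLs to NuPlayer)
    return STAGEFRIGHT;                // local playback
}
```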

Let's first look at what constructing NuPlayer and calling setDataSource actually do.

The construction of NuPlayer:

MediaPlayerService.cpp

 

static sp<MediaPlayerBase> createPlayer(player_type playerType, void* cookie,
        notify_callback_f notifyFunc)
{
    void* handle;
    CreateMPQ_PlayerClientFunc funcHandle;
    sp<MediaPlayerBase> p;
    switch (playerType) {
        ...
        case NU_PLAYER:
            ALOGV(" createNuPlayer");
            p = new NuPlayerDriver;
            break;
        ...
    }
}

 

NuPlayerDriver.cpp

 

NuPlayerDriver::NuPlayerDriver()
    : mResetInProgress(false),
      mPrepareInProgress(false),
      mIsPrepared(false),
      mDurationUs(-1),
      mPositionUs(-1),
      mNumFramesTotal(0),
      mNumFramesDropped(0),
      mLooper(new ALooper),
      mState(UNINITIALIZED),
      mAtEOS(false),
      mStartupSeekTimeUs(-1) {
    mLooper->setName("NuPlayerDriverLooper");

    mLooper->start(
            false, /* runOnCallingThread */
            true,  /* canCallJava */
            PRIORITY_AUDIO);

    mPlayer = new NuPlayer;
    mLooper->registerHandler(mPlayer);

    mPlayer->setDriver(this);
}

 

NuPlayer.cpp

 

NuPlayer::NuPlayer()
    : mUIDValid(false),
      mVideoIsAVC(false),
      mAudioEOS(false),
      mVideoEOS(false),
      mDecoderEOS(false),
      mScanSourcesPending(false),
      mScanSourcesGeneration(0),
      mTimeDiscontinuityPending(false),
      mFlushingAudio(NONE),
      mFlushingVideo(NONE),
      mVideoSkipToIFrame(false),
      mResetInProgress(false),
      mResetPostponed(false),
      mSkipRenderingAudioUntilMediaTimeUs(-1ll),
      mSkipRenderingVideoUntilMediaTimeUs(-1ll),
      mVideoLateByUs(0ll),
      mNumFramesTotal(0ll),
      mNumFramesDropped(0ll),
      mPauseIndication(false),
      mSourceType(kDefaultSource),
      mStats(NULL),
      mBufferingNotification(false),
      mSRid(0) {
    mTrackName = new char[6];
}

Constructing NuPlayer does little more than initialize some state and flags; the important part is that it starts a message loop -- the AHandler mechanism covered in the previous post: http://blog.csdn.net/tjy1985/article/details/8063484. We won't repeat that here; let's go straight into setDataSource, starting with the overall picture:
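Stripped of Android specifics, the essence of that mechanism is a dedicated thread draining a message queue, so posting is asynchronous and handling happens on the looper thread. A standalone sketch, with std::function standing in for AMessage (nothing here is Android API):

```cpp
#include <condition_variable>
#include <functional>
#include <mutex>
#include <queue>
#include <thread>

// Minimal, illustrative stand-in for the ALooper/AHandler idea:
// post() enqueues from any thread; a dedicated thread pops and runs
// each message, playing the role of onMessageReceived().
class MiniLooper {
public:
    MiniLooper() : mThread([this] { run(); }) {}
    ~MiniLooper() {
        post(nullptr);             // empty function acts as the quit message
        mThread.join();
    }
    void post(std::function<void()> msg) {
        std::lock_guard<std::mutex> lk(mLock);
        mQueue.push(std::move(msg));
        mCond.notify_one();
    }
private:
    void run() {
        for (;;) {
            std::function<void()> msg;
            {
                std::unique_lock<std::mutex> lk(mLock);
                mCond.wait(lk, [this] { return !mQueue.empty(); });
                msg = std::move(mQueue.front());
                mQueue.pop();
            }
            if (!msg) return;      // quit
            msg();                 // "handle the message" on the looper thread
        }
    }
    std::mutex mLock;
    std::condition_variable mCond;
    std::queue<std::function<void()>> mQueue;
    std::thread mThread;           // started last, after the queue is ready
};
```

The destructor's join() guarantees that every posted message has run before the object goes away, which is why the caller can safely read results afterwards.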


setDataSource proceeds in three steps:

1: build the corresponding message

2: create the matching Source from the URL

3: handle the message in onMessageReceived

void NuPlayer::setDataSource(
        const char *url, const KeyedVector<String8, String8> *headers) {
    sp<AMessage> msg = new AMessage(kWhatSetDataSource, id()); // (1) build a kWhatSetDataSource message

    sp<Source> source;
    if (IsHTTPLiveURL(url)) {
        source = new HTTPLiveSource(url, headers, mUIDValid, mUID); // (2) create an HTTPLiveSource
    } else if (!strncasecmp(url, "rtsp://", 7)) {
        source = new RTSPSource(url, headers, mUIDValid, mUID); // create an RTSPSource instance
    } else {
        source = new GenericSource(url, headers, mUIDValid, mUID);
    }

    msg->setObject("source", source);
    msg->post(); // post the kWhatSetDataSource message built above
}

 

 

void NuPlayer::onMessageReceived(const sp<AMessage> &msg) {
    switch (msg->what()) {
        case kWhatSetDataSource: // (3) handle the kWhatSetDataSource message
        {
            ALOGV("kWhatSetDataSource");
            CHECK(mSource == NULL);

            sp<RefBase> obj;
            CHECK(msg->findObject("source", &obj));

            mSource = static_cast<Source *>(obj.get());
            break;
        }
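Note how the strong pointer to the Source travels inside the message via setObject() and is recovered with findObject(), so the object stays alive until the handler takes ownership. Outside Android the same pattern can be sketched like this (std::shared_ptr standing in for sp<RefBase>; the struct is illustrative, not the AMessage API):

```cpp
#include <map>
#include <memory>
#include <string>

// Illustrative stand-in for AMessage's setObject()/findObject():
// a message carries named strong references, keeping the payload
// alive until the handler retrieves it.
struct Message {
    int what;
    std::map<std::string, std::shared_ptr<void>> objects;

    void setObject(const std::string& name, std::shared_ptr<void> obj) {
        objects[name] = std::move(obj);
    }
    bool findObject(const std::string& name, std::shared_ptr<void>* out) const {
        auto it = objects.find(name);
        if (it == objects.end()) return false;
        *out = it->second;
        return true;
    }
};
```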

