Hardware decoding on Android


OpenMAX: Android uses OpenMAX as its codec interface. Hence all native codecs (hardware accelerated or otherwise) expose an OpenMAX interface. This interface is used by Stagefright (the player framework) to decode media with those codecs.

NDK: Android allows Java applications to interact with underlying C/C++ native libraries using the NDK. This requires JNI (the Java Native Interface).
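For context, a native method reached from Java looks roughly like this. It is only a minimal JNI sketch: the class com.example.decoder.NativeDecoder, the method decodeFrame, and the library name are made-up placeholders, not part of any real API.

#include <jni.h>

// Java side (hypothetical):
//   package com.example.decoder;
//   public class NativeDecoder {
//       static { System.loadLibrary("nativedecoder"); }
//       public static native int decodeFrame(byte[] input, byte[] output);
//   }

extern "C" JNIEXPORT jint JNICALL
Java_com_example_decoder_NativeDecoder_decodeFrame(
        JNIEnv *env, jclass /* clazz */, jbyteArray input, jbyteArray output) {
    // Pin the Java byte arrays so native code can access them.
    jbyte *in  = env->GetByteArrayElements(input, NULL);
    jbyte *out = env->GetByteArrayElements(output, NULL);
    jsize inLen = env->GetArrayLength(input);

    // ... hand `in` / `inLen` to the native decoder and write the result into `out` ...
    jint bytesWritten = 0;

    // Release the arrays: JNI_ABORT discards changes to the input, 0 copies the output back.
    env->ReleaseByteArrayElements(input, in, JNI_ABORT);
    env->ReleaseByteArrayElements(output, out, 0);
    return bytesWritten;
}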

Now, coming to your question: how do you tap the native decoder to decode a raw video bitstream?

In Android 4.0 and below, Android did not provide access to the underlying video decoders at the Java layer. You would need to write native code that interacts directly with the OMX decoder. Though this is possible, it is not trivial, as it requires knowledge of how OMX works and how to expose it to the application through the NDK.
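To give an idea of what "interacting directly with OMX" involves, here is a minimal OpenMAX IL sketch using the standard core calls (OMX_Init, OMX_GetHandle, OMX_FreeHandle, OMX_Deinit). On Android, Stagefright actually reaches the components through its IOMX binder interface rather than these calls, so treat this purely as an illustration of the IL API shape; the component name is only an example.

#include <stdio.h>

#include <OMX_Core.h>

static OMX_ERRORTYPE onEvent(OMX_HANDLETYPE component, OMX_PTR appData,
                             OMX_EVENTTYPE event, OMX_U32 data1, OMX_U32 data2,
                             OMX_PTR eventData) {
    return OMX_ErrorNone;
}

static OMX_ERRORTYPE onEmptyBufferDone(OMX_HANDLETYPE component, OMX_PTR appData,
                                       OMX_BUFFERHEADERTYPE *buffer) {
    return OMX_ErrorNone;
}

static OMX_ERRORTYPE onFillBufferDone(OMX_HANDLETYPE component, OMX_PTR appData,
                                      OMX_BUFFERHEADERTYPE *buffer) {
    return OMX_ErrorNone;
}

int main() {
    OMX_CALLBACKTYPE callbacks = { onEvent, onEmptyBufferDone, onFillBufferDone };
    OMX_HANDLETYPE decoder = NULL;

    OMX_Init();

    // Component names are vendor-specific; "OMX.google.h264.decoder" is the
    // software AVC decoder that ships with Android, used here as an example.
    OMX_ERRORTYPE err = OMX_GetHandle(
            &decoder, const_cast<char *>("OMX.google.h264.decoder"),
            NULL /* app data */, &callbacks);

    if (err == OMX_ErrorNone) {
        // ... configure the ports, allocate buffers, then push the raw
        // bitstream with OMX_EmptyThisBuffer and collect decoded frames
        // via the FillBufferDone callback ...
        OMX_FreeHandle(decoder);
    } else {
        fprintf(stderr, "OMX_GetHandle failed: 0x%x\n", (unsigned)err);
    }

    OMX_Deinit();
    return 0;
}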

In 4.1 (Jelly Bean), Android provides access to hardware-accelerated decoders at the application level through Java APIs (the android.media.MediaCodec class). More details about the new APIs are at http://developer.android.com/about/versions/android-4.1.html#Multimedia

 

There are two ways:

(1) Build your project inside the full Android source tree. This way takes a few days to set up; once ready, it's very easy, and you can take full advantage of Stagefright.

(2) You can just copy the include files into your project; they are in this folder:

android-4.0.4_r1.1/frameworks/base/include/media/stagefright

Then you will have to pull in the library functions by dynamically loading libstagefright.so, so you can link them into your JNI project (a minimal sketch of the dynamic loading follows).
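As a rough illustration of the dynamic-loading step, the sketch below opens libstagefright.so with dlopen and looks up a symbol with dlsym. The symbol name is a made-up placeholder; real Stagefright entry points are C++-mangled names, which is why in practice it is often easier to link against the .so at build time.

#include <dlfcn.h>
#include <stdio.h>

int main() {
    // Load the platform's Stagefright library at runtime.
    void *handle = dlopen("libstagefright.so", RTLD_NOW);
    if (handle == NULL) {
        fprintf(stderr, "dlopen failed: %s\n", dlerror());
        return 1;
    }

    // Look up a symbol by name. "SomeMangledStagefrightSymbol" is a placeholder:
    // real entry points are C++-mangled names such as the OMXCodec methods.
    typedef void (*entry_fn)();
    entry_fn fn = reinterpret_cast<entry_fn>(
            dlsym(handle, "SomeMangledStagefrightSymbol"));
    if (fn == NULL) {
        fprintf(stderr, "dlsym failed: %s\n", dlerror());
    }

    dlclose(handle);
    return 0;
}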

To encode/decode using Stagefright, it's very straightforward; a few hundred lines of code will do.

I used Stagefright to capture screenshots and create a video; this will be available in our Android VNC server, to be released soon.

The following is a snippet; I think it's better than using ffmpeg to encode a movie. You can add an audio source as well.

 

#include <stdio.h>
#include <unistd.h>

#include <OMX_Video.h>  // from .../include/media/stagefright/openmax, for OMX_COLOR_Format*

#include <media/stagefright/foundation/ADebug.h>  // for CHECK_EQ
#include <media/stagefright/MediaBuffer.h>
#include <media/stagefright/MediaDefs.h>
#include <media/stagefright/MediaSource.h>
#include <media/stagefright/MetaData.h>
#include <media/stagefright/MPEG4Writer.h>
#include <media/stagefright/OMXClient.h>
#include <media/stagefright/OMXCodec.h>

using namespace android;

// Example encoding parameters (values are illustrative):
static const int32_t kFramerate = 30;              // frames per second
static const int32_t kVideoBitRate = 512 * 1024;   // bits per second
static const int32_t kIFramesIntervalSec = 1;      // seconds between I-frames
static const int64_t kDurationUs = 10000000LL;     // stop after 10 seconds

// A MediaSource that feeds raw frames (here: screenshots) to the encoder.
class ImageSource : public MediaSource {
public:
    ImageSource(int width, int height, int colorFormat)
        : mWidth(width),
          mHeight(height),
          mColorFormat(colorFormat) {
    }

    virtual status_t read(
            MediaBuffer **buffer, const MediaSource::ReadOptions *options) {
        // here you can fill the buffer with your pixels
        return OK;  // return ERROR_END_OF_STREAM when there are no more frames
    }

    // ... start(), stop(), getFormat() omitted ...

private:
    int mWidth;
    int mHeight;
    int mColorFormat;
};

status_t encodeToMp4() {
    int width = 720;
    int height = 480;
    // Must match the pixel layout that ImageSource produces.
    int colorFormat = OMX_COLOR_FormatYUV420SemiPlanar;

    sp<MediaSource> image_source = new ImageSource(width, height, colorFormat);

    sp<MetaData> enc_meta = new MetaData;
    // enc_meta->setCString(kKeyMIMEType, MEDIA_MIMETYPE_VIDEO_H263);
    // enc_meta->setCString(kKeyMIMEType, MEDIA_MIMETYPE_VIDEO_MPEG4);
    enc_meta->setCString(kKeyMIMEType, MEDIA_MIMETYPE_VIDEO_AVC);
    enc_meta->setInt32(kKeyWidth, width);
    enc_meta->setInt32(kKeyHeight, height);
    enc_meta->setInt32(kKeySampleRate, kFramerate);
    enc_meta->setInt32(kKeyBitRate, kVideoBitRate);
    enc_meta->setInt32(kKeyStride, width);
    enc_meta->setInt32(kKeySliceHeight, height);
    enc_meta->setInt32(kKeyIFramesInterval, kIFramesIntervalSec);
    enc_meta->setInt32(kKeyColorFormat, colorFormat);

    OMXClient client;
    CHECK_EQ(client.connect(), (status_t)OK);

    sp<MediaSource> encoder =
        OMXCodec::Create(
                client.interface(), enc_meta, true /* createEncoder */, image_source);

    sp<MPEG4Writer> writer = new MPEG4Writer("/sdcard/screenshot.mp4");
    writer->addSource(encoder);

    // you can add an audio source here if you want to encode audio as well
    //
    // sp<MediaSource> audioEncoder =
    //     OMXCodec::Create(client.interface(), encMetaAudio, true, audioSource);
    // writer->addSource(audioEncoder);

    writer->setMaxFileDuration(kDurationUs);
    CHECK_EQ(OK, writer->start());

    while (!writer->reachedEOS()) {
        fprintf(stderr, ".");
        usleep(100000);
    }

    status_t err = writer->stop();
    client.disconnect();
    return err;
}

 

 
