How to Build FFmpeg for Android


FFmpeg is an open-source framework for recording, converting, playing, and streaming video and audio. It includes libavcodec, a widely used video/audio codec library.

Several popular Android applications are built on FFmpeg, including RockPlayer, MoboPlayer, acrMedia, vitalPlayer, and V-Cut Express. If you're developing a multimedia application that needs a video/audio codec, FFmpeg is a good choice.

This post covers how to compile FFmpeg for Android; the next post will cover how to use FFmpeg to build a simple application.

The steps below were done on Ubuntu 10.10 with Android NDK r5b and FFmpeg 0.8. They should work with other versions of the Android NDK and FFmpeg, but minor changes may be required.

0. Download Android NDK r5b

You can download the NDK here. Once downloaded, simply extract the file, and you’ll have a folder named android-ndk-r5b. You’ll need the folder location for configurations later.

1. Download Source Code for FFmpeg

You can download the source code from here. If you want the latest code, you can use git or svn; the link has detailed instructions. For this tutorial, the FFmpeg 0.8 “Love” release is used.

After downloading the source, extract it and you'll have a folder named ffmpeg-0.8.

2. Build FFmpeg (the script is based on the RockPlayer build script)

2.1 Copy and paste the bash script from here into a text editor, and save it as build_android.sh under the ffmpeg-0.8 folder.

Note that the NDK location has to be changed to match your android-ndk-r5b folder location. On my machine it's at ~/Desktop/android/, so it's set as

NDK=~/Desktop/android/android-ndk-r5b
PLATFORM=$NDK/platforms/android-8/arch-arm/
PREBUILT=$NDK/toolchains/arm-linux-androideabi-4.4.3/prebuilt/linux-x86

You may also need to adjust PLATFORM based on which platform version you're targeting; android-8 corresponds to Android 2.2 (API level 8).

The default configuration in the script disables a lot of components to speed up the build; you can change the configuration to suit your needs. The script can also compile for multiple hardware platforms, but here only ARMv7 with VFPv3 is enabled to keep the build fast.

2.2 Make sure the bash script is executable. Go to the ffmpeg-0.8 directory in a terminal, then type the following command:

sudo chmod 755 build_android.sh

2.3 Then execute the script by typing the following command:

./build_android.sh

The compilation will take a while to finish (several minutes or more, depending on your machine).

Update for NDK-r6:

For Android NDK r6, the build_android.sh script might not work. You can try the script here.

Note that you may need to create the ./android/armv7-a/ folder in the ffmpeg directory yourself (thanks to mgg28831 for this).

If you encounter a permission denied error, you can try sudo ./build_android.sh.

3. The Output of the Build

Once the script finishes, there will be a folder called android under the ffmpeg-0.8 directory, which contains all of the build output.
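
As a quick sanity check (not part of the original tutorial), you could cross-compile a tiny program with the same NDK toolchain against the generated headers and static libraries. The file name and the ./android/armv7-a output path below are assumptions based on the script's install prefix.

/* check_ffmpeg.c -- hypothetical smoke test, not from the original post.
 * Cross-compile it with the NDK toolchain and link it against the
 * libraries under ./android/armv7-a/lib (path assumed from the script). */
#include <stdio.h>
#include <libavcodec/avcodec.h>
#include <libavformat/avformat.h>

int main(void)
{
    av_register_all();  /* register all formats and codecs that were built */
    printf("libavcodec version:  %u\n", avcodec_version());
    printf("libavformat version: %u\n", avformat_version());
    return 0;
}

If this compiles and links (typically with -I./android/armv7-a/include, -L./android/armv7-a/lib and -lavformat -lavcodec -lavutil, plus -lm or -lz depending on what was enabled), the headers and archives are in place.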

4. To be Continued

Once the library is compiled successfully, the next step is to use it to build Android apps. This is covered in the next post, How to Build Android Apps Based on FFmpeg by an Example.

Reference:

RockPlayer open source component: http://rockplayer.freecoder.org/tech.html


======================================================================================================

One way that springs to mind is to draw the pixels of your frame into a texture and then render that texture using OpenGL.

I wrote a blog post a while back on how to go about this, primarily for old-skool pixel-based video games, but it also applies to your situation. The post is Android Native Coding in C, and I set up a github repository with an example. Using this technique I have been able to get 60 FPS, even on first-generation hardware.

EDIT regarding glTexImage2D vs glTexSubImage2D for this approach.

Calling glTexImage2D will allocate video memory for your texture and copy the pixels you pass it into that memory (if you don't pass NULL). Calling glTexSubImage2D will update the pixels you specify in an already-allocated texture.

If you update all of the texture, there's little difference between calling one or the other; in fact, glTexImage2D is usually faster. But if you only update part of the texture, glTexSubImage2D wins out on speed.
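
As a rough sketch of that pattern (my illustration, not the answer's code), assuming an OpenGL ES 1.x context and 16-bit RGB565 pixels coming from the decoder; the function names and sizes are placeholders:

#include <GLES/gl.h>

#define TEX_W 1024   /* power-of-two texture dimensions, large enough for a frame */
#define TEX_H 512

/* One-time setup: glTexImage2D reserves video memory for the whole texture;
 * passing NULL as the last argument allocates it without uploading pixels. */
GLuint create_frame_texture(void)
{
    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, TEX_W, TEX_H, 0,
                 GL_RGB, GL_UNSIGNED_SHORT_5_6_5, NULL);
    return tex;
}

/* Per decoded frame: glTexSubImage2D overwrites only the frame-sized region
 * of the already-allocated texture with the new pixels. */
void upload_frame(GLuint tex, const void *pixels, int frame_w, int frame_h)
{
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, frame_w, frame_h,
                    GL_RGB, GL_UNSIGNED_SHORT_5_6_5, pixels);
}

create_frame_texture() would be called once when the GL surface is created, and upload_frame() once per decoded frame.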

You have to use power-of-two texture sizes, so covering the screen on a hi-res device requires a 1024x512 texture, and a 512x512 texture on medium resolutions. The texture is larger than the screen area (hi-res is 800x400-ish), which means you only need to update part of it, so glTexSubImage2D is the way to go.
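
Concretely, the drawing side might look like the sketch below (again my illustration, assuming OpenGL ES 1.x with identity projection and modelview matrices); the texture coordinates are scaled so that only the frame-sized part of the power-of-two texture ends up on screen:

#include <GLES/gl.h>

/* Draw the texture over the whole screen, sampling only the sub-region that
 * upload_frame() actually filled (frame_w x frame_h out of tex_w x tex_h).
 * The v axis is flipped because decoded frames are usually stored top-down. */
void draw_frame(GLuint tex, int frame_w, int frame_h, int tex_w, int tex_h)
{
    const GLfloat u = (GLfloat)frame_w / tex_w;   /* e.g. 800 / 1024 */
    const GLfloat v = (GLfloat)frame_h / tex_h;   /* e.g. 480 / 512  */

    const GLfloat vertices[] = {          /* full-screen quad in clip space */
        -1.0f, -1.0f,    1.0f, -1.0f,
        -1.0f,  1.0f,    1.0f,  1.0f,
    };
    const GLfloat texcoords[] = {         /* only the frame's sub-region */
        0.0f, v,    u, v,
        0.0f, 0.0f, u, 0.0f,
    };

    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, tex);
    glEnableClientState(GL_VERTEX_ARRAY);
    glEnableClientState(GL_TEXTURE_COORD_ARRAY);
    glVertexPointer(2, GL_FLOAT, 0, vertices);
    glTexCoordPointer(2, GL_FLOAT, 0, texcoords);
    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
}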

Reposted from: http://stackoverflow.com/questions/4676178/android-video-player-using-ndk-opengl-es-and-ffmpeg