FFmpeg in Practice (10): Implementing the Trending Video Wallpaper and Camera Wallpaper with FFmpeg/Camera


This installment builds something fun: a video wallpaper and a camera wallpaper.
Apps like these still seem to come mostly from overseas.

Screenshots first:

[Screenshot: video wallpaper]

[Screenshot: camera wallpaper]

1. Live wallpaper basics:

Every live wallpaper extends WallpaperService and must implement its one abstract method, onCreateEngine(), which returns an Engine object. All of the drawing and refreshing is actually done by the Engine. For example:

public class VideoLiveWallpaper extends WallpaperService {
    // The one abstract method WallpaperService requires
    public Engine onCreateEngine() {
        return new VideoEngine();
    }

    class VideoEngine extends Engine {
        @Override
        public void onCreate(SurfaceHolder surfaceHolder) {
            super.onCreate(surfaceHolder);
            // Enable touch event handling
            setTouchEventsEnabled(true);
        }
    }
}

The wallpaper service must also be declared in the manifest, for example:

<!-- Declare the live wallpaper service -->
<service android:label="@string/app_name"
    android:name=".LiveWallpaper"
    android:permission="android.permission.BIND_WALLPAPER">
    <!-- intent-filter for the live wallpaper -->
    <intent-filter>
        <action android:name="android.service.wallpaper.WallpaperService" />
    </intent-filter>
    <!-- meta-data pointing at the wallpaper's XML descriptor -->
    <meta-data android:name="android.service.wallpaper"
        android:resource="@xml/livewallpaper" />
</service>
The important parts: first, the permission android:permission="android.permission.BIND_WALLPAPER"; second, the service must respond to the action android:name="android.service.wallpaper.WallpaperService"; and finally the meta-data points at a configuration file. Create an xml directory under res, just as you would for an app widget, and add an XML file to it:
<wallpaper xmlns:android="http://schemas.android.com/apk/res/android"
    android:settingsActivity="LiveWallPreference"
    android:thumbnail="@drawable/ic_launcher"
    android:description="@string/wallpaper_description" />

The code to launch the wallpaper picker then looks like this:

final Intent pickWallpaper = new Intent(Intent.ACTION_SET_WALLPAPER);
Intent chooser = Intent.createChooser(pickWallpaper, getString(R.string.choose_wallpaper));
startActivity(chooser);
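
On API 16+ you can also skip the generic picker and open the live wallpaper preview screen for this specific service. A minimal sketch, assuming it is called from an Activity in this project (the helper method name is mine, not from the demo):

// Hypothetical helper: jump straight to the preview for VideoLiveWallpaper (API 16+)
private void openLiveWallpaperPreview() {
    Intent intent = new Intent(WallpaperManager.ACTION_CHANGE_LIVE_WALLPAPER);
    intent.putExtra(WallpaperManager.EXTRA_LIVE_WALLPAPER_COMPONENT,
            new ComponentName(this, VideoLiveWallpaper.class));
    startActivity(intent);
}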

2. Camera wallpaper:

Below is the source for the camera wallpaper.

The most essential line is camera.setPreviewDisplay(getSurfaceHolder());
It feeds the camera's preview frames straight into the WallpaperService's own surface.

package com.ws.ffmpegandroidwallpaper;

import android.hardware.Camera;
import android.service.wallpaper.WallpaperService;
import android.view.MotionEvent;
import android.view.SurfaceHolder;

import java.io.IOException;

public class CameraLiveWallpaper extends WallpaperService {
    public Engine onCreateEngine() {
        return new CameraEngine();
    }

    class CameraEngine extends Engine {
        private Camera camera;

        @Override
        public void onCreate(SurfaceHolder surfaceHolder) {
            super.onCreate(surfaceHolder);
            startPreview();
            // Enable touch event handling
            setTouchEventsEnabled(true);
        }

        @Override
        public void onTouchEvent(MotionEvent event) {
            super.onTouchEvent(event);
        }

        @Override
        public void onDestroy() {
            super.onDestroy();
            stopPreview();
        }

        @Override
        public void onVisibilityChanged(boolean visible) {
            if (visible) {
                startPreview();
            } else {
                stopPreview();
            }
        }

        /**
         * Start the camera preview
         */
        public void startPreview() {
            if (camera != null) {
                return; // already previewing; avoid leaking a second Camera
            }
            camera = Camera.open();
            camera.setDisplayOrientation(90);
            try {
                // Feed the preview directly into the wallpaper's surface
                camera.setPreviewDisplay(getSurfaceHolder());
            } catch (IOException e) {
                e.printStackTrace();
            }
            camera.startPreview();
        }

        /**
         * Stop the camera preview
         */
        public void stopPreview() {
            if (camera != null) {
                try {
                    camera.stopPreview();
                    camera.setPreviewCallback(null);
                    // camera.lock();
                    camera.release();
                } catch (Exception e) {
                    e.printStackTrace();
                }
                camera = null;
            }
        }
    }
}
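
One thing the snippet does not show: opening the camera requires declaring <uses-permission android:name="android.permission.CAMERA" /> in the manifest, and on Android 6.0+ the permission must also be granted at runtime. A minimal sketch of the runtime check, run from the activity that sets the wallpaper (illustrative only, not part of the demo):

// Check and request CAMERA at runtime (API 23+) using the support
// library's compat helpers, before offering the camera wallpaper.
if (ContextCompat.checkSelfPermission(this, Manifest.permission.CAMERA)
        != PackageManager.PERMISSION_GRANTED) {
    ActivityCompat.requestPermissions(
            this, new String[]{Manifest.permission.CAMERA}, 1);
}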

3. Video wallpaper:

I originally planned to implement the video wallpaper with MediaPlayer, but it turned out to throw JNI-level errors on some device models.
So I implemented the JNI layer myself with FFmpeg instead. The upside is far more room for customization, such as the sped-up playback in the demo above.

The Java side boils down to one call:
hand the WallpaperService's Surface to the native play() method.

package com.ws.ffmpegandroidwallpaper;

import android.service.wallpaper.WallpaperService;
import android.view.SurfaceHolder;

public class VideoLiveWallpaper extends WallpaperService {

    static {
        System.loadLibrary("native-lib");
    }

    // Must live in the outer class so the JNI symbol matches
    // Java_com_ws_ffmpegandroidwallpaper_VideoLiveWallpaper_play
    public native int play(Object surface);

    // The one abstract method WallpaperService requires
    public Engine onCreateEngine() {
        return new VideoEngine();
    }

    class VideoEngine extends Engine {
        @Override
        public void onCreate(SurfaceHolder surfaceHolder) {
            super.onCreate(surfaceHolder);
            play(getSurfaceHolder().getSurface());
            // Enable touch event handling
            setTouchEventsEnabled(true);
        }
    }
}
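
Note that play() decodes and renders in a loop and only returns when the file ends, so calling it directly in onCreate() blocks the process's main thread. A minimal sketch of moving it onto a worker thread (my own suggestion; the demo calls it directly):

@Override
public void onCreate(SurfaceHolder surfaceHolder) {
    super.onCreate(surfaceHolder);
    setTouchEventsEnabled(true);
    // Run the blocking native decode loop off the main thread
    new Thread(new Runnable() {
        @Override
        public void run() {
            play(getSurfaceHolder().getSurface());
        }
    }).start();
}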

Next, the native implementation of play().

The key spots are commented; it helps to read this alongside my earlier FFmpeg source-code walkthrough:
FFmpeg source analysis (1): structural overview: http://blog.csdn.net/king1425/article/details/70597642

#include <jni.h>
#include <string.h>
#include <android/log.h>
#include <android/native_window.h>
#include <android/native_window_jni.h>

#include <libavcodec/avcodec.h>
#include <libavformat/avformat.h>
#include <libswscale/swscale.h>
#include <libavutil/imgutils.h>

#define LOGD(...) __android_log_print(ANDROID_LOG_DEBUG, "VideoWallpaper", __VA_ARGS__)

JNIEXPORT jint JNICALL
Java_com_ws_ffmpegandroidwallpaper_VideoLiveWallpaper_play(
        JNIEnv *env, jobject thiz, jobject surface) {
    LOGD("play");

    // Path of the video file on the SD card; change as needed, or pass it in via JNI
    //char *file_name = "/storage/emulated/0/ws.mp4";
    char *file_name = "/storage/emulated/0/video.avi";

    av_register_all();

    AVFormatContext *pFormatCtx = avformat_alloc_context();

    // Open video file
    if (avformat_open_input(&pFormatCtx, file_name, NULL, NULL) != 0) {
        LOGD("Couldn't open file:%s\n", file_name);
        return -1; // Couldn't open file
    }

    // Retrieve stream information
    if (avformat_find_stream_info(pFormatCtx, NULL) < 0) {
        LOGD("Couldn't find stream information.");
        return -1;
    }

    // Find the first video stream
    int videoStream = -1, i;
    for (i = 0; i < pFormatCtx->nb_streams; i++) {
        if (pFormatCtx->streams[i]->codec->codec_type == AVMEDIA_TYPE_VIDEO
            && videoStream < 0) {
            videoStream = i;
        }
    }
    if (videoStream == -1) {
        LOGD("Didn't find a video stream.");
        return -1; // Didn't find a video stream
    }

    // Get a pointer to the codec context for the video stream
    AVCodecContext *pCodecCtx = pFormatCtx->streams[videoStream]->codec;

    // Find the decoder for the video stream
    AVCodec *pCodec = avcodec_find_decoder(pCodecCtx->codec_id);
    if (pCodec == NULL) {
        LOGD("Codec not found.");
        return -1; // Codec not found
    }
    if (avcodec_open2(pCodecCtx, pCodec, NULL) < 0) {
        LOGD("Could not open codec.");
        return -1; // Could not open codec
    }

    // Get the native window from the Java surface
    ANativeWindow *nativeWindow = ANativeWindow_fromSurface(env, surface);

    // Video dimensions
    int videoWidth = pCodecCtx->width;
    int videoHeight = pCodecCtx->height;

    // Set the native window's buffer size; the window scales automatically
    ANativeWindow_setBuffersGeometry(nativeWindow, videoWidth, videoHeight,
                                     WINDOW_FORMAT_RGBA_8888);
    ANativeWindow_Buffer windowBuffer;

    // Allocate video frame
    AVFrame *pFrame = av_frame_alloc();
    // Frame that holds the RGBA data used for rendering
    AVFrame *pFrameRGBA = av_frame_alloc();
    if (pFrameRGBA == NULL || pFrame == NULL) {
        LOGD("Could not allocate video frame.");
        return -1;
    }

    // Determine required buffer size and allocate buffer;
    // the buffer holds the RGBA pixels that are actually rendered
    int numBytes = av_image_get_buffer_size(AV_PIX_FMT_RGBA, pCodecCtx->width,
                                            pCodecCtx->height, 1);
    uint8_t *buffer = (uint8_t *) av_malloc(numBytes * sizeof(uint8_t));
    av_image_fill_arrays(pFrameRGBA->data, pFrameRGBA->linesize, buffer, AV_PIX_FMT_RGBA,
                         pCodecCtx->width, pCodecCtx->height, 1);

    // Decoded frames are not RGBA, so they must be converted before rendering
    struct SwsContext *sws_ctx = sws_getContext(pCodecCtx->width,
                                                pCodecCtx->height,
                                                pCodecCtx->pix_fmt,
                                                pCodecCtx->width,
                                                pCodecCtx->height,
                                                AV_PIX_FMT_RGBA,
                                                SWS_BILINEAR,
                                                NULL,
                                                NULL,
                                                NULL);

    int frameFinished;
    AVPacket packet;
    while (av_read_frame(pFormatCtx, &packet) >= 0) {
        // Is this a packet from the video stream?
        if (packet.stream_index == videoStream) {
            // Decode video frame
            avcodec_decode_video2(pCodecCtx, pFrame, &frameFinished, &packet);
            // A single decode call does not always produce a frame
            if (frameFinished) {
                // Lock the native window buffer
                ANativeWindow_lock(nativeWindow, &windowBuffer, 0);
                // Convert to RGBA
                sws_scale(sws_ctx, (uint8_t const *const *) pFrame->data,
                          pFrame->linesize, 0, pCodecCtx->height,
                          pFrameRGBA->data, pFrameRGBA->linesize);
                // The window stride and the frame stride differ, so copy row by row
                uint8_t *dst = (uint8_t *) windowBuffer.bits;
                int dstStride = windowBuffer.stride * 4;
                uint8_t *src = pFrameRGBA->data[0];
                int srcStride = pFrameRGBA->linesize[0];
                int h;
                for (h = 0; h < videoHeight; h++) {
                    memcpy(dst + h * dstStride, src + h * srcStride, srcStride);
                }
                ANativeWindow_unlockAndPost(nativeWindow);
            }
        }
        av_packet_unref(&packet);
    }

    // Clean up
    sws_freeContext(sws_ctx);
    av_free(buffer);
    av_frame_free(&pFrameRGBA);
    // Free the YUV frame
    av_frame_free(&pFrame);
    // Close the codec
    avcodec_close(pCodecCtx);
    // Close the video file
    avformat_close_input(&pFormatCtx);
    ANativeWindow_release(nativeWindow);
    return 0;
}
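
As written, the loop renders frames as fast as av_read_frame can deliver them (which is what produces the sped-up playback in the demo) and never stops once started, even when the wallpaper is hidden. For a production wallpaper you would want a way to end the loop, mirroring the camera engine's onVisibilityChanged handling. A sketch of one option, where stop() is a hypothetical native method that sets a flag the C decode loop checks each iteration (not part of the demo):

// Hypothetical addition to VideoLiveWallpaper; assumes the native side
// checks a "stopped" flag inside its while (av_read_frame(...)) loop.
public native void stop();

class VideoEngine extends Engine {
    @Override
    public void onVisibilityChanged(boolean visible) {
        if (!visible) {
            stop(); // ask the native decode loop to exit
        }
    }
}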

Demo: https://github.com/WangShuo1143368701/FFmpegAndroid/tree/master/ffmpegandroidwallpaper
