Rendering YUV video with OpenGL via JNI
Our project required decoding and displaying a video stream on Android, so after weighing the options we decided to decode with ffmpeg and render with OpenGL.
Once the stack was settled I started on a demo, and quickly found that the material available online was unreliable: the JNI-based OpenGL samples do not render YUV data directly, they all convert YUV to RGB first and then display that, and the samples that do work are implemented in the Java layer. I don't come from a Java background and that approach didn't appeal to me, so after some consideration I decided to implement it myself through JNI. I had previously built a product on top of WebRTC using its C++ interfaces (today's WebRTC is browser-oriented, more mature, and has simpler interfaces, but I still find it more interesting to dig the C++ code out and implement the interface layer yourself, which is how that project was done). Enough preamble; here are the implementation steps.
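Rendering YUV "directly" means the YUV-to-RGB conversion happens on the GPU in the fragment shader rather than on the CPU. The shader in render_opengles20.cpp below implements the standard BT.601 limited-range conversion; writing out the constants it uses as equations:

$$Y' = 1.1643\,(Y - 0.0625), \qquad U' = U - 0.5, \qquad V' = V - 0.5$$

$$R = Y' + 1.5958\,V', \qquad G = Y' - 0.39173\,U' - 0.81290\,V', \qquad B = Y' + 2.017\,U'$$

where $Y$, $U$, $V$ are the texture samples in $[0,1]$ and the resulting $R$, $G$, $B$ are clamped to $[0,1]$ when written to the framebuffer.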
Note: this approach requires OpenGL ES 2.0, which is available on Android 2.2 (API 8) and later; the original demo was built against Android 2.3.3.
The JNI build needs to link against the OpenGL ES 2.0 library in Android.mk. Here is the Android.mk I used, for reference:
LOCAL_PATH := $(call my-dir)

MY_LIBS_PATH := /Users/chenjianjun/Documents/work/ffmpeg-android/build/lib
MY_INCLUDE_PATH := /Users/chenjianjun/Documents/work/ffmpeg-android/build/include

include $(CLEAR_VARS)
LOCAL_MODULE := libavcodec
LOCAL_SRC_FILES := $(MY_LIBS_PATH)/libavcodec.a
include $(PREBUILT_STATIC_LIBRARY)

include $(CLEAR_VARS)
LOCAL_MODULE := libavfilter
LOCAL_SRC_FILES := $(MY_LIBS_PATH)/libavfilter.a
include $(PREBUILT_STATIC_LIBRARY)

include $(CLEAR_VARS)
LOCAL_MODULE := libavformat
LOCAL_SRC_FILES := $(MY_LIBS_PATH)/libavformat.a
include $(PREBUILT_STATIC_LIBRARY)

include $(CLEAR_VARS)
LOCAL_MODULE := libavresample
LOCAL_SRC_FILES := $(MY_LIBS_PATH)/libavresample.a
include $(PREBUILT_STATIC_LIBRARY)

include $(CLEAR_VARS)
LOCAL_MODULE := libavutil
LOCAL_SRC_FILES := $(MY_LIBS_PATH)/libavutil.a
include $(PREBUILT_STATIC_LIBRARY)

include $(CLEAR_VARS)
LOCAL_MODULE := libpostproc
LOCAL_SRC_FILES := $(MY_LIBS_PATH)/libpostproc.a
include $(PREBUILT_STATIC_LIBRARY)

include $(CLEAR_VARS)
LOCAL_MODULE := libswresample
LOCAL_SRC_FILES := $(MY_LIBS_PATH)/libswresample.a
include $(PREBUILT_STATIC_LIBRARY)

include $(CLEAR_VARS)
LOCAL_MODULE := libswscale
LOCAL_SRC_FILES := $(MY_LIBS_PATH)/libswscale.a
include $(PREBUILT_STATIC_LIBRARY)

include $(CLEAR_VARS)
LOCAL_MODULE_TAGS := MICloudPub
LOCAL_MODULE := libMICloudPub
# H264Decoder.cpp       - my H.264 decoding interface built on ffmpeg
# render_opengles20.cpp - the OpenGL ES rendering code
# test.cpp              - the test/demo interface code
# (Comments must not follow the backslash continuations, or make breaks.)
LOCAL_SRC_FILES := H264Decoder.cpp \
                   render_opengles20.cpp \
                   test.cpp
LOCAL_CFLAGS :=
LOCAL_C_INCLUDES := $(MY_INCLUDE_PATH)
# -lGLESv2 is the key addition: it links the OpenGL ES 2.0 library.
LOCAL_LDLIBS := \
    -llog \
    -lgcc \
    -lGLESv2 \
    -lz
LOCAL_WHOLE_STATIC_LIBRARIES := \
    libavcodec \
    libavfilter \
    libavformat \
    libavresample \
    libavutil \
    libpostproc \
    libswresample \
    libswscale
include $(BUILD_SHARED_LIBRARY)
Step 1: write the Java code.
This exists mainly so that the JNI code can call back into Java. The trick is this: the GLSurfaceView render thread owns the GL context, so the native decoder thread never touches GL itself; instead it calls the Java ReDraw() method, which requests a redraw, and the system then invokes onDrawFrame() on the render thread, which calls back down into the native DrawNative().
I took this code from WebRTC and modified it slightly rather than writing it from scratch (no need to reinvent the wheel).
ViEAndroidGLES20.java
package hzcw.opengl;

import java.util.concurrent.locks.ReentrantLock;

import javax.microedition.khronos.egl.EGL10;
import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.egl.EGLContext;
import javax.microedition.khronos.egl.EGLDisplay;
import javax.microedition.khronos.opengles.GL10;

import android.app.ActivityManager;
import android.content.Context;
import android.content.pm.ConfigurationInfo;
import android.graphics.PixelFormat;
import android.opengl.GLSurfaceView;
import android.util.Log;

public class ViEAndroidGLES20 extends GLSurfaceView implements GLSurfaceView.Renderer
{
    private static String TAG = "MICloudPub";
    private static final boolean DEBUG = false;
    // True if onSurfaceChanged has been called.
    private boolean surfaceCreated = false;
    private boolean openGLCreated = false;
    // True if RegisterNativeObject has been called.
    private boolean nativeFunctionsRegistered = false;
    private ReentrantLock nativeFunctionLock = new ReentrantLock();
    // Address of the native object that will do the drawing.
    private long nativeObject = 0;
    private int viewWidth = 0;
    private int viewHeight = 0;

    public static boolean UseOpenGL2(Object renderWindow) {
        return ViEAndroidGLES20.class.isInstance(renderWindow);
    }

    public ViEAndroidGLES20(Context context) {
        super(context);
        init(false, 0, 0);
    }

    public ViEAndroidGLES20(Context context, boolean translucent,
            int depth, int stencil) {
        super(context);
        init(translucent, depth, stencil);
    }

    private void init(boolean translucent, int depth, int stencil) {
        // By default, GLSurfaceView() creates an RGB_565 opaque surface.
        // If we want a translucent one, we should change the surface's
        // format here. Using PixelFormat.TRANSLUCENT for GL surfaces
        // is interpreted as any 32-bit surface with alpha by SurfaceFlinger.
        if (translucent) {
            this.getHolder().setFormat(PixelFormat.TRANSLUCENT);
        }
        // Set up the context factory for 2.0 rendering.
        // See the ContextFactory class definition below.
        setEGLContextFactory(new ContextFactory());
        // We need to choose an EGLConfig that matches the format of
        // our surface exactly. This is going to be done in our
        // custom config chooser. See the ConfigChooser class definition below.
        setEGLConfigChooser( translucent ?
                new ConfigChooser(8, 8, 8, 8, depth, stencil) :
                new ConfigChooser(5, 6, 5, 0, depth, stencil) );
        // Set the renderer responsible for frame rendering.
        this.setRenderer(this);
        this.setRenderMode(GLSurfaceView.RENDERMODE_WHEN_DIRTY);
    }
    private static class ContextFactory implements GLSurfaceView.EGLContextFactory {
        private static int EGL_CONTEXT_CLIENT_VERSION = 0x3098;
        public EGLContext createContext(EGL10 egl, EGLDisplay display, EGLConfig eglConfig) {
            Log.w(TAG, "creating OpenGL ES 2.0 context");
            checkEglError("Before eglCreateContext", egl);
            int[] attrib_list = {EGL_CONTEXT_CLIENT_VERSION, 2, EGL10.EGL_NONE };
            EGLContext context = egl.eglCreateContext(display, eglConfig,
                    EGL10.EGL_NO_CONTEXT, attrib_list);
            checkEglError("After eglCreateContext", egl);
            return context;
        }

        public void destroyContext(EGL10 egl, EGLDisplay display, EGLContext context) {
            egl.eglDestroyContext(display, context);
        }
    }

    private static void checkEglError(String prompt, EGL10 egl) {
        int error;
        while ((error = egl.eglGetError()) != EGL10.EGL_SUCCESS) {
            Log.e(TAG, String.format("%s: EGL error: 0x%x", prompt, error));
        }
    }
    private static class ConfigChooser implements GLSurfaceView.EGLConfigChooser {

        public ConfigChooser(int r, int g, int b, int a, int depth, int stencil) {
            mRedSize = r;
            mGreenSize = g;
            mBlueSize = b;
            mAlphaSize = a;
            mDepthSize = depth;
            mStencilSize = stencil;
        }

        // This EGL config specification is used to specify 2.0 rendering.
        // We use a minimum size of 4 bits for red/green/blue, but will
        // perform actual matching in chooseConfig() below.
        private static int EGL_OPENGL_ES2_BIT = 4;
        private static int[] s_configAttribs2 =
        {
            EGL10.EGL_RED_SIZE, 4,
            EGL10.EGL_GREEN_SIZE, 4,
            EGL10.EGL_BLUE_SIZE, 4,
            EGL10.EGL_RENDERABLE_TYPE, EGL_OPENGL_ES2_BIT,
            EGL10.EGL_NONE
        };

        public EGLConfig chooseConfig(EGL10 egl, EGLDisplay display) {
            // Get the number of minimally matching EGL configurations.
            int[] num_config = new int[1];
            egl.eglChooseConfig(display, s_configAttribs2, null, 0, num_config);
            int numConfigs = num_config[0];
            if (numConfigs <= 0) {
                throw new IllegalArgumentException("No configs match configSpec");
            }
            // Allocate then read the array of minimally matching EGL configs.
            EGLConfig[] configs = new EGLConfig[numConfigs];
            egl.eglChooseConfig(display, s_configAttribs2, configs, numConfigs, num_config);
            if (DEBUG) {
                printConfigs(egl, display, configs);
            }
            // Now return the "best" one.
            return chooseConfig(egl, display, configs);
        }

        public EGLConfig chooseConfig(EGL10 egl, EGLDisplay display,
                EGLConfig[] configs) {
            for (EGLConfig config : configs) {
                int d = findConfigAttrib(egl, display, config,
                        EGL10.EGL_DEPTH_SIZE, 0);
                int s = findConfigAttrib(egl, display, config,
                        EGL10.EGL_STENCIL_SIZE, 0);
                // We need at least mDepthSize and mStencilSize bits.
                if (d < mDepthSize || s < mStencilSize)
                    continue;
                // We want an *exact* match for red/green/blue/alpha.
                int r = findConfigAttrib(egl, display, config,
                        EGL10.EGL_RED_SIZE, 0);
                int g = findConfigAttrib(egl, display, config,
                        EGL10.EGL_GREEN_SIZE, 0);
                int b = findConfigAttrib(egl, display, config,
                        EGL10.EGL_BLUE_SIZE, 0);
                int a = findConfigAttrib(egl, display, config,
                        EGL10.EGL_ALPHA_SIZE, 0);
                if (r == mRedSize && g == mGreenSize && b == mBlueSize && a == mAlphaSize)
                    return config;
            }
            return null;
        }

        private int findConfigAttrib(EGL10 egl, EGLDisplay display,
                EGLConfig config, int attribute, int defaultValue) {
            if (egl.eglGetConfigAttrib(display, config, attribute, mValue)) {
                return mValue[0];
            }
            return defaultValue;
        }

        private void printConfigs(EGL10 egl, EGLDisplay display,
                EGLConfig[] configs) {
            int numConfigs = configs.length;
            Log.w(TAG, String.format("%d configurations", numConfigs));
            for (int i = 0; i < numConfigs; i++) {
                Log.w(TAG, String.format("Configuration %d:\n", i));
                printConfig(egl, display, configs[i]);
            }
        }

        private void printConfig(EGL10 egl, EGLDisplay display,
                EGLConfig config) {
            int[] attributes = {
                EGL10.EGL_BUFFER_SIZE,
                EGL10.EGL_ALPHA_SIZE,
                EGL10.EGL_BLUE_SIZE,
                EGL10.EGL_GREEN_SIZE,
                EGL10.EGL_RED_SIZE,
                EGL10.EGL_DEPTH_SIZE,
                EGL10.EGL_STENCIL_SIZE,
                EGL10.EGL_CONFIG_CAVEAT,
                EGL10.EGL_CONFIG_ID,
                EGL10.EGL_LEVEL,
                EGL10.EGL_MAX_PBUFFER_HEIGHT,
                EGL10.EGL_MAX_PBUFFER_PIXELS,
                EGL10.EGL_MAX_PBUFFER_WIDTH,
                EGL10.EGL_NATIVE_RENDERABLE,
                EGL10.EGL_NATIVE_VISUAL_ID,
                EGL10.EGL_NATIVE_VISUAL_TYPE,
                0x3030, // EGL10.EGL_PRESERVED_RESOURCES
                EGL10.EGL_SAMPLES,
                EGL10.EGL_SAMPLE_BUFFERS,
                EGL10.EGL_SURFACE_TYPE,
                EGL10.EGL_TRANSPARENT_TYPE,
                EGL10.EGL_TRANSPARENT_RED_VALUE,
                EGL10.EGL_TRANSPARENT_GREEN_VALUE,
                EGL10.EGL_TRANSPARENT_BLUE_VALUE,
                0x3039, // EGL10.EGL_BIND_TO_TEXTURE_RGB
                0x303A, // EGL10.EGL_BIND_TO_TEXTURE_RGBA
                0x303B, // EGL10.EGL_MIN_SWAP_INTERVAL
                0x303C, // EGL10.EGL_MAX_SWAP_INTERVAL
                EGL10.EGL_LUMINANCE_SIZE,
                EGL10.EGL_ALPHA_MASK_SIZE,
                EGL10.EGL_COLOR_BUFFER_TYPE,
                EGL10.EGL_RENDERABLE_TYPE,
                0x3042 // EGL10.EGL_CONFORMANT
            };
            String[] names = {
                "EGL_BUFFER_SIZE",
                "EGL_ALPHA_SIZE",
                "EGL_BLUE_SIZE",
                "EGL_GREEN_SIZE",
                "EGL_RED_SIZE",
                "EGL_DEPTH_SIZE",
                "EGL_STENCIL_SIZE",
                "EGL_CONFIG_CAVEAT",
                "EGL_CONFIG_ID",
                "EGL_LEVEL",
                "EGL_MAX_PBUFFER_HEIGHT",
                "EGL_MAX_PBUFFER_PIXELS",
                "EGL_MAX_PBUFFER_WIDTH",
                "EGL_NATIVE_RENDERABLE",
                "EGL_NATIVE_VISUAL_ID",
                "EGL_NATIVE_VISUAL_TYPE",
                "EGL_PRESERVED_RESOURCES",
                "EGL_SAMPLES",
                "EGL_SAMPLE_BUFFERS",
                "EGL_SURFACE_TYPE",
                "EGL_TRANSPARENT_TYPE",
                "EGL_TRANSPARENT_RED_VALUE",
                "EGL_TRANSPARENT_GREEN_VALUE",
                "EGL_TRANSPARENT_BLUE_VALUE",
                "EGL_BIND_TO_TEXTURE_RGB",
                "EGL_BIND_TO_TEXTURE_RGBA",
                "EGL_MIN_SWAP_INTERVAL",
                "EGL_MAX_SWAP_INTERVAL",
                "EGL_LUMINANCE_SIZE",
                "EGL_ALPHA_MASK_SIZE",
                "EGL_COLOR_BUFFER_TYPE",
                "EGL_RENDERABLE_TYPE",
                "EGL_CONFORMANT"
            };
            int[] value = new int[1];
            for (int i = 0; i < attributes.length; i++) {
                int attribute = attributes[i];
                String name = names[i];
                if (egl.eglGetConfigAttrib(display, config, attribute, value)) {
                    Log.w(TAG, String.format("  %s: %d\n", name, value[0]));
                } else {
                    // Clear the error queue for attributes this EGL version lacks.
                    while (egl.eglGetError() != EGL10.EGL_SUCCESS);
                }
            }
        }

        // Subclasses can adjust these values:
        protected int mRedSize;
        protected int mGreenSize;
        protected int mBlueSize;
        protected int mAlphaSize;
        protected int mDepthSize;
        protected int mStencilSize;
        private int[] mValue = new int[1];
    }
    // IsSupported
    // Return true if this device supports OpenGL ES 2.0 rendering.
    public static boolean IsSupported(Context context) {
        ActivityManager am =
                (ActivityManager) context.getSystemService(Context.ACTIVITY_SERVICE);
        ConfigurationInfo info = am.getDeviceConfigurationInfo();
        if (info.reqGlEsVersion >= 0x20000) {
            // OpenGL ES 2.0 is supported.
            return true;
        }
        return false;
    }

    public void onDrawFrame(GL10 gl) {
        nativeFunctionLock.lock();
        if (!nativeFunctionsRegistered || !surfaceCreated) {
            nativeFunctionLock.unlock();
            return;
        }
        if (!openGLCreated) {
            if (0 != CreateOpenGLNative(nativeObject, viewWidth, viewHeight)) {
                nativeFunctionLock.unlock(); // Don't leak the lock on failure.
                return; // Failed to create OpenGL.
            }
            openGLCreated = true; // Created OpenGL successfully.
        }
        DrawNative(nativeObject); // Draw the new frame.
        nativeFunctionLock.unlock();
    }
    public void onSurfaceChanged(GL10 gl, int width, int height) {
        surfaceCreated = true;
        viewWidth = width;
        viewHeight = height;
        nativeFunctionLock.lock();
        if (nativeFunctionsRegistered) {
            if (CreateOpenGLNative(nativeObject, width, height) == 0)
                openGLCreated = true;
        }
        nativeFunctionLock.unlock();
    }

    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
    }

    public void RegisterNativeObject(long nativeObject) {
        nativeFunctionLock.lock();
        this.nativeObject = nativeObject;
        nativeFunctionsRegistered = true;
        nativeFunctionLock.unlock();
    }

    public void DeRegisterNativeObject() {
        nativeFunctionLock.lock();
        nativeFunctionsRegistered = false;
        openGLCreated = false;
        this.nativeObject = 0;
        nativeFunctionLock.unlock();
    }

    // Called from the JNI layer once a frame has been decoded; the system
    // then calls onDrawFrame() on the render thread to display it.
    public void ReDraw() {
        if (surfaceCreated) {
            // Request the renderer to redraw using the render thread context.
            this.requestRender();
        }
    }

    private native int CreateOpenGLNative(long nativeObject, int width, int height);
    private native void DrawNative(long nativeObject);
}
ViERenderer.java

package hzcw.opengl;

import android.content.Context;
import android.view.SurfaceView;

public class ViERenderer
{
    public static SurfaceView CreateRenderer(Context context) {
        return CreateRenderer(context, false);
    }

    public static SurfaceView CreateRenderer(Context context,
            boolean useOpenGLES2) {
        if (useOpenGLES2 && ViEAndroidGLES20.IsSupported(context))
            return new ViEAndroidGLES20(context);
        else
            return null;
    }
}
GL2JNILib.java (the native interface)
package com.example.filltriangle;

public class GL2JNILib {
    static {
        System.loadLibrary("MICloudPub");
    }

    public static native void init(Object glSurface);
    public static native void step(String filepath);
}
Step 2: write the JNI code.
com_example_filltriangle_GL2JNILib.h (generated by javah)
/* DO NOT EDIT THIS FILE - it is machine generated */
#include <jni.h>
/* Header for class com_example_filltriangle_GL2JNILib */

#ifndef _Included_com_example_filltriangle_GL2JNILib
#define _Included_com_example_filltriangle_GL2JNILib
#ifdef __cplusplus
extern "C" {
#endif
/*
 * Class:     com_example_filltriangle_GL2JNILib
 * Method:    init
 * Signature: (Ljava/lang/Object;)V
 */
JNIEXPORT void JNICALL Java_com_example_filltriangle_GL2JNILib_init
  (JNIEnv *, jclass, jobject);

/*
 * Class:     com_example_filltriangle_GL2JNILib
 * Method:    step
 * Signature: (Ljava/lang/String;)V
 */
JNIEXPORT void JNICALL Java_com_example_filltriangle_GL2JNILib_step
  (JNIEnv *, jclass, jstring);

#ifdef __cplusplus
}
#endif
#endif
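The stale "(II)V" signature comments in the generated header have been corrected above to match the actual Java declarations. As a quick reference for the JNI type-signature strings used here and in the RegisterNatives call later (this cheat-sheet is mine, not from the original post):

// JNI signature strings map Java types like this:
//   V                    -> void
//   I                    -> int      (jint)
//   J                    -> long     (jlong)
//   Z                    -> boolean  (jboolean)
//   Ljava/lang/Object;   -> Object   (jobject)
//   Ljava/lang/String;   -> String   (jstring)
//
// Hence, in the code below:
//   RegisterNativeObject(long)             -> "(J)V"
//   CreateOpenGLNative(long, int, int)->int -> "(JII)I"
//   UseOpenGL2(Object) returning boolean   -> "(Ljava/lang/Object;)Z"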
test.cpp
#include <jni.h>
#include <stdlib.h>
#include <stdio.h>
#include "render_opengles20.h"
#include "com_example_filltriangle_GL2JNILib.h"
#include "H264Decoder.h"

class AndroidNativeOpenGl2Channel
{
public:
    AndroidNativeOpenGl2Channel(JavaVM* jvm, void* window)
        : _jvm(jvm),
          _ptrWindow(window),
          _javaRenderObj(NULL),
          _javaRenderClass(NULL),
          _redrawCid(NULL),
          _registerNativeCID(NULL),
          _deRegisterNativeCID(NULL),
          _id(-1),
          _width(0),
          _height(0)
    {
        _buffer = (uint8_t*)malloc(1024000); // scratch buffer for one decoded frame
    }
    ~AndroidNativeOpenGl2Channel()
    {
        if (_jvm)
        {
            bool isAttached = false;
            JNIEnv* env = NULL;
            if (_jvm->GetEnv((void**) &env, JNI_VERSION_1_4) != JNI_OK) {
                // Try to attach this thread to the JVM and get the env.
                jint res = _jvm->AttachCurrentThread(&env, NULL);
                if ((res < 0) || !env) {
                    WEBRTC_TRACE(kTraceError, kTraceVideoRenderer, _id,
                                 "%s: Could not attach thread to JVM (%d, %p)",
                                 __FUNCTION__, res, env);
                    env = NULL;
                } else {
                    isAttached = true;
                }
            }
            if (env) { // Guard against a failed attach before using env.
                if (_deRegisterNativeCID) {
                    env->CallVoidMethod(_javaRenderObj, _deRegisterNativeCID);
                }
                env->DeleteGlobalRef(_javaRenderObj);
                env->DeleteGlobalRef(_javaRenderClass);
            }
            if (isAttached) {
                if (_jvm->DetachCurrentThread() < 0) {
                    WEBRTC_TRACE(kTraceWarning, kTraceVideoRenderer, _id,
                                 "%s: Could not detach thread from JVM",
                                 __FUNCTION__);
                }
            }
        }
        free(_buffer);
    }
    int32_t Init()
    {
        if (!_ptrWindow)
        {
            WEBRTC_TRACE(kTraceWarning, kTraceVideoRenderer, _id,
                         "(%s): No window has been provided.", __FUNCTION__);
            return -1;
        }
        if (!_jvm)
        {
            WEBRTC_TRACE(kTraceWarning, kTraceVideoRenderer, _id,
                         "(%s): No JavaVM has been provided.", __FUNCTION__);
            return -1;
        }
        // Get the JNI env for this thread.
        bool isAttached = false;
        JNIEnv* env = NULL;
        if (_jvm->GetEnv((void**) &env, JNI_VERSION_1_4) != JNI_OK) {
            // Try to attach this thread to the JVM and get the env.
            jint res = _jvm->AttachCurrentThread(&env, NULL);
            if ((res < 0) || !env) {
                WEBRTC_TRACE(kTraceError, kTraceVideoRenderer, _id,
                             "%s: Could not attach thread to JVM (%d, %p)",
                             __FUNCTION__, res, env);
                return -1;
            }
            isAttached = true;
        }
        // Get the ViEAndroidGLES20 class.
        jclass javaRenderClassLocal = reinterpret_cast<jclass> (env->FindClass("hzcw/opengl/ViEAndroidGLES20"));
        if (!javaRenderClassLocal) {
            WEBRTC_TRACE(kTraceError, kTraceVideoRenderer, _id,
                         "%s: could not find ViEAndroidGLES20", __FUNCTION__);
            return -1;
        }
        _javaRenderClass = reinterpret_cast<jclass> (env->NewGlobalRef(javaRenderClassLocal));
        if (!_javaRenderClass) {
            WEBRTC_TRACE(kTraceError, kTraceVideoRenderer, _id,
                         "%s: could not create Java SurfaceHolder class reference",
                         __FUNCTION__);
            return -1;
        }
        // Delete the local class ref; we only use the global ref.
        env->DeleteLocalRef(javaRenderClassLocal);

        jmethodID cidUseOpenGL = env->GetStaticMethodID(_javaRenderClass,
                                                        "UseOpenGL2",
                                                        "(Ljava/lang/Object;)Z");
        if (cidUseOpenGL == NULL) {
            WEBRTC_TRACE(kTraceError, kTraceVideoRenderer, -1,
                         "%s: could not get UseOpenGL ID", __FUNCTION__);
            return -1; // (the original returned false from an int32_t function)
        }
        jboolean res = env->CallStaticBooleanMethod(_javaRenderClass,
                                                    cidUseOpenGL, (jobject) _ptrWindow);
        if (!res) {
            WEBRTC_TRACE(kTraceError, kTraceVideoRenderer, _id,
                         "%s: window is not a ViEAndroidGLES20 instance", __FUNCTION__);
            return -1;
        }
        // Create a global reference to the object (to tell JNI that we are
        // referencing it after this function has returned).
        _javaRenderObj = reinterpret_cast<jobject> (env->NewGlobalRef((jobject)_ptrWindow));
        if (!_javaRenderObj)
        {
            WEBRTC_TRACE(kTraceError, kTraceVideoRenderer, _id,
                         "%s: could not create Java SurfaceRender object reference",
                         __FUNCTION__);
            return -1;
        }
        // Get the method ID for the ReDraw function.
        _redrawCid = env->GetMethodID(_javaRenderClass, "ReDraw", "()V");
        if (_redrawCid == NULL) {
            WEBRTC_TRACE(kTraceError, kTraceVideoRenderer, _id,
                         "%s: could not get ReDraw ID", __FUNCTION__);
            return -1;
        }
        _registerNativeCID = env->GetMethodID(_javaRenderClass,
                                              "RegisterNativeObject", "(J)V");
        if (_registerNativeCID == NULL) {
            WEBRTC_TRACE(kTraceError, kTraceVideoRenderer, _id,
                         "%s: could not get RegisterNativeObject ID", __FUNCTION__);
            return -1;
        }
        _deRegisterNativeCID = env->GetMethodID(_javaRenderClass,
                                                "DeRegisterNativeObject", "()V");
        if (_deRegisterNativeCID == NULL) {
            WEBRTC_TRACE(kTraceError, kTraceVideoRenderer, _id,
                         "%s: could not get DeRegisterNativeObject ID",
                         __FUNCTION__);
            return -1;
        }
        // Register the native callbacks the Java view will invoke on its
        // render thread.
        JNINativeMethod nativeFunctions[2] = {
            { "DrawNative",
              "(J)V",
              (void*) &AndroidNativeOpenGl2Channel::DrawNativeStatic },
            { "CreateOpenGLNative",
              "(JII)I",
              (void*) &AndroidNativeOpenGl2Channel::CreateOpenGLNativeStatic },
        };
        if (env->RegisterNatives(_javaRenderClass, nativeFunctions, 2) == 0) {
            WEBRTC_TRACE(kTraceDebug, kTraceVideoRenderer, -1,
                         "%s: Registered native functions", __FUNCTION__);
        }
        else {
            WEBRTC_TRACE(kTraceError, kTraceVideoRenderer, -1,
                         "%s: Failed to register native functions", __FUNCTION__);
            return -1;
        }
        // Hand the Java view a pointer back to this object.
        env->CallVoidMethod(_javaRenderObj, _registerNativeCID, (jlong) this);
        if (isAttached) {
            if (_jvm->DetachCurrentThread() < 0) {
                WEBRTC_TRACE(kTraceWarning, kTraceVideoRenderer, _id,
                             "%s: Could not detach thread from JVM", __FUNCTION__);
            }
        }
        WEBRTC_TRACE(kTraceDebug, kTraceVideoRenderer, _id, "%s done",
                     __FUNCTION__);
        // if (_openGLRenderer.SetCoordinates(zOrder, left, top, right, bottom) != 0) {
        //     return -1;
        // }
        return 0;
    }
    // Called from the decoder thread: remember the frame size and ask the
    // Java view to schedule a redraw on the GL thread.
    void DeliverFrame(int32_t width, int32_t height)
    {
        if (_jvm)
        {
            bool isAttached = false;
            JNIEnv* env = NULL;
            if (_jvm->GetEnv((void**) &env, JNI_VERSION_1_4) != JNI_OK) {
                // Try to attach this thread to the JVM and get the env.
                jint res = _jvm->AttachCurrentThread(&env, NULL);
                if ((res < 0) || !env) {
                    WEBRTC_TRACE(kTraceError, kTraceVideoRenderer, _id,
                                 "%s: Could not attach thread to JVM (%d, %p)",
                                 __FUNCTION__, res, env);
                    env = NULL;
                } else {
                    isAttached = true;
                }
            }
            if (env && _redrawCid)
            {
                _width = width;
                _height = height;
                env->CallVoidMethod(_javaRenderObj, _redrawCid);
            }
            if (isAttached) {
                if (_jvm->DetachCurrentThread() < 0) {
                    WEBRTC_TRACE(kTraceWarning, kTraceVideoRenderer, _id,
                                 "%s: Could not detach thread from JVM",
                                 __FUNCTION__);
                }
            }
        }
    }

    // Expose the scratch buffer the decoder writes decoded frames into.
    void GetDataBuf(uint8_t*& pbuf, int32_t& isize)
    {
        pbuf = _buffer;
        isize = 1024000;
    }
    static jint CreateOpenGLNativeStatic(JNIEnv* env,
                                         jobject,
                                         jlong context,
                                         jint width,
                                         jint height)
    {
        AndroidNativeOpenGl2Channel* renderChannel =
            reinterpret_cast<AndroidNativeOpenGl2Channel*> (context);
        WEBRTC_TRACE(kTraceInfo, kTraceVideoRenderer, -1, "%s:", __FUNCTION__);
        return renderChannel->CreateOpenGLNative(width, height);
    }

    static void DrawNativeStatic(JNIEnv* env, jobject, jlong context)
    {
        AndroidNativeOpenGl2Channel* renderChannel =
            reinterpret_cast<AndroidNativeOpenGl2Channel*>(context);
        renderChannel->DrawNative();
    }

    jint CreateOpenGLNative(int width, int height)
    {
        return _openGLRenderer.Setup(width, height);
    }

    void DrawNative()
    {
        _openGLRenderer.Render(_buffer, _width, _height);
    }

private:
    JavaVM* _jvm;
    void* _ptrWindow;
    jobject _javaRenderObj;
    jclass _javaRenderClass;
    jmethodID _redrawCid;
    jmethodID _registerNativeCID;
    jmethodID _deRegisterNativeCID;
    RenderOpenGles20 _openGLRenderer;
    uint8_t* _buffer;
    int32_t _id; // channel id, used only for WEBRTC_TRACE logging
    int32_t _width;
    int32_t _height;
};
static JavaVM* g_jvm = NULL;
static AndroidNativeOpenGl2Channel* p_opengl_channel = NULL;

extern "C"
{
    JNIEXPORT jint JNI_OnLoad(JavaVM* vm, void* reserved)
    {
        JNIEnv* env = NULL;
        if (vm->GetEnv((void**) &env, JNI_VERSION_1_4) != JNI_OK)
            return -1;
        g_jvm = vm;
        return JNI_VERSION_1_4;
    }
}
- extern "C"
- {
- int mTrans = 0x0F0F0F0F;
- int MergeBuffer(uint8_t *NalBuf, int NalBufUsed, uint8_t *SockBuf, int SockBufUsed, int SockRemain)
- {
- //把读取的数剧分割成NAL块
- int i = 0;
- char Temp;
- for (i = 0; i < SockRemain; i++) {
- Temp = SockBuf[i + SockBufUsed];
- NalBuf[i + NalBufUsed] = Temp;
- mTrans <<= 8;
- mTrans |= Temp;
- if (mTrans == 1) // 找到一个开始字
- {
- i++;
- break;
- }
- }
- return i;
- }
    JNIEXPORT void JNICALL Java_com_example_filltriangle_GL2JNILib_init
        (JNIEnv* env, jclass oclass, jobject glSurface)
    {
        if (p_opengl_channel)
        {
            WEBRTC_TRACE(kTraceInfo, kTraceVideoRenderer, -1, "Already initialized [%d].", __LINE__);
            return;
        }
        p_opengl_channel = new AndroidNativeOpenGl2Channel(g_jvm, glSurface);
        if (p_opengl_channel->Init() != 0)
        {
            WEBRTC_TRACE(kTraceInfo, kTraceVideoRenderer, -1, "Initialization failed [%d].", __LINE__);
            delete p_opengl_channel; // don't leak a half-initialized channel
            p_opengl_channel = NULL;
            return;
        }
    }
    JNIEXPORT void JNICALL Java_com_example_filltriangle_GL2JNILib_step(JNIEnv* env, jclass tis, jstring filepath)
    {
        const char* filename = env->GetStringUTFChars(filepath, NULL);
        WEBRTC_TRACE(kTraceInfo, kTraceVideoRenderer, -1, "step[%d].", __LINE__);
        FILE* _imgFileHandle = fopen(filename, "rb");
        if (_imgFileHandle == NULL)
        {
            WEBRTC_TRACE(kTraceInfo, kTraceVideoRenderer, -1, "File does not exist [%s][%d].", filename, __LINE__);
            env->ReleaseStringUTFChars(filepath, filename);
            return;
        }
        H264Decoder* pMyH264 = new H264Decoder();
        X264_DECODER_H handle = pMyH264->X264Decoder_Init();
        if (handle <= 0)
        {
            WEBRTC_TRACE(kTraceInfo, kTraceVideoRenderer, -1, "X264Decoder_Init error [%d].", __LINE__);
            fclose(_imgFileHandle);
            delete pMyH264;
            env->ReleaseStringUTFChars(filepath, filename);
            return;
        }
        int iTemp = 0;
        int nalLen;
        int bytesRead = 0;
        int NalBufUsed = 0;
        int SockBufUsed = 0;
        bool bFirst = true;
        bool bFindPPS = true;
        uint8_t* SockBuf = (uint8_t*)malloc(204800);
        uint8_t* NalBuf = (uint8_t*)malloc(4098000);
        int nWidth, nHeight;
        memset(SockBuf, 0, 204800);
        uint8_t* buffOut = NULL;
        int outSize = 0;
        p_opengl_channel->GetDataBuf(buffOut, outSize);
        uint8_t* IIBuf = (uint8_t*)malloc(204800);
        int IILen = 0;
        do {
            bytesRead = fread(SockBuf, 1, 204800, _imgFileHandle);
            WEBRTC_TRACE(kTraceInfo, kTraceVideoRenderer, -1, "bytesRead = %d", bytesRead);
            if (bytesRead <= 0) {
                break;
            }
            SockBufUsed = 0;
            while (bytesRead - SockBufUsed > 0) {
                nalLen = MergeBuffer(NalBuf, NalBufUsed, SockBuf, SockBufUsed,
                                     bytesRead - SockBufUsed);
                NalBufUsed += nalLen;
                SockBufUsed += nalLen;
                while (mTrans == 1) {
                    mTrans = 0xFFFFFFFF;
                    if (bFirst == true) // The first start code.
                    {
                        bFirst = false;
                    }
                    else // A complete NAL unit, including the trailing 0x00000001.
                    {
                        if (bFindPPS == true) // Wait for an SPS (7) or PPS (8) before decoding.
                        {
                            if ((NalBuf[4] & 0x1F) == 7 || (NalBuf[4] & 0x1F) == 8)
                            {
                                bFindPPS = false;
                            }
                            else
                            {
                                NalBuf[0] = 0;
                                NalBuf[1] = 0;
                                NalBuf[2] = 0;
                                NalBuf[3] = 1;
                                NalBufUsed = 4;
                                break;
                            }
                        }
                        // Heuristic from the original post: NAL units of these
                        // exact sizes (parameter sets in the author's test
                        // stream) are buffered and submitted with the next slice.
                        if (NalBufUsed == 16 || NalBufUsed == 10 || NalBufUsed == 54 || NalBufUsed == 12 || NalBufUsed == 20) {
                            memcpy(IIBuf + IILen, NalBuf, NalBufUsed);
                            IILen += NalBufUsed;
                        }
                        else
                        {
                            memcpy(IIBuf + IILen, NalBuf, NalBufUsed);
                            IILen += NalBufUsed;
                            // Decode the accumulated NAL data.
                            iTemp = pMyH264->X264Decoder_Decode(handle, (uint8_t*)IIBuf,
                                                                IILen, (uint8_t*)buffOut,
                                                                outSize, &nWidth, &nHeight);
                            if (iTemp == 0) {
                                WEBRTC_TRACE(kTraceInfo, kTraceVideoRenderer, -1, "Decode succeeded, width: %d, height: %d.", nWidth, nHeight);
                                p_opengl_channel->DeliverFrame(nWidth, nHeight);
                            }
                            else
                            {
                                WEBRTC_TRACE(kTraceInfo, kTraceVideoRenderer, -1, "Decode failed.");
                            }
                            IILen = 0;
                        }
                    }
                    // Re-arm the NAL buffer with a fresh start code.
                    NalBuf[0] = 0;
                    NalBuf[1] = 0;
                    NalBuf[2] = 0;
                    NalBuf[3] = 1;
                    NalBufUsed = 4;
                }
            }
        } while (bytesRead > 0);
        fclose(_imgFileHandle);
        pMyH264->X264Decoder_UnInit(handle);
        free(SockBuf);
        free(NalBuf);
        free(IIBuf); // was leaked in the original
        delete pMyH264;
        env->ReleaseStringUTFChars(filepath, filename);
    }
}
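One note on test.cpp: the attach-if-needed/detach boilerplate around GetEnv appears three times (destructor, Init(), DeliverFrame()). If you refactor this, a small RAII helper keeps it in one place. This is a sketch of my own, not code from the original post:

#include <jni.h>

// Attaches the calling thread to the JVM if it is not already attached,
// and detaches it again on scope exit.
class ScopedJniEnv {
public:
    explicit ScopedJniEnv(JavaVM* jvm) : _jvm(jvm), _env(NULL), _attached(false) {
        if (_jvm->GetEnv((void**)&_env, JNI_VERSION_1_4) != JNI_OK) {
            if (_jvm->AttachCurrentThread(&_env, NULL) == JNI_OK) {
                _attached = true;
            } else {
                _env = NULL;
            }
        }
    }
    ~ScopedJniEnv() {
        if (_attached) {
            _jvm->DetachCurrentThread();
        }
    }
    JNIEnv* get() const { return _env; } // NULL if attaching failed
private:
    JavaVM* _jvm;
    JNIEnv* _env;
    bool _attached;
};

With this, DeliverFrame() would reduce to constructing a ScopedJniEnv, checking get() for NULL, and making the CallVoidMethod call.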
render_opengles20.cpp
#include <GLES2/gl2.h>
#include <GLES2/gl2ext.h>
#include <stdio.h>
#include <stdlib.h>
#include "render_opengles20.h"

const char RenderOpenGles20::g_indices[] = { 0, 3, 2, 0, 2, 1 };

const char RenderOpenGles20::g_vertextShader[] = {
    "attribute vec4 aPosition;\n"
    "attribute vec2 aTextureCoord;\n"
    "varying vec2 vTextureCoord;\n"
    "void main() {\n"
    "  gl_Position = aPosition;\n"
    "  vTextureCoord = aTextureCoord;\n"
    "}\n" };

// The fragment shader: samples the Y, U and V planes and does the
// BT.601 YUV-to-RGB conversion per fragment.
const char RenderOpenGles20::g_fragmentShader[] = {
    "precision mediump float;\n"
    "uniform sampler2D Ytex;\n"
    "uniform sampler2D Utex,Vtex;\n"
    "varying vec2 vTextureCoord;\n"
    "void main(void) {\n"
    "  float nx,ny,r,g,b,y,u,v;\n"
    "  nx=vTextureCoord[0];\n"
    "  ny=vTextureCoord[1];\n"
    "  y=texture2D(Ytex,vec2(nx,ny)).r;\n"
    "  u=texture2D(Utex,vec2(nx,ny)).r;\n"
    "  v=texture2D(Vtex,vec2(nx,ny)).r;\n"
    "  y=1.1643*(y-0.0625);\n"
    "  u=u-0.5;\n"
    "  v=v-0.5;\n"
    "  r=y+1.5958*v;\n"
    "  g=y-0.39173*u-0.81290*v;\n"
    "  b=y+2.017*u;\n"
    "  gl_FragColor=vec4(r,g,b,1.0);\n"
    "}\n" };
RenderOpenGles20::RenderOpenGles20() :
    _id(0),
    _textureWidth(-1),
    _textureHeight(-1)
{
    WEBRTC_TRACE(kTraceDebug, kTraceVideoRenderer, _id, "%s: id %d",
                 __FUNCTION__, (int) _id);
    const GLfloat vertices[20] = {
        // X, Y, Z, U, V
        -1, -1, 0, 1, 0, // Bottom Left
         1, -1, 0, 0, 0, // Bottom Right
         1,  1, 0, 0, 1, // Top Right
        -1,  1, 0, 1, 1  // Top Left
    };
    memcpy(_vertices, vertices, sizeof(_vertices));
}

RenderOpenGles20::~RenderOpenGles20() {
    glDeleteTextures(3, _textureIds);
}
int32_t RenderOpenGles20::Setup(int32_t width, int32_t height) {
    WEBRTC_TRACE(kTraceDebug, kTraceVideoRenderer, _id,
                 "%s: width %d, height %d", __FUNCTION__, (int) width,
                 (int) height);
    printGLString("Version", GL_VERSION);
    printGLString("Vendor", GL_VENDOR);
    printGLString("Renderer", GL_RENDERER);
    printGLString("Extensions", GL_EXTENSIONS);

    int maxTextureImageUnits[2];
    int maxTextureSize[2];
    glGetIntegerv(GL_MAX_TEXTURE_IMAGE_UNITS, maxTextureImageUnits);
    glGetIntegerv(GL_MAX_TEXTURE_SIZE, maxTextureSize);
    WEBRTC_TRACE(kTraceDebug, kTraceVideoRenderer, _id,
                 "%s: number of textures %d, size %d", __FUNCTION__,
                 (int) maxTextureImageUnits[0], (int) maxTextureSize[0]);

    _program = createProgram(g_vertextShader, g_fragmentShader);
    if (!_program) {
        WEBRTC_TRACE(kTraceError, kTraceVideoRenderer, _id,
                     "%s: Could not create program", __FUNCTION__);
        return -1;
    }

    int positionHandle = glGetAttribLocation(_program, "aPosition");
    checkGlError("glGetAttribLocation aPosition");
    if (positionHandle == -1) {
        WEBRTC_TRACE(kTraceError, kTraceVideoRenderer, _id,
                     "%s: Could not get aPosition handle", __FUNCTION__);
        return -1;
    }
    int textureHandle = glGetAttribLocation(_program, "aTextureCoord");
    checkGlError("glGetAttribLocation aTextureCoord");
    if (textureHandle == -1) {
        WEBRTC_TRACE(kTraceError, kTraceVideoRenderer, _id,
                     "%s: Could not get aTextureCoord handle", __FUNCTION__);
        return -1;
    }

    // Set the vertices array in the shader.
    // _vertices contains 4 vertices with 5 coordinates each:
    // 3 (xyz) for the position and 2 for the texture coordinate.
    glVertexAttribPointer(positionHandle, 3, GL_FLOAT, false,
                          5 * sizeof(GLfloat), _vertices);
    checkGlError("glVertexAttribPointer aPosition");
    glEnableVertexAttribArray(positionHandle);
    checkGlError("glEnableVertexAttribArray positionHandle");

    // Set the texture coordinate array in the shader.
    glVertexAttribPointer(textureHandle, 2, GL_FLOAT, false,
                          5 * sizeof(GLfloat), &_vertices[3]);
    checkGlError("glVertexAttribPointer maTextureHandle");
    glEnableVertexAttribArray(textureHandle);
    checkGlError("glEnableVertexAttribArray textureHandle");

    glUseProgram(_program);
    int i = glGetUniformLocation(_program, "Ytex");
    checkGlError("glGetUniformLocation");
    glUniform1i(i, 0); /* Bind Ytex to texture unit 0 */
    checkGlError("glUniform1i Ytex");
    i = glGetUniformLocation(_program, "Utex");
    checkGlError("glGetUniformLocation Utex");
    glUniform1i(i, 1); /* Bind Utex to texture unit 1 */
    checkGlError("glUniform1i Utex");
    i = glGetUniformLocation(_program, "Vtex");
    checkGlError("glGetUniformLocation");
    glUniform1i(i, 2); /* Bind Vtex to texture unit 2 */
    checkGlError("glUniform1i");

    glViewport(0, 0, width, height);
    checkGlError("glViewport");
    return 0;
}
// SetCoordinates
// Sets the coordinates where the stream shall be rendered.
// Values must be between 0 and 1.
int32_t RenderOpenGles20::SetCoordinates(int32_t zOrder,
                                         const float left,
                                         const float top,
                                         const float right,
                                         const float bottom) {
    if ((top > 1 || top < 0) || (right > 1 || right < 0) ||
        (bottom > 1 || bottom < 0) || (left > 1 || left < 0)) {
        WEBRTC_TRACE(kTraceError, kTraceVideoRenderer, _id,
                     "%s: Wrong coordinates", __FUNCTION__);
        return -1;
    }
    // Vertex layout: X, Y, Z, U, V (only X, Y, Z are updated here).
    // Bottom Left
    _vertices[0] = (left * 2) - 1;
    _vertices[1] = -1 * (2 * bottom) + 1;
    _vertices[2] = zOrder;
    // Bottom Right
    _vertices[5] = (right * 2) - 1;
    _vertices[6] = -1 * (2 * bottom) + 1;
    _vertices[7] = zOrder;
    // Top Right
    _vertices[10] = (right * 2) - 1;
    _vertices[11] = -1 * (2 * top) + 1;
    _vertices[12] = zOrder;
    // Top Left
    _vertices[15] = (left * 2) - 1;
    _vertices[16] = -1 * (2 * top) + 1;
    _vertices[17] = zOrder;
    return 0;
}
GLuint RenderOpenGles20::loadShader(GLenum shaderType, const char* pSource)
{
    GLuint shader = glCreateShader(shaderType);
    if (shader) {
        glShaderSource(shader, 1, &pSource, NULL);
        glCompileShader(shader);
        GLint compiled = 0;
        glGetShaderiv(shader, GL_COMPILE_STATUS, &compiled);
        if (!compiled) {
            GLint infoLen = 0;
            glGetShaderiv(shader, GL_INFO_LOG_LENGTH, &infoLen);
            if (infoLen) {
                char* buf = (char*) malloc(infoLen);
                if (buf) {
                    glGetShaderInfoLog(shader, infoLen, NULL, buf);
                    WEBRTC_TRACE(kTraceError, kTraceVideoRenderer, _id,
                                 "%s: Could not compile shader %d: %s",
                                 __FUNCTION__, shaderType, buf);
                    free(buf);
                }
            }
            // Delete the failed shader whether or not a log was available
            // (the original only deleted it inside the infoLen check).
            glDeleteShader(shader);
            shader = 0;
        }
    }
    return shader;
}
GLuint RenderOpenGles20::createProgram(const char* pVertexSource,
                                       const char* pFragmentSource) {
    GLuint vertexShader = loadShader(GL_VERTEX_SHADER, pVertexSource);
    if (!vertexShader) {
        return 0;
    }
    GLuint pixelShader = loadShader(GL_FRAGMENT_SHADER, pFragmentSource);
    if (!pixelShader) {
        return 0;
    }
    GLuint program = glCreateProgram();
    if (program) {
        glAttachShader(program, vertexShader);
        checkGlError("glAttachShader");
        glAttachShader(program, pixelShader);
        checkGlError("glAttachShader");
        glLinkProgram(program);
        GLint linkStatus = GL_FALSE;
        glGetProgramiv(program, GL_LINK_STATUS, &linkStatus);
        if (linkStatus != GL_TRUE) {
            GLint bufLength = 0;
            glGetProgramiv(program, GL_INFO_LOG_LENGTH, &bufLength);
            if (bufLength) {
                char* buf = (char*) malloc(bufLength);
                if (buf) {
                    glGetProgramInfoLog(program, bufLength, NULL, buf);
                    WEBRTC_TRACE(kTraceError, kTraceVideoRenderer, _id,
                                 "%s: Could not link program: %s",
                                 __FUNCTION__, buf);
                    free(buf);
                }
            }
            glDeleteProgram(program);
            program = 0;
        }
    }
    return program;
}
void RenderOpenGles20::printGLString(const char* name, GLenum s) {
    const char* v = (const char*) glGetString(s);
    WEBRTC_TRACE(kTraceDebug, kTraceVideoRenderer, _id, "GL %s = %s\n",
                 name, v);
}

void RenderOpenGles20::checkGlError(const char* op) {
#ifdef ANDROID_LOG
    for (GLint error = glGetError(); error; error = glGetError()) {
        WEBRTC_TRACE(kTraceError, kTraceVideoRenderer, _id,
                     "after %s() glError (0x%x)\n", op, error);
    }
#else
    return;
#endif
}
static void InitializeTexture(int name, int id, int width, int height) {
    glActiveTexture(name);
    glBindTexture(GL_TEXTURE_2D, id);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    // One-channel (luminance) texture, sized for the plane it will hold.
    glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, width, height, 0,
                 GL_LUMINANCE, GL_UNSIGNED_BYTE, NULL);
}

// Uploads a plane of pixel data, accounting for stride != width*bpp.
// Note: this helper is defined but unused below; UpdateTextures() uploads
// directly and therefore assumes tightly packed planes (stride == width).
static void GlTexSubImage2D(GLsizei width, GLsizei height, int stride,
                            const uint8_t* plane) {
    if (stride == width) {
        // We can upload the entire plane in a single GL call.
        glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, width, height, GL_LUMINANCE,
                        GL_UNSIGNED_BYTE,
                        static_cast<const GLvoid*>(plane));
    } else {
        // GLES2 doesn't have GL_UNPACK_ROW_LENGTH and Android doesn't have
        // GL_EXT_unpack_subimage, so we have to upload a row at a time.
        for (int row = 0; row < height; ++row) {
            glTexSubImage2D(GL_TEXTURE_2D, 0, 0, row, width, 1, GL_LUMINANCE,
                            GL_UNSIGNED_BYTE,
                            static_cast<const GLvoid*>(plane + (row * stride)));
        }
    }
}
int32_t RenderOpenGles20::Render(void* data, int32_t width, int32_t height)
{
    WEBRTC_TRACE(kTraceDebug, kTraceVideoRenderer, _id, "%s: id %d",
                 __FUNCTION__, (int) _id);
    glUseProgram(_program);
    checkGlError("glUseProgram");
    // (Re)allocate the textures if the frame size changed.
    if (_textureWidth != (GLsizei) width || _textureHeight != (GLsizei) height) {
        SetupTextures(width, height);
    }
    UpdateTextures(data, width, height);
    glDrawElements(GL_TRIANGLES, 6, GL_UNSIGNED_BYTE, g_indices);
    checkGlError("glDrawElements");
    return 0;
}

void RenderOpenGles20::SetupTextures(int32_t width, int32_t height)
{
    glDeleteTextures(3, _textureIds);
    glGenTextures(3, _textureIds); // Generate the Y, U and V textures.
    InitializeTexture(GL_TEXTURE0, _textureIds[0], width, height);
    InitializeTexture(GL_TEXTURE1, _textureIds[1], width / 2, height / 2);
    InitializeTexture(GL_TEXTURE2, _textureIds[2], width / 2, height / 2);
    checkGlError("SetupTextures");
    _textureWidth = width;
    _textureHeight = height;
}

void RenderOpenGles20::UpdateTextures(void* data, int32_t width, int32_t height)
{
    glActiveTexture(GL_TEXTURE0);
    glBindTexture(GL_TEXTURE_2D, _textureIds[0]);
    glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, width, height, GL_LUMINANCE,
                    GL_UNSIGNED_BYTE, data);
    glActiveTexture(GL_TEXTURE1);
    glBindTexture(GL_TEXTURE_2D, _textureIds[1]);
    glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, width / 2, height / 2, GL_LUMINANCE,
                    GL_UNSIGNED_BYTE, (char*)data + width * height);
    glActiveTexture(GL_TEXTURE2);
    glBindTexture(GL_TEXTURE_2D, _textureIds[2]);
    glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, width / 2, height / 2, GL_LUMINANCE,
                    GL_UNSIGNED_BYTE, (char*)data + width * height * 5 / 4);
    checkGlError("UpdateTextures");
}
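UpdateTextures() assumes the decoder writes a tightly packed I420 frame into the buffer: a full-resolution Y plane followed by quarter-resolution U and V planes, for width * height * 3 / 2 bytes in total. The plane offsets it computes correspond to this little helper (a sketch of mine for illustration; the struct and function names are not from the original post):

#include <cstdint>

struct I420Planes {
    const uint8_t* y; // width x height bytes
    const uint8_t* u; // (width/2) x (height/2) bytes
    const uint8_t* v; // (width/2) x (height/2) bytes
};

// Compute the plane pointers inside a tightly packed I420 buffer.
static I420Planes GetI420Planes(const uint8_t* data, int width, int height) {
    I420Planes p;
    p.y = data;
    p.u = data + width * height;          // U starts right after Y
    p.v = data + width * height * 5 / 4;  // V starts after the U plane
    return p;
}

If your decoder produces planes whose stride differs from the width (ffmpeg often does), copy row by row into this packed layout first, or use the stride-aware GlTexSubImage2D() helper above.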
H264Decoder.cpp (the decoding code was posted in an earlier entry on this blog, so it is not repeated here)
Step 3: build the JNI code (e.g. with ndk-build) to produce the .so file.
Step 4: copy the generated .so into the Android project. Here is my Activity code:
package com.example.filltriangle;

import hzcw.opengl.ViERenderer;
import android.app.Activity;
import android.os.Bundle;
import android.util.Log;
import android.view.SurfaceView;

public class FillTriangle extends Activity {
    private SurfaceView mView = null;

    static {
        System.loadLibrary("MICloudPub");
    }

    @Override protected void onCreate(Bundle icicle) {
        super.onCreate(icicle);
        mView = ViERenderer.CreateRenderer(this, true);
        if (mView == null) {
            Log.i("test", "mView is null");
        }
        setContentView(mView);
        GL2JNILib.init(mView);
        // Decode on a worker thread; rendering happens on the GL thread.
        new MyThread().start();
    }

    public class MyThread extends Thread {
        public void run() {
            GL2JNILib.step("/sdcard/test.264");
        }
    }
}
The demo simply reads a raw H.264 file, decodes it, and renders the frames on screen. (The original post closes with a screenshot of the running demo.)
A small plug, in the hope of your support:
1. I am looking for an Android development team or individuals interested in founding a startup together (Chengdu locals preferred; enthusiastic new graduates are also welcome). Contact QQ: 276775937.
2. We also take on iOS and Android application development and have many completed projects.
Original post: http://blog.csdn.net/cjj198561/article/details/34136187