IOS Dev Intro - Render YpCbCr
http://stackoverflow.com/questions/6432159/render-ypcbcr-iphone-4-camera-frame-to-an-opengl-es-2-0-texture-in-iOS-4-3
I'm trying to render a native planar image to an OpenGL ES 2.0 texture in iOS 4.3 on an iPhone 4. The texture however winds up all black. My camera is configured as such:
[videoOutput setVideoSettings:
    [NSDictionary dictionaryWithObject:
        [NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange]
                                forKey:(id)kCVPixelBufferPixelFormatTypeKey]];
and I'm passing the pixel data to my texture like this:
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, bufferWidth, bufferHeight, 0, GL_RGB_422_APPLE, GL_UNSIGNED_SHORT_8_8_REV_APPLE, CVPixelBufferGetBaseAddress(cameraFrame));
My fragment shader is:
varying highp vec2 textureCoordinate;
uniform sampler2D videoFrame;

void main()
{
    lowp vec4 color;
    color = texture2D(videoFrame, textureCoordinate);
    lowp vec3 convertedColor = vec3(-0.87075, 0.52975, -1.08175);
    convertedColor += 1.164 * color.g; // Y
    convertedColor += vec3(0.0, -0.391, 2.018) * color.b; // U
    convertedColor += vec3(1.596, -0.813, 0.0) * color.r; // V
    gl_FragColor = vec4(convertedColor, 1.0);
}
and my vertex shader is:
attribute vec4 position;
attribute vec4 inputTextureCoordinate;
varying vec2 textureCoordinate;

void main()
{
    gl_Position = position;
    textureCoordinate = inputTextureCoordinate.xy;
}
This works just fine when I'm working with a BGRA image, and my fragment shader only does
gl_FragColor = texture2D(videoFrame, textureCoordinate);
What if anything am I missing here? Thanks!
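As an aside, the constants in the fragment shader above are the standard BT.601 video-range YCbCr-to-RGB coefficients, with the Y and chroma offsets folded into the bias vector (e.g. 1.164 × 0.0625 + 1.596 × 0.5 = 0.87075). A quick sanity check of those coefficients in plain C (a sketch; the function names are mine, not from the original code):

```c
/* BT.601 video-range YCbCr -> RGB with 8-bit integer inputs and outputs.
   These are the same coefficients the fragment shader above uses, before
   the offsets are folded into its bias vector. */
static unsigned char clamp255(double x) {
    if (x < 0.0)   return 0;
    if (x > 255.0) return 255;
    return (unsigned char)(x + 0.5); /* round to nearest */
}

void yuv601_to_rgb(int y, int u, int v,
                   unsigned char *r, unsigned char *g, unsigned char *b) {
    double yp = 1.164 * (y - 16);                          /* Y, scaled from [16,235] */
    *r = clamp255(yp + 1.596 * (v - 128));                 /* V contributes to R */
    *g = clamp255(yp - 0.813 * (v - 128) - 0.391 * (u - 128));
    *b = clamp255(yp + 2.018 * (u - 128));                 /* U contributes to B */
}
```

With Y = 235 and neutral chroma (U = V = 128) this lands on white, and Y = 16 lands on black, which is what the video-range scaling should give.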
2 Answers
OK, I've got it working now. The key was passing the Y and the UV as two separate textures to the fragment shader. Here is the final shader:
#ifdef GL_ES
precision mediump float;
#endif

varying vec2 textureCoordinate;
uniform sampler2D videoFrame;
uniform sampler2D videoFrameUV;

const mat3 yuv2rgb = mat3(
    1, 0, 1.2802,
    1, -0.214821, -0.380589,
    1, 2.127982, 0
);

void main()
{
    vec3 yuv = vec3(
        1.1643 * (texture2D(videoFrame, textureCoordinate).r - 0.0625),
        texture2D(videoFrameUV, textureCoordinate).r - 0.5,
        texture2D(videoFrameUV, textureCoordinate).a - 0.5
    );
    vec3 rgb = yuv * yuv2rgb;
    gl_FragColor = vec4(rgb, 1.0);
}
You'll need to create your textures like this:
int bufferHeight = CVPixelBufferGetHeight(cameraFrame);
int bufferWidth = CVPixelBufferGetWidth(cameraFrame);

glBindTexture(GL_TEXTURE_2D, videoFrameTexture);
glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, bufferWidth, bufferHeight, 0,
             GL_LUMINANCE, GL_UNSIGNED_BYTE,
             CVPixelBufferGetBaseAddressOfPlane(cameraFrame, 0));

glBindTexture(GL_TEXTURE_2D, videoFrameTextureUV);
glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE_ALPHA, bufferWidth / 2, bufferHeight / 2, 0,
             GL_LUMINANCE_ALPHA, GL_UNSIGNED_BYTE,
             CVPixelBufferGetBaseAddressOfPlane(cameraFrame, 1));
and then pass them like this:
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, videoFrameTexture);
glActiveTexture(GL_TEXTURE1);
glBindTexture(GL_TEXTURE_2D, videoFrameTextureUV);
glActiveTexture(GL_TEXTURE0);

glUniform1i(videoFrameUniform, 0);
glUniform1i(videoFrameUniformUV, 1);
Boy am I relieved!
P.S. The values for the yuv2rgb matrix are from http://en.wikipedia.org/wiki/YUV, and I copied code from http://www.ogre3d.org/forums/viewtopic.php?f=5&t=25877 to figure out how to get the correct YUV values to begin with.
Your code appears to attempt to convert a 32-bit colour in 444-plus-unused-byte to RGBA. That's not going to work too well. I don't know of anything that outputs "YUVA", for one.
Also, I think the returned alpha channel is 0 for BGRA camera output, not 1, so I'm not sure why it works (IIRC to convert it to a CGImage you need to use AlphaNoneSkipLast).
The 420 "bi planar" output is structured something like this:
- A header telling you where the planes are (used by CVPixelBufferGetBaseAddressOfPlane() and friends)
- The Y plane: height × bytes_per_row_1 × 1 bytes
- The Cb,Cr plane: height/2 × bytes_per_row_2 × 2 bytes (2 bytes per 2×2 block).
bytes_per_row_1 is approximately width and bytes_per_row_2 is approximately width/2, but you'll want to use CVPixelBufferGetBytesPerRowOfPlane() for robustness (you also might want to check the results of ...GetHeightOfPlane and ...GetWidthOfPlane).
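The layout described above can be sketched as plain addressing arithmetic. Here the base pointers and strides stand in for what CVPixelBufferGetBaseAddressOfPlane() and CVPixelBufferGetBytesPerRowOfPlane() would return; the function name is illustrative, not part of any API:

```c
#include <stddef.h>
#include <stdint.h>

/* Fetch the Y, Cb, Cr values for pixel (x, y) from a 420 bi-planar buffer.
   Plane 0 is full-resolution Y; plane 1 is half-resolution interleaved
   Cb,Cr pairs, one pair per 2x2 block of pixels. */
void sample_420_biplanar(const uint8_t *yBase, size_t yStride,
                         const uint8_t *cbcrBase, size_t cbcrStride,
                         int x, int y,
                         uint8_t *outY, uint8_t *outCb, uint8_t *outCr) {
    *outY = yBase[(size_t)y * yStride + x];

    /* Chroma rows cover two pixel rows each; each Cb,Cr pair covers two
       pixel columns. */
    const uint8_t *row = cbcrBase + (size_t)(y / 2) * cbcrStride;
    *outCb = row[(x / 2) * 2];
    *outCr = row[(x / 2) * 2 + 1];
}
```

Note that both strides may include padding beyond the visible width, which is exactly why the answer above recommends the ...BytesPerRowOfPlane() call over assuming tight packing.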
You might have luck treating it as a 1-component width*height texture and a 2-component width/2*height/2 texture. You'll probably want to check bytes-per-row and handle the case where it isn't simply width*number-of-components (although this is probably true for most of the video modes). AIUI, you'll also want to flush the GL context before calling CVPixelBufferUnlockBaseAddress().
Alternatively, you can copy it all to memory into your expected format (optimizing this loop might be a bit tricky). Copying has the advantage that you don't need to worry about things accessing memory after you've unlocked the pixel buffer.
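Since OpenGL ES 2.0 has no GL_UNPACK_ROW_LENGTH, glTexImage2D assumes tightly packed rows, so the bytes-per-row case mentioned above matters: when the plane's stride exceeds width × components, you'd repack into a tight buffer first. A minimal sketch of that copy (the function name is mine):

```c
#include <stddef.h>
#include <stdint.h>
#include <string.h>

/* Copy a padded plane (srcStride bytes per row) into a tightly packed
   buffer (rowBytes = width * components per row), dropping the padding.
   This is the copy you'd do before glTexImage2D when the pixel buffer's
   bytes-per-row isn't simply width * number-of-components. */
void repack_tight(const uint8_t *src, size_t srcStride,
                  uint8_t *dst, size_t rowBytes, size_t height) {
    for (size_t r = 0; r < height; r++)
        memcpy(dst + r * rowBytes, src + r * srcStride, rowBytes);
}
```

This also illustrates the answer's last point: once you own the copy, nothing touches the pixel buffer's memory after CVPixelBufferUnlockBaseAddress().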