UIImage to GLuint


Reading the Pixel Data

Our first step is to somehow hand the image data over to OpenGL.

The problem is that OpenGL doesn't accept images in the form that's handy for us as programmers (a path to a PNG). Instead, OpenGL requires you to send them as a buffer of raw pixel data, and you need to specify the exact format.

Luckily, you can get this buffer of pixel data quite easily using some built-in Quartz2D functions. If you’ve read the Core Graphics 101 tutorial series, many of these calls will look familiar.

There are four main steps to get this to work:

  1. Get a Core Graphics image reference. Since we're going to use Core Graphics to write out the raw pixel data, we need a reference to the image! This is quite simple: UIImage has a CGImage property (of type CGImageRef) we can use.
  2. Create a Core Graphics bitmap context. The next step is to create a Core Graphics bitmap context, which is a fancy way of saying a buffer in memory to store the raw pixel data.
  3. Draw the image into the context. We can do this with a single Core Graphics function call, and then the buffer will contain the raw pixel data!
  4. Send the pixel data to OpenGL. To do this, we need to create an OpenGL texture object and get its unique ID (called its "name"), and then we use a function call to pass the pixel data to OpenGL.

OK, so let’s see what this looks like in code. In OpenGLView.m, add a new method right above initWithFrame:

- (GLuint)setupTexture:(NSString *)fileName {
    // 1
    CGImageRef spriteImage = [UIImage imageNamed:fileName].CGImage;
    if (!spriteImage) {
        NSLog(@"Failed to load image %@", fileName);
        exit(1);
    }

    // 2
    size_t width = CGImageGetWidth(spriteImage);
    size_t height = CGImageGetHeight(spriteImage);

    GLubyte *spriteData = (GLubyte *)calloc(width * height * 4, sizeof(GLubyte));

    CGContextRef spriteContext = CGBitmapContextCreate(spriteData, width, height, 8, width * 4,
        CGImageGetColorSpace(spriteImage), kCGImageAlphaPremultipliedLast);

    // 3
    CGContextDrawImage(spriteContext, CGRectMake(0, 0, width, height), spriteImage);

    CGContextRelease(spriteContext);

    // 4
    GLuint texName;
    glGenTextures(1, &texName);
    glBindTexture(GL_TEXTURE_2D, texName);

    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);

    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, spriteData);

    free(spriteData);
    return texName;
}



There’s a lot of code here, so let’s go over it section by section.

1) Get a Core Graphics image reference. As you can see, this is the simplest step. We just use the imageNamed: method on UIImage, which I'm sure you've seen many times, and then access its CGImage property.

2) Create a Core Graphics bitmap context. To create a bitmap context, you have to allocate space for it yourself. Here we use CGImageGetWidth and CGImageGetHeight to get the dimensions of the image, and then allocate width*height*4 bytes.

"Why times 4?" you may wonder. When we call the function to draw the image data, it will write one byte each for red, green, blue, and alpha, so 4 bytes per pixel in total.

"Why 1 byte each?" you may wonder. Well, we tell Core Graphics to do this when we set up the context. The fourth parameter to CGBitmapContextCreate is the bits per component, and we set it to 8 bits (1 byte).
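To make the byte math concrete, here is the same allocation and context setup annotated for a hypothetical 256×256 image (the fixed numbers are just for illustration; the real method uses the image's actual dimensions):

// For a hypothetical 256x256 image:
// 256 * 256 pixels * 4 bytes per pixel = 262,144 bytes
GLubyte *spriteData = (GLubyte *)calloc(256 * 256 * 4, sizeof(GLubyte));

CGContextRef spriteContext = CGBitmapContextCreate(
    spriteData,           // the buffer we just allocated
    256, 256,             // width and height in pixels
    8,                    // bits per component: 8 bits (1 byte) each for R, G, B, A
    256 * 4,              // bytes per row: width * 4 bytes per pixel
    CGImageGetColorSpace(spriteImage),
    kCGImageAlphaPremultipliedLast);  // alpha stored last, premultiplied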

3) Draw the image into the context. This is also a pretty simple step: we just tell Core Graphics to draw the image into the specified rectangle. Since we're done with the context at this point, we can release it.

4) Send the pixel data to OpenGL. We first need to call glGenTextures to create a texture object and give us its unique ID (called “name”).

We then call glBindTexture to bind our new texture name to the current texture unit.
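A detail worth knowing here: glBindTexture attaches the texture to whichever texture unit is currently active, and that's GL_TEXTURE0 by default. As a sketch, you could make this explicit:

glActiveTexture(GL_TEXTURE0);           // select texture unit 0 (the default)
glBindTexture(GL_TEXTURE_2D, texName);  // attach our texture to that unit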

The next step is to set a texture parameter for our texture, using glTexParameteri. Here we're setting GL_TEXTURE_MIN_FILTER (the filter used when we have to shrink the texture for far-away objects) to GL_NEAREST (when drawing a pixel, choose the single closest corresponding texture pixel).

Another easy way to think of GL_NEAREST is “pixel art-like” while GL_LINEAR is “smooth”.
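If you wanted the smooth look instead, a minimal sketch of the filtering setup might be (GL_TEXTURE_MAG_FILTER covers the opposite case, where the texture has to be stretched for close-up objects):

glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);  // smooth when shrinking
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);  // smooth when stretching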

Note: Setting GL_TEXTURE_MIN_FILTER is actually required if you aren't using mipmaps (as in this case!). That's because the default minification filter samples from mipmaps, so without mipmap levels the texture is considered incomplete and nothing gets drawn. I didn't know this at first and didn't include this line, and nothing showed up. I found out later that this is actually listed in the OpenGL common mistakes. D'oh!
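If you did want mipmaps, one approach (a sketch, assuming a power-of-two image, which OpenGL ES 2.0 requires for mipmapped textures) is to let OpenGL generate the chain after uploading the base level:

glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);  // trilinear filtering
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, spriteData);
glGenerateMipmap(GL_TEXTURE_2D);  // build the full mipmap chain from level 0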

The final step is to send the pixel data buffer we created earlier over to OpenGL with glTexImage2D. When you call this function, you specify the format of the pixel data you're sending in. Here we specify GL_RGBA and GL_UNSIGNED_BYTE to say "there's one byte each for red, green, blue, and alpha."

OpenGL supports other pixel formats if you’d like (this is how the Cocos2D pixel formats work). But for this tutorial, we’ll stick with this simple case.
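For example, a 16-bit RGB565 upload would look like the sketch below. Note that rgb565Data is a hypothetical buffer: you'd first have to pack each pixel into that format yourself, which isn't shown here.

// Each pixel is packed into 16 bits: 5 bits red, 6 bits green, 5 bits blue, no alpha
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, width, height, 0, GL_RGB,
             GL_UNSIGNED_SHORT_5_6_5, rgb565Data);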

Once we've sent the image data to OpenGL, we can deallocate the pixel buffer; we don't need it anymore because OpenGL is storing the texture on the GPU. We finish by returning the texture name, which we'll need in order to refer to the texture later when drawing.
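To put the returned name to use, here is a minimal sketch of what the calling code might look like. The _floorTexture ivar, the "floor.png" filename, and the _textureUniform handle are assumptions for illustration; they aren't defined in this section:

// Somewhere during setup, e.g. in initWithFrame:
_floorTexture = [self setupTexture:@"floor.png"];  // hypothetical GLuint ivar and filename

// Later, when drawing:
glActiveTexture(GL_TEXTURE0);                 // select texture unit 0
glBindTexture(GL_TEXTURE_2D, _floorTexture);  // bind our texture to it
glUniform1i(_textureUniform, 0);              // point the shader's sampler at unit 0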


Source: http://www.raywenderlich.com/4404/opengl-es-2-0-for-iphone-tutorial-part-2-textures
