OpenGL Learning (Part 1): Using 2D Textures



I have been playing Temple Run lately and was genuinely impressed by it, from the game design to the gameplay.

As a programmer, a common affliction is that when playing someone else's game you keep thinking how great it would be to build something like it yourself. So I started learning OpenGL.

First, a quick note on the differences between OpenGL ES 1.1 and OpenGL ES 2.0.

As part of iPhone SDK 3.0 and the iPhone 3GS, Apple upgraded the OpenGL hardware in the new devices so that it can take full advantage of the new OpenGL ES 2.0 specification.

Perhaps the biggest difference between 1.1 and 2.0 is that 2.0 adds shaders. There are other differences, of course, but this is the one most people talk about.

So what is a shader? Shading is the process by which OpenGL combines textures, vertex positions, lighting, color information, and other variables to arrive at the final color of each pixel shown on screen.

Because OpenGL ES 1.1 is a fixed-function pipeline, you cannot change the way triangles are shaded. That does not mean such effects are impossible with OpenGL ES 1.1, just that they take more work.

OpenGL ES 2.0 is a programmable pipeline, which means you can define the function that determines how the pixels inside a triangle are drawn. The OpenGL Shading Language (GLSL) is a language designed specifically for writing shaders. Once a shader is defined, you submit it to the rendering pipeline, much like submitting a texture, so different materials and objects can use different shaders.
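
As a quick illustration only (the code later in this post sticks to the ES 1.1 fixed pipeline and does not use shaders), a minimal GLSL ES vertex/fragment pair embedded as C strings might look like this; the attribute and uniform names here are made up for the example:

// Minimal GLSL ES 2.0 shader pair, embedded as C strings.
// Names (aPosition, aTexCoord, uTexture) are illustrative only.
static const char *kVertexShader =
    "attribute vec4 aPosition;\n"
    "attribute vec2 aTexCoord;\n"
    "varying vec2 vTexCoord;\n"
    "void main() {\n"
    "    vTexCoord = aTexCoord;\n"
    "    gl_Position = aPosition;\n"
    "}\n";

static const char *kFragmentShader =
    "precision mediump float;\n"
    "varying vec2 vTexCoord;\n"
    "uniform sampler2D uTexture;\n"
    "void main() {\n"
    "    gl_FragColor = texture2D(uTexture, vTexCoord);\n"
    "}\n";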


There are two ways to use OpenGL in a program: 2D textures and 3D model loading.

Let's start with loading 2D textures.

The outer call:

// called once every frame
- (void)render
{
    // save the current matrix and reset it
    glPushMatrix();
    glLoadIdentity();

    // rotate
    glRotatef(xRotation, 1.0f, 0.0f, 0.0f);
    glRotatef(yRotation, 0.0f, 1.0f, 0.0f);
    glRotatef(zRotation, 0.0f, 0.0f, 1.0f);

    // scale
    glScalef(xScale, yScale, zScale);

    // let the mesh draw itself
    [mesh render];

    // restore the matrix
    glPopMatrix();
}

The code above mainly sets up the outer transform state.
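
This snippet assumes an EAGL context, viewport, and projection matrix were already configured elsewhere (for example in the view's setup code). A minimal sketch of that one-time setup, with made-up bounds and a simple orthographic projection, might look like this:

// Hypothetical one-time setup; backingWidth/backingHeight and the ortho bounds are assumptions.
glViewport(0, 0, backingWidth, backingHeight);

glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrthof(-1.0f, 1.0f, -1.5f, 1.5f, -1.0f, 1.0f);   // simple orthographic projection

glMatrixMode(GL_MODELVIEW);
glLoadIdentity();

glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
glClear(GL_COLOR_BUFFER_BIT);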

Then, in the mesh, we need to load a texture atlas. An atlas is a single large image composed of many small images, paired with a plist file that describes each small image's size, position, and name. Once the whole atlas is loaded, the program can use the plist to pull out any of the small images and use it as a texture. The main benefit of an atlas is that it reduces the number of texture loads, which noticeably lowers system overhead.
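
For reference, the plist this code expects is an array of dictionaries whose keys match what textureForSingleData:atlasSize: reads below. A hypothetical entry (values made up; the plist root element omitted) might look like this:

<!-- One hypothetical entry of SpaceRocksAtlas.plist; key names match the code, values are made up -->
<array>
    <dict>
        <key>name</key>      <string>leftUp</string>
        <key>xLocation</key> <real>0</real>
        <key>yLocation</key> <real>0</real>
        <key>width</key>     <real>64</real>
        <key>height</key>    <real>64</real>
    </dict>
</array>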

Typically we store these textures in a shared object; whenever the program needs a texture it asks this shared object for it, and if the requested texture is not in the library yet, it is loaded on the spot. This is a form of lazy loading.
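
A minimal sketch of such a shared cache is shown below. The class and method names are made up for illustration, and the article's own code keeps its materialLibrary and quadLibrary dictionaries inside the mesh instead:

// Hypothetical shared texture cache; names are illustrative, not from the article's code.
@interface TextureCache : NSObject {
    NSMutableDictionary *textures;   // image name -> GL texture ID wrapped in an NSNumber
}
+ (TextureCache *)sharedCache;
- (GLuint)textureNamed:(NSString *)name;
- (GLuint)loadTextureFromFile:(NSString *)name;   // assumed helper; the GL upload itself is omitted here
@end

@implementation TextureCache

+ (TextureCache *)sharedCache
{
    static TextureCache *shared = nil;
    if (shared == nil) shared = [[TextureCache alloc] init];
    return shared;
}

- (GLuint)textureNamed:(NSString *)name
{
    if (textures == nil) textures = [[NSMutableDictionary alloc] init];

    NSNumber *cached = [textures objectForKey:name];
    if (cached != nil) return [cached unsignedIntValue];   // already loaded, reuse it

    // Not in the library yet: load it now (lazy loading), then remember it.
    GLuint textureID = [self loadTextureFromFile:name];
    [textures setObject:[NSNumber numberWithUnsignedInt:textureID] forKey:name];
    return textureID;
}

@end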

Finally, by applying the texture, it can be drawn onto the geometry.



The mesh.m file is as follows (I won't analyze the code line by line):

// Compute the normalized UV coordinates of one small image in the atlas
- (void)textureForSingleData:(NSDictionary *)record atlasSize:(CGSize)atlasSize
{
    GLfloat xLocation = [[record objectForKey:@"xLocation"] floatValue];
    GLfloat yLocation = [[record objectForKey:@"yLocation"] floatValue];
    GLfloat width = [[record objectForKey:@"width"] floatValue];
    GLfloat height = [[record objectForKey:@"height"] floatValue];

    // find the normalized texture coordinates
    GLfloat uMin = xLocation / atlasSize.width;
    GLfloat vMin = yLocation / atlasSize.height;
    GLfloat uMax = (xLocation + width) / atlasSize.width;
    GLfloat vMax = (yLocation + height) / atlasSize.height;

    // four UV pairs, ordered to match the quad's vertex order (a triangle strip)
    SingleTextureData *single = [[SingleTextureData alloc] init];
    single.uvCoordinates[0] = uMin;
    single.uvCoordinates[1] = vMax;

    single.uvCoordinates[2] = uMax;
    single.uvCoordinates[3] = vMax;

    single.uvCoordinates[4] = uMin;
    single.uvCoordinates[5] = vMin;

    single.uvCoordinates[6] = uMax;
    single.uvCoordinates[7] = vMin;

    [quadLibrary setObject:single forKey:[record objectForKey:@"name"]];
    [single release];
}


// load the full atlas image as one texture

- (CGSize)loadTextureImage:(NSString *)imageName materialKey:(NSString *)materialKey
{
    CGContextRef spriteContext;
    GLubyte *spriteData;
    size_t width, height;
    GLuint textureID;

    // grab the image off the file system and turn it into a CGImageRef
    UIImage *uiImage = [[UIImage alloc] initWithContentsOfFile:[[NSBundle mainBundle] pathForResource:imageName ofType:nil]];
    CGImageRef spriteImage = [uiImage CGImage];

    // Get the width and height of the image
    width = CGImageGetWidth(spriteImage);
    height = CGImageGetHeight(spriteImage);
    CGSize imageSize = CGSizeMake(width, height);

    // Texture dimensions must be a power of 2. If you write an application that allows users to supply an image,
    // you'll want to add code that checks the dimensions and takes appropriate action if they are not a power of 2.
    if (spriteImage) {
        // Allocate the memory needed for the bitmap context
        spriteData = (GLubyte *)malloc(width * height * 4);
        memset(spriteData, 0, width * height * 4);

        // Use the bitmap creation function provided by the Core Graphics framework.
        spriteContext = CGBitmapContextCreate(spriteData, width, height, 8, width * 4, CGImageGetColorSpace(spriteImage), kCGImageAlphaPremultipliedLast);

        // After you create the context, you can draw the sprite image into the context.
        CGContextDrawImage(spriteContext, CGRectMake(0.0, 0.0, (CGFloat)width, (CGFloat)height), spriteImage);

        // You don't need the context at this point, so release it to avoid memory leaks.
        CGContextRelease(spriteContext);

        // Use OpenGL ES to generate a name for the texture.
        glGenTextures(1, &textureID);

        // Bind the texture name.
        glBindTexture(GL_TEXTURE_2D, textureID);

        // Specify a 2D texture image, providing a pointer to the image data in memory
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, spriteData);

        // Release the image data
        free(spriteData);

        // Set the texture parameters: a linear (weighted average) minification filter and a nearest magnification filter
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

        // Enable use of the texture
        glEnable(GL_TEXTURE_2D);

        // Set a blending function to use
        glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA);

        // Enable blending
        glEnable(GL_BLEND);
    } else {
        [uiImage release];
        return CGSizeZero;
    }

    [uiImage release];

    if (materialLibrary == nil) materialLibrary = [[NSMutableDictionary alloc] init];

    // now put the texture ID into the library
    [materialLibrary setObject:[NSNumber numberWithUnsignedInt:textureID] forKey:materialKey];

    return imageSize;
}


- (void)loadAtlasData:(NSString *)atlasName
{
    NSAutoreleasePool *apool = [[NSAutoreleasePool alloc] init];

    if (quadLibrary == nil) quadLibrary = [[NSMutableDictionary alloc] init];

    // load the whole atlas image as a single texture
    CGSize atlasSize = [self loadTextureImage:[atlasName stringByAppendingPathExtension:@"png"] materialKey:atlasName];

    NSArray *itemData = [NSArray arrayWithContentsOfFile:[[NSBundle mainBundle] pathForResource:atlasName ofType:@"plist"]];

    // compute the UV coordinates of every small image in the atlas
    for (NSDictionary *record in itemData) {
        [self textureForSingleData:record atlasSize:atlasSize];
    }

    [apool release];
}


- (id)initWithVertexes:(CGFloat *)verts
           vertexCount:(NSInteger)vertCount
            vertexSize:(NSInteger)vertSize
           renderStyle:(GLenum)style
{
    self = [super init];
    if (self != nil) {
        self.vertexes = verts;
        self.vertexCount = vertCount;
        self.vertexSize = vertSize;
        self.renderStyle = style;
        uvCoordinates = (CGFloat *)malloc(8 * sizeof(CGFloat));
    }

    // load the atlas
    [self loadAtlasData:@"SpaceRocksAtlas"];

    return self;
}


// grabs the openGL texture ID from the library and calls the openGL bind texture method

//-(void)bindMaterial:(NSString*)materialKey

//{

// NSNumber * numberObj = [materialLibrary objectForKey:materialKey];

// if (numberObj == nil) return;

//

// GLuint textureID = [numberObj unsignedIntValue];

//    glEnable(GL_TEXTURE_2D);

// glBindTexture(GL_TEXTURE_2D, textureID);

//}


// look up the UVs of the requested small image and hand them to OpenGL
- (void)bindTexture:(NSString *)textureString {
//    NSLog(@"quadLibrary=%@", quadLibrary);
    SingleTextureData *singleObje = (SingleTextureData *)[quadLibrary objectForKey:textureString];
    if (singleObje != nil) {
        glEnableClientState(GL_TEXTURE_COORD_ARRAY);
        glTexCoordPointer(2, GL_FLOAT, 0, singleObje.uvCoordinates);
    } else {
        NSLog(@"nil texture");
    }
}


// called once every frame
- (void)render
{
    // load the arrays into the engine
    glVertexPointer(vertexSize, GL_FLOAT, 0, vertexes);
    glEnableClientState(GL_VERTEX_ARRAY);
    glColorPointer(colorSize, GL_FLOAT, 0, colors);
    glEnableClientState(GL_COLOR_ARRAY);

    // choose which small image of the atlas to use as the texture
    [self bindTexture:@"leftUp"];

    // render
    glDrawArrays(renderStyle, 0, vertexCount);
}


- (void)dealloc
{
    free(uvCoordinates);
    [quadLibrary release];
    [materialLibrary release];
    [super dealloc];
}


The code above also uses SingleTextureData to store each small image's UV coordinates, which mark the useful part of the image within the texture. (Image files are usually rectangular, but the sprites and textures we actually want are usually not neat rectangles, so the UV coordinates tell us which part of the texture image to use.)
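
The @interface of SingleTextureData is not shown in the original listing; reconstructed from the @implementation below, it would look roughly like this:

// Assumed interface, reconstructed from the @implementation below.
@interface SingleTextureData : NSObject {
    CGFloat *uvCoordinates;   // 4 UV pairs (8 floats), one pair per quad vertex
    NSString *materialKey;    // which atlas texture these UVs belong to
}
@property (nonatomic, assign) CGFloat *uvCoordinates;
@property (nonatomic, retain) NSString *materialKey;
@end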

@implementation SingleTextureData

@synthesize uvCoordinates, materialKey;

- (id)init {

    self = [super init];

    if (self != nil) {

        uvCoordinates = (CGFloat *) malloc(8 * sizeof(CGFloat));

    }

    

    return self;

}


- (void)dealloc {

    free(uvCoordinates);

    [super dealloc];

}


@end
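
To tie everything together, a caller might set up a textured quad roughly as follows. The mesh class name (OpenGLMesh here) and the vertex values are assumptions; the four UV pairs built in textureForSingleData:atlasSize: are ordered to match a GL_TRIANGLE_STRIP quad with vertices in this order:

// Hypothetical caller; the mesh class name and the vertex values are made up for illustration.
static CGFloat quadVertexes[8] = {
    -0.5f, -0.5f,   // vertex 0
     0.5f, -0.5f,   // vertex 1
    -0.5f,  0.5f,   // vertex 2
     0.5f,  0.5f,   // vertex 3
};

OpenGLMesh *mesh = [[OpenGLMesh alloc] initWithVertexes:quadVertexes
                                            vertexCount:4
                                             vertexSize:2
                                            renderStyle:GL_TRIANGLE_STRIP];
// (the color array used in the mesh's render method is assumed to be set up elsewhere)

// then, once per frame, from the outer render method:
[mesh render];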


