[OpenGL] Writing an Android Panorama Video Player from Scratch - 5.1: Using OpenGL to Map a Panoramic Video onto a Sphere


GitHub project address

For readers who cannot access GitHub, I have also packaged the project for download on CSDN, though that copy is updated less frequently.

Back to the table of contents

In Chapter 3 we used OpenGL to draw an untextured sphere. Now we need to map the video onto that sphere via texture mapping.

Is this really any different from drawing a rectangle and mapping a flat video onto it?
......
Apparently not.

The Sphere class with texture mapping

Let's go straight to the code. Building on the earlier Sphere class, we add the texture-mapping part, which associates each 3D vertex with a 2D texture coordinate.

PS:
One thing to note: when the user rotates the phone, the sphere should rotate in the opposite direction. The rotation-vector matrix reported by the sensor happens to be the inverse of the model matrix we want, so for now we do not need to change the texture order.
More precisely, the sensor's rotation matrix should be inverted before use, and the transpose of a rotation matrix is its inverse. As it happens, the sensor's matrix is row-major while OpenGL matrices are column-major, so simply reinterpreting one layout as the other transposes it, and no explicit inversion is needed (what a happy coincidence...).
Don't worry if this paragraph is unclear; the details are covered in Chapter 4 and in the next section.
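To make the coincidence concrete, here is a small plain-Java sketch (class and method names are my own, not from the project) showing that for a pure rotation matrix the transpose equals the inverse, which is exactly why reinterpreting a row-major sensor matrix as column-major OpenGL data inverts it for free:

```java
public class RotationInverseDemo {
    // Multiply two 3x3 matrices stored row-major in flat arrays.
    static float[] mul(float[] a, float[] b) {
        float[] c = new float[9];
        for (int i = 0; i < 3; i++)
            for (int j = 0; j < 3; j++)
                for (int k = 0; k < 3; k++)
                    c[i * 3 + j] += a[i * 3 + k] * b[k * 3 + j];
        return c;
    }

    // Transpose of a 3x3 matrix: this is what swapping row-major
    // and column-major interpretations does to the data.
    static float[] transpose(float[] a) {
        float[] t = new float[9];
        for (int i = 0; i < 3; i++)
            for (int j = 0; j < 3; j++)
                t[j * 3 + i] = a[i * 3 + j];
        return t;
    }

    public static void main(String[] args) {
        float c = (float) Math.cos(0.7), s = (float) Math.sin(0.7);
        // A rotation about the Z axis by 0.7 radians.
        float[] r = { c, -s, 0,  s, c, 0,  0, 0, 1 };
        // R^T * R should be the identity: the transpose is the inverse.
        float[] p = mul(transpose(r), r);
        float[] id = { 1, 0, 0,  0, 1, 0,  0, 0, 1 };
        for (int i = 0; i < 9; i++)
            if (Math.abs(p[i] - id[i]) > 1e-5f) throw new AssertionError("not identity");
        System.out.println("transpose(R) * R == I");
    }
}
```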

package com.martin.ads.panoramaopengltutorial;

import android.opengl.GLES20;

import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;
import java.nio.ShortBuffer;

import static com.martin.ads.panoramaopengltutorial.ShaderUtils.checkGlError;

/**
 * Created by Ads on 2016/11/5.
 */
public class Sphere {
    private static final int sPositionDataSize = 3;
    private static final int sTextureCoordinateDataSize = 2;

    private FloatBuffer mVerticesBuffer;
    private FloatBuffer mTexCoordinateBuffer;
    private ShortBuffer indexBuffer;
    private int mNumIndices;

    /**
     * modified from hzqiujiadi on 16/1/8.
     * original source code:
     * https://github.com/shulja/viredero/blob/a7d28b21d762e8479dc10cde1aa88054497ff649/viredroid/src/main/java/org/viredero/viredroid/Sphere.java
     * @param radius sphere radius; it should lie between the near and far planes
     * @param rings number of latitude subdivisions
     * @param sectors number of longitude subdivisions
     */
    public Sphere(float radius, int rings, int sectors) {
        final float PI = (float) Math.PI;
        final float PI_2 = (float) (Math.PI / 2);
        float R = 1f / (float) rings;
        float S = 1f / (float) sectors;
        short r, s;
        float x, y, z;

        int numPoint = (rings + 1) * (sectors + 1);
        float[] vertexs = new float[numPoint * 3];
        float[] texcoords = new float[numPoint * 2];
        short[] indices = new short[numPoint * 6];

        // texture mapping: associate a 2D texture coordinate with each 3D vertex
        int t = 0, v = 0;
        for (r = 0; r < rings + 1; r++) {
            for (s = 0; s < sectors + 1; s++) {
                x = (float) (Math.cos(2 * PI * s * S) * Math.sin(PI * r * R));
                y = (float) Math.sin(-PI_2 + PI * r * R);
                z = (float) (Math.sin(2 * PI * s * S) * Math.sin(PI * r * R));

                texcoords[t++] = s * S;
                texcoords[t++] = r * R;

                vertexs[v++] = x * radius;
                vertexs[v++] = y * radius;
                vertexs[v++] = z * radius;
            }
        }

        // index buffer for drawing the sphere with glDrawElements
        int counter = 0;
        int sectorsPlusOne = sectors + 1;
        for (r = 0; r < rings; r++) {
            for (s = 0; s < sectors; s++) {
                indices[counter++] = (short) (r * sectorsPlusOne + s);             //(a)
                indices[counter++] = (short) ((r + 1) * sectorsPlusOne + s);       //(b)
                indices[counter++] = (short) (r * sectorsPlusOne + (s + 1));       //(c)
                indices[counter++] = (short) (r * sectorsPlusOne + (s + 1));       //(c)
                indices[counter++] = (short) ((r + 1) * sectorsPlusOne + s);       //(b)
                indices[counter++] = (short) ((r + 1) * sectorsPlusOne + (s + 1)); //(d)
            }
        }

        // initialize vertex byte buffer for shape coordinates
        ByteBuffer bb = ByteBuffer.allocateDirect(
                // (# of coordinate values * 4 bytes per float)
                vertexs.length * 4);
        bb.order(ByteOrder.nativeOrder());
        FloatBuffer vertexBuffer = bb.asFloatBuffer();
        vertexBuffer.put(vertexs);
        vertexBuffer.position(0);

        // initialize byte buffer for texture coordinates
        ByteBuffer cc = ByteBuffer.allocateDirect(texcoords.length * 4);
        cc.order(ByteOrder.nativeOrder());
        FloatBuffer texBuffer = cc.asFloatBuffer();
        texBuffer.put(texcoords);
        texBuffer.position(0);

        // initialize byte buffer for the draw list
        ByteBuffer dlb = ByteBuffer.allocateDirect(
                // (# of coordinate values * 2 bytes per short)
                indices.length * 2);
        dlb.order(ByteOrder.nativeOrder());
        indexBuffer = dlb.asShortBuffer();
        indexBuffer.put(indices);
        indexBuffer.position(0);

        mTexCoordinateBuffer = texBuffer;
        mVerticesBuffer = vertexBuffer;
        mNumIndices = indices.length;
    }

    public void uploadVerticesBuffer(int positionHandle) {
        FloatBuffer vertexBuffer = getVerticesBuffer();
        if (vertexBuffer == null) return;
        vertexBuffer.position(0);
        GLES20.glVertexAttribPointer(positionHandle, sPositionDataSize, GLES20.GL_FLOAT, false, 0, vertexBuffer);
        checkGlError("glVertexAttribPointer maPosition");
        GLES20.glEnableVertexAttribArray(positionHandle);
        checkGlError("glEnableVertexAttribArray maPositionHandle");
    }

    public void uploadTexCoordinateBuffer(int textureCoordinateHandle) {
        FloatBuffer textureBuffer = getTexCoordinateBuffer();
        if (textureBuffer == null) return;
        textureBuffer.position(0);
        GLES20.glVertexAttribPointer(textureCoordinateHandle, sTextureCoordinateDataSize, GLES20.GL_FLOAT, false, 0, textureBuffer);
        checkGlError("glVertexAttribPointer maTextureHandle");
        GLES20.glEnableVertexAttribArray(textureCoordinateHandle);
        checkGlError("glEnableVertexAttribArray maTextureHandle");
    }

    public FloatBuffer getVerticesBuffer() {
        return mVerticesBuffer;
    }

    public FloatBuffer getTexCoordinateBuffer() {
        return mTexCoordinateBuffer;
    }

    public void draw() {
        indexBuffer.position(0);
        GLES20.glDrawElements(GLES20.GL_TRIANGLES, mNumIndices, GLES20.GL_UNSIGNED_SHORT, indexBuffer);
    }
}
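As a quick sanity check, the parametrization above can be run outside Android. The following standalone sketch (my own, not part of the project; it uses a small rings/sectors count for brevity) verifies that every vertex gets exactly one (u, v) pair and that the r == 0 ring sits at the bottom pole, y == -radius:

```java
public class SphereParamCheck {
    public static void main(String[] args) {
        int rings = 4, sectors = 8;
        float radius = 18f;
        float PI = (float) Math.PI;
        float R = 1f / rings, S = 1f / sectors;

        int numPoint = (rings + 1) * (sectors + 1);
        float[] vertexs = new float[numPoint * 3];
        float[] texcoords = new float[numPoint * 2];
        int t = 0, v = 0;
        for (int r = 0; r < rings + 1; r++) {
            for (int s = 0; s < sectors + 1; s++) {
                float x = (float) (Math.cos(2 * PI * s * S) * Math.sin(PI * r * R));
                float y = (float) Math.sin(-PI / 2 + PI * r * R);
                float z = (float) (Math.sin(2 * PI * s * S) * Math.sin(PI * r * R));
                texcoords[t++] = s * S;   // u runs 0..1 around each ring
                texcoords[t++] = r * R;   // v runs 0..1 from pole to pole
                vertexs[v++] = x * radius;
                vertexs[v++] = y * radius;
                vertexs[v++] = z * radius;
            }
        }
        // One (u, v) pair per 3D vertex.
        System.out.println(vertexs.length / 3 == texcoords.length / 2);
        // The first vertex (r == 0) lies at the bottom pole: y == -radius.
        System.out.println(Math.abs(vertexs[1] + radius) < 1e-4f);
    }
}
```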

Adding the projection transform

We have touched on MVP matrices before; the details will be explained in the next section.

    private float[] modelMatrix = new float[16];
    private float[] projectionMatrix = new float[16];
    private float[] viewMatrix = new float[16];
    private float[] modelViewMatrix = new float[16];
    private float[] mMVPMatrix = new float[16];

The Sphere is created in the constructor of GLRenderer:

sphere = new Sphere(18, 100, 200);

Modify onSurfaceChanged to initialize the camera viewpoint (the view matrix) and the projection matrix. At this point we no longer need to account for the video's aspect ratio, because a panoramic video's ratio is fixed (2:1), so the updateProjection function can be deleted.

    @Override
    public void onSurfaceChanged(GL10 gl, int width, int height) {
        Log.d(TAG, "onSurfaceChanged: " + width + " " + height);
        screenWidth = width;
        screenHeight = height;
        float ratio = (float) width / height;
        Matrix.perspectiveM(projectionMatrix, 0, 90, ratio, 1f, 500f);
        Matrix.setLookAtM(viewMatrix, 0,
                0.0f, 0.0f, 0.0f,
                0.0f, 0.0f, -1.0f,
                0.0f, 1.0f, 0.0f);
    }

The camera sits at the origin: 0.0f, 0.0f, 0.0f
The camera looks toward: 0.0f, 0.0f, -1.0f
The camera's up direction (the top of the viewer's head): 0.0f, 1.0f, 0.0f
The perspective projection's aspect ratio is the screen (viewport) ratio, with a 90-degree field of view, the near plane at 1 and the far plane at 500. The sphere's radius is 18, so the far plane should enclose the sphere, but it could be much smaller than 500.
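For the curious, the matrix that perspectiveM builds can be reproduced by hand. This plain-Java sketch (my own reimplementation of the standard OpenGL perspective formula, not Android's source verbatim) shows that with a 90-degree field of view the vertical scale factor f = 1/tan(45°) is exactly 1, and that a point on the far plane maps to the back of the clip volume:

```java
public class PerspectiveSketch {
    // Build a column-major perspective matrix from fovy (degrees),
    // aspect ratio, and near/far plane distances.
    static float[] perspective(float fovyDeg, float aspect, float near, float far) {
        float f = 1f / (float) Math.tan(Math.toRadians(fovyDeg) / 2.0);
        float[] m = new float[16];
        m[0] = f / aspect;                       // x scale
        m[5] = f;                                // y scale
        m[10] = (far + near) / (near - far);     // z remap
        m[11] = -1f;                             // perspective divide by -z
        m[14] = 2f * far * near / (near - far);  // z translation
        return m;
    }

    public static void main(String[] args) {
        float[] m = perspective(90f, 16f / 9f, 1f, 500f);
        // With fovy == 90 degrees, f == 1, so m[5] == 1.
        System.out.println(Math.abs(m[5] - 1f) < 1e-6f);
        // A point on the far plane (z = -500) ends up with clip z == clip w,
        // i.e. NDC z == 1, the back of the clip volume.
        float z = -500f;
        float clipZ = m[10] * z + m[14];
        float clipW = m[11] * z;
        System.out.println(Math.abs(clipZ - clipW) < 1e-2f);
    }
}
```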

Modify onDrawFrame to compute the MVP matrix and pass it to the shader:

Matrix.setIdentityM(modelMatrix, 0);
Matrix.multiplyMM(modelViewMatrix, 0, viewMatrix, 0, modelMatrix, 0);
Matrix.multiplyMM(mMVPMatrix, 0, projectionMatrix, 0, modelViewMatrix, 0);
GLES20.glUniformMatrix4fv(uMatrixHandle, 1, false, mMVPMatrix, 0);

Since we have not yet introduced sensor data, modelMatrix is set to the identity for now. The matrices are multiplied in the order projection × view × model, which is exactly the reverse of the name "MVP".
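The order matters because matrix multiplication is not commutative: the matrix written closest to the vertex is applied first. A small plain-Java sketch (column-major storage, like OpenGL; the helper names are mine) demonstrates that scale-after-translate and translate-after-scale give different results:

```java
public class MatrixOrderDemo {
    // Multiply 4x4 column-major matrices: result = a * b.
    static float[] mul(float[] a, float[] b) {
        float[] r = new float[16];
        for (int col = 0; col < 4; col++)
            for (int row = 0; row < 4; row++)
                for (int k = 0; k < 4; k++)
                    r[col * 4 + row] += a[k * 4 + row] * b[col * 4 + k];
        return r;
    }

    static float[] identity() {
        float[] m = new float[16];
        m[0] = m[5] = m[10] = m[15] = 1f;
        return m;
    }

    public static void main(String[] args) {
        float[] scale = identity();
        scale[0] = scale[5] = scale[10] = 2f;   // uniform scale by 2
        float[] translate = identity();
        translate[12] = 3f;                     // translate x by 3

        // scale * translate: the translation happens first, then gets scaled.
        float[] st = mul(scale, translate);
        // translate * scale: the scale happens first, then the translation.
        float[] ts = mul(translate, scale);

        System.out.println(st[12]); // x offset becomes 2 * 3 = 6
        System.out.println(ts[12]); // x offset stays 3
    }
}
```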

sphere.uploadVerticesBuffer(aPositionHandle);
sphere.uploadTexCoordinateBuffer(aTextureCoordHandle);

Run it and you should see the result. You cannot zoom or drag yet, and because the field of view is so wide there is some distortion near the edges.

