Android texture only showing solid color


Problem description

I am trying to display a single texture on a quad. I had a working VertexObject which drew a square (or any geometric shape) fine. Now I have tried expanding it to handle textures too, but the texture doesn't work: I only see the quad in one solid color.

The coordinate data is in an ArrayList:

/*the vertices' coordinates*/
public int coordCount = 0;
/*3 floats (x, y, z) per vertex*/
public ArrayList<Float> coordList = new ArrayList<Float>(coordCount);

/*the coordinates' indices (if indexed drawing is used)*/
/*maximum limit: 32767*/
private int orderCount = 0;
private ArrayList<Short> orderList = new ArrayList<Short>(orderCount);

/*textures*/
public boolean textured;
private boolean textureIsReady;
/*2 floats (u, v) per vertex*/
private ArrayList<Float> textureList = new ArrayList<Float>(coordCount);
private Bitmap bitmap;  //the image to be displayed
private int textures[]; //the generated texture ids

The buffers are initialized in the following function:

/*Drawing is based on the buffers*/
public void refreshBuffers(){
    /*Coordinates' List*/
    float coords[] = new float[coordList.size()];
    for(int i=0;i<coordList.size();i++){
        coords[i] = coordList.get(i);
    }
    // initialize vertex byte buffer for shape coordinates
    ByteBuffer bb = ByteBuffer.allocateDirect(
            // (number of coordinate values * 4 bytes per float)
            coords.length * 4);
    // use the device hardware's native byte order
    bb.order(ByteOrder.nativeOrder());

    // create a floating point buffer from the ByteBuffer
    vertexBuffer = bb.asFloatBuffer();
    // add the coordinates to the FloatBuffer
    vertexBuffer.put(coords);
    // set the buffer to read the first coordinate
    vertexBuffer.position(0);

    /*Index List*/
    short order[] = new short[orderList.size()];
    for(int i=0;i<order.length;i++){
        order[i] = orderList.get(i);
    }
    // initialize byte buffer for the draw list
    ByteBuffer dlb = ByteBuffer.allocateDirect(
            // (number of indices * 2 bytes per short)
            order.length * 2);
    dlb.order(ByteOrder.nativeOrder());
    orderBuffer = dlb.asShortBuffer();
    orderBuffer.put(order);
    orderBuffer.position(0);

    /*Texture coordinate list*/
    if(textured){
        float textureCoords[] = new float[textureList.size()];
        for(int i=0;i<textureList.size();i++){
            textureCoords[i] = textureList.get(i);
        }
        // (number of texture coordinate values * 4 bytes per float)
        ByteBuffer byteBuf = ByteBuffer.allocateDirect(textureCoords.length * 4);
        byteBuf.order(ByteOrder.nativeOrder());
        textureBuffer = byteBuf.asFloatBuffer();
        textureBuffer.put(textureCoords);
        textureBuffer.position(0);
    }
}

I load the image into the object with the following code:

public void initTexture(GL10 gl, Bitmap inBitmap){
    bitmap = inBitmap;
    loadTexture(gl);
    textureIsReady = true;
}

/*http://www.jayway.com/2010/12/30/opengl-es-tutorial-for-android-part-vi-textures/*/
public void loadTexture(GL10 gl){
    /*the id array has to be allocated before glGenTextures can fill it*/
    textures = new int[1];
    gl.glGenTextures(1, textures, 0);
    gl.glBindTexture(GL10.GL_TEXTURE_2D, textures[0]);

    gl.glTexParameterx(GL10.GL_TEXTURE_2D, 
                        GL10.GL_TEXTURE_MAG_FILTER, 
                        GL10.GL_LINEAR);

    gl.glTexParameterx(GL10.GL_TEXTURE_2D,
                        GL10.GL_TEXTURE_MIN_FILTER,
                        GL10.GL_LINEAR);

    gl.glTexParameterx(GL10.GL_TEXTURE_2D,
                        GL10.GL_TEXTURE_WRAP_S,
                        GL10.GL_CLAMP_TO_EDGE);

    gl.glTexParameterx(GL10.GL_TEXTURE_2D,
                        GL10.GL_TEXTURE_WRAP_T,
                        GL10.GL_CLAMP_TO_EDGE);

    /*bind bitmap to texture*/
    GLUtils.texImage2D(GL10.GL_TEXTURE_2D, 0, bitmap, 0);
}
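
Not shown in the question is where the Bitmap comes from. Purely as a point of reference, here is a minimal sketch under the assumption that MainActivity.bp (used in the initialization further down) is decoded from the 128x128 png in res/drawable; the resource name R.drawable.picture is a placeholder:

/*in MainActivity (assumed): decode the png once so the renderer can reach it*/
public static Bitmap bp;

@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    bp = BitmapFactory.decodeResource(getResources(), R.drawable.picture);
}

Note that initTexture itself still has to run on the GL thread with a current context (typically from the renderer's onSurfaceCreated() or onDrawFrame()), because glGenTextures and GLUtils.texImage2D act on the currently bound GL context.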

And the drawing happens based on this code:

public void draw(GL10 gl){
    if(textured && textureIsReady){
        gl.glBindTexture(GL10.GL_TEXTURE_2D, textures[0]);
        //loadTexture(gl);
        gl.glEnable(GL10.GL_TEXTURE_2D);

        gl.glEnableClientState(GL10.GL_VERTEX_ARRAY);
        gl.glEnableClientState(GL10.GL_TEXTURE_COORD_ARRAY);

        gl.glVertexPointer(3, GL10.GL_FLOAT, 0,
                vertexBuffer);
        gl.glTexCoordPointer(2, GL10.GL_FLOAT, 0, 
                textureBuffer);
    }else{
        gl.glEnableClientState(GL10.GL_VERTEX_ARRAY);
        gl.glColor4f(color[0], color[1], color[2], color[3]);
        gl.glVertexPointer(3, GL10.GL_FLOAT, 0,
                vertexBuffer);
    }
    if(!indexed){
        gl.glDrawArrays(drawMode, 0, coordCount);
    }else{
        gl.glDrawElements(drawMode, orderCount, GL10.GL_UNSIGNED_SHORT, orderBuffer);
    }

    if(textured && textureIsReady){
        gl.glDisableClientState(GL10.GL_VERTEX_ARRAY);
        gl.glDisableClientState(GL10.GL_TEXTURE_COORD_ARRAY);
        gl.glDisable(GL10.GL_TEXTURE_2D);
    }else{
        gl.glDisableClientState(GL10.GL_VERTEX_ARRAY);
    }
}

The initialization looks like this:

    pic = new VertexObject();
    pic.indexed = true;
    pic.textured = true;

    pic.initTexture(gl,MainActivity.bp);

    pic.color[0] = 0.0f;
    pic.color[1] = 0.0f;
    pic.color[2] = 0.0f;

    float inputVertex[] = {2.0f,2.0f,0.0f};
    float inputTexture[] = {0.0f,0.0f};
    pic.addTexturedVertex(inputVertex,inputTexture);
    inputVertex[0] = 2.0f;
    inputVertex[1] = 8.0f;
    inputTexture[0] = 0.0f;
    inputTexture[0] = 1.0f;
    pic.addTexturedVertex(inputVertex,inputTexture);
    inputVertex[0] = 8.0f;
    inputVertex[1] = 8.0f;
    inputTexture[0] = 1.0f;
    inputTexture[0] = 1.0f;
    pic.addTexturedVertex(inputVertex,inputTexture);
    inputVertex[0] = 8.0f;
    inputVertex[1] = 2.0f;
    inputTexture[0] = 1.0f;
    inputTexture[0] = 0.0f;
    pic.addTexturedVertex(inputVertex,inputTexture);

    pic.addIndex((short)0);
    pic.addIndex((short)1);
    pic.addIndex((short)2);
    pic.addIndex((short)0);
    pic.addIndex((short)2);
    pic.addIndex((short)3);

The coordinates are simply added to the ArrayList and then I refresh the buffers. The bitmap is valid, because it shows up in an ImageView. The image is a 128x128 png file in the drawable folder. From what I gathered the image reaches the VertexObject, but something isn't right with the texture mapping. Any pointers on what I am doing wrong?
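
The helper addTexturedVertex is not shown in the question. Purely for readability, here is a minimal sketch of what it presumably does, assuming it just copies the values into the lists (the actual implementation may differ):

/*hypothetical sketch -- the real addTexturedVertex/addIndex are not included in the question*/
public void addTexturedVertex(float[] vertex, float[] texCoord){
    /*copy the 3 position components (x, y, z)*/
    coordList.add(vertex[0]);
    coordList.add(vertex[1]);
    coordList.add(vertex[2]);
    coordCount++; //one more vertex

    /*copy the 2 texture coordinates (u, v)*/
    textureList.add(texCoord[0]);
    textureList.add(texCoord[1]);
    /*refreshBuffers() is called separately once all vertices have been added*/
}

public void addIndex(short index){
    orderList.add(index);
    orderCount++;
}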

Solution

Okay, I figured it out!

I downloaded a working example from the internet and rewrote it step by step to resemble the object presented above, checking at each step whether it still worked. It turns out the problem wasn't in the graphics code itself, because the object worked in another context with different coordinates.

Long story short: I got the texture UV mapping wrong! That's why I got the solid color; the texture was loaded, but the UV mapping wasn't correct.

The short version: in the lines

    inputVertex[0] = 2.0f;
    inputVertex[1] = 8.0f;
    inputTexture[0] = 0.0f;
    inputTexture[0] = 1.0f;

The indexing was wrong, because only the first element of inputTexture was ever updated, so every vertex ended up with the same V coordinate. There might have been some additional errors regarding the sizes of the different arrays describing the vertex coordinates, but rewriting the code along the lines of the working example fixed the problem and produced more concise code.
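
For illustration, this is how the corrected vertex setup would look; it is a sketch assuming addTexturedVertex copies both texture coordinates, with the only change being that the second element, inputTexture[1], is now set for every vertex:

    float inputVertex[] = {2.0f,2.0f,0.0f};
    float inputTexture[] = {0.0f,0.0f};
    pic.addTexturedVertex(inputVertex,inputTexture); //UV (0,0)

    inputVertex[0] = 2.0f;
    inputVertex[1] = 8.0f;
    inputTexture[0] = 0.0f;
    inputTexture[1] = 1.0f; //was inputTexture[0] in the broken version
    pic.addTexturedVertex(inputVertex,inputTexture); //UV (0,1)

    inputVertex[0] = 8.0f;
    inputVertex[1] = 8.0f;
    inputTexture[0] = 1.0f;
    inputTexture[1] = 1.0f;
    pic.addTexturedVertex(inputVertex,inputTexture); //UV (1,1)

    inputVertex[0] = 8.0f;
    inputVertex[1] = 2.0f;
    inputTexture[0] = 1.0f;
    inputTexture[1] = 0.0f;
    pic.addTexturedVertex(inputVertex,inputTexture); //UV (1,0)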
