Problems Using Wavefront .obj's texture coordinates in Android OpenGL ES


Problem description

I'm writing an Android app using OpenGL ES. I followed some online tutorials and managed to load up a textured cube using hard-coded vertices/indices/texture coordinates.

As a next step I wrote a parser for Wavefront .obj files. I made a mock file using the vertices etc. from the tutorial, which loads fine.

However, when I use a file made with a 3D modelling package, all the textures get messed up.

Below is how I'm currently getting the texture coordinates:

First I load all the texture coordinates (the vt lines) into a big vector.

Next I find the first two texture coordinates for each f triangle (so f 1/2/3 2/5/2 3/4/1 means I take the 2nd and 5th texture coordinates). Since .obj starts counting from 1 not 0, I subtract 1 from the position, then multiply it by 2 to get the x coord position in my vt array, and add 1 to that for the y coord position.
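To illustrate that arithmetic (a standalone snippet, not code from the app), the middle vertex 2/5/2 of that face would be looked up like this:

// Illustration of the index arithmetic described above (not code from the app).
// "2/5/2" means vertex 2, texture coordinate 5, normal 2 — all 1-based in .obj.
String[] vvtvn = "2/5/2".split("/");
int vt = Integer.parseInt(vvtvn[1]) - 1;            // 5 -> 4, converted to 0-based
float xcoord = textureCoordinates.get(vt * 2);      // x coord at position 8
float ycoord = textureCoordinates.get(vt * 2 + 1);  // y coord at position 9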

I take those texture coordinates that I just found and add them to another vector.

Once I've gone through all the vertices, I turn the vector into a FloatBuffer and pass that to glTexCoordPointer in my draw method.
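GraphicsUtil.getFloatBuffer isn't included below; it just converts a Vector<Float> into a direct, native-ordered FloatBuffer, roughly like this (a sketch of what such a helper typically looks like, not the actual implementation):

// Sketch of a typical Vector<Float> -> FloatBuffer helper (assumed, not the actual
// GraphicsUtil code). OpenGL ES client-side pointers need a direct buffer in native
// byte order. Requires java.nio.ByteBuffer, ByteOrder, FloatBuffer and java.util.Vector.
public static FloatBuffer getFloatBuffer(Vector<Float> values) {
    ByteBuffer bb = ByteBuffer.allocateDirect(values.size() * 4); // 4 bytes per float
    bb.order(ByteOrder.nativeOrder());
    FloatBuffer buffer = bb.asFloatBuffer();
    for (int i = 0; i < values.size(); i++) {
        buffer.put(values.get(i));
    }
    buffer.position(0);
    return buffer;
}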

Here is the code for parsing the file:

private void openObjFile(String filename, Context context, GL10 gl){

    Vector<String> lines = openFile(filename, context); // opens the file

    Vector<String[]> tokens = new Vector<String[]>();

    Vector<Float> vertices = new Vector<Float>();
    Vector<Float> textureCoordinates = new Vector<Float>();
    Vector<Float> vertexNormals = new Vector<Float>();

    // tokenise
    for(int i = 0;i<lines.size();i++){
        String line = lines.get(i);
        tokens.add(line.split(" "));
    }

    for(int j = 0;j<tokens.size();j++){
        String[] linetokens = tokens.get(j);

        // get rid of comments
        //if(linetokens[0].equalsIgnoreCase("#")){
            //tokens.remove(j);
        //}


        // get texture from .mtl file
        if(linetokens[0].equalsIgnoreCase("mtllib")){
            parseMaterials(linetokens[1],context, gl);

        }

        // vertices
        if(linetokens[0].equalsIgnoreCase("v")){
            vertices.add(Float.valueOf(linetokens[1]));
            vertices.add(Float.valueOf(linetokens[2]));
            vertices.add(Float.valueOf(linetokens[3]));
        }


        // texture coordinates
        if(linetokens[0].equalsIgnoreCase("vt")){

            textureCoordinates.add(Float.valueOf(linetokens[1]));
            textureCoordinates.add(Float.valueOf(linetokens[2]));

        }

        // vertex normals
        if(linetokens[0].equalsIgnoreCase("vn")){

            vertexNormals.add(Float.valueOf(linetokens[1]));
            vertexNormals.add(Float.valueOf(linetokens[2]));
            vertexNormals.add(Float.valueOf(linetokens[3]));
        }

    }

    // vertices     
    this.vertices = GraphicsUtil.getFloatBuffer(vertices);


    Mesh mesh = null;

    Vector<Short> indices = null;
    Vector<Float> textureCoordinatesMesh = null;
    Vector<Float> vertexNormalsMesh = null;

    for(int j = 0;j<tokens.size();j++){



        String[] linetokens = tokens.get(j);

        if(linetokens[0].equalsIgnoreCase("g")){

            if(mesh!=null){

                mesh.setIndices(GraphicsUtil.getShortBuffer(indices));
                mesh.setNumindices(indices.size());
                mesh.setNormals(GraphicsUtil.getFloatBuffer(vertexNormalsMesh));
                mesh.setTextureCoordinates(GraphicsUtil.getFloatBuffer(textureCoordinatesMesh));

                meshes.add(mesh);

            }

            mesh = new Mesh();
            indices = new Vector<Short>();
            textureCoordinatesMesh = new Vector<Float>();
            vertexNormalsMesh = new Vector<Float>();


        } else if(linetokens[0].equalsIgnoreCase("usemtl")){

            String material_name = linetokens[1];

            for(int mn = 0;mn<materials.size();mn++){

                if(materials.get(mn).getName().equalsIgnoreCase(material_name)){
                    mesh.setTextureID(materials.get(mn).getTextureID());
                    mn = materials.size();
                }

            }

        } else if(linetokens[0].equalsIgnoreCase("f")){

            for(int v = 1;v<linetokens.length;v++){

                String[] vvtvn = linetokens[v].split("/");

                short index = Short.parseShort(vvtvn[0]);
                index -= 1;                 
                indices.add(index);

                if(v!=3){
                    int texturePosition = (Integer.parseInt(vvtvn[1]) - 1) * 2;
                    float xcoord = (textureCoordinates.get(texturePosition));
                    float ycoord = (textureCoordinates.get(texturePosition+1));


                    // normalise
                    if(xcoord>1 || ycoord>1){
                        xcoord = xcoord / Math.max(xcoord, ycoord);
                        ycoord = ycoord / Math.max(xcoord, ycoord);
                    }

                    textureCoordinatesMesh.add(xcoord);
                    textureCoordinatesMesh.add(ycoord);

                }                   

                int normalPosition = (Integer.parseInt(vvtvn[2]) - 1) *3;

                vertexNormalsMesh.add(vertexNormals.get(normalPosition));
                vertexNormalsMesh.add(vertexNormals.get(normalPosition)+1);
                vertexNormalsMesh.add(vertexNormals.get(normalPosition)+2);

            }

        }

    }

    if(mesh!=null){             

        mesh.setIndices(GraphicsUtil.getShortBuffer(indices));
        mesh.setNumindices(indices.size());
        mesh.setNormals(GraphicsUtil.getFloatBuffer(vertexNormalsMesh));
        mesh.setTextureCoordinates(GraphicsUtil.getFloatBuffer(textureCoordinatesMesh));

        meshes.add(mesh);
    }// Adding the final mesh
}

And here is the code for drawing:

public void draw(GL10 gl){

    gl.glEnableClientState(GL10.GL_VERTEX_ARRAY);

    // Counter-clockwise winding.
    gl.glFrontFace(GL10.GL_CCW);
    gl.glEnable(GL10.GL_CULL_FACE);
    gl.glCullFace(GL10.GL_BACK);

    // Pass the vertex buffer in
    gl.glVertexPointer(3, GL10.GL_FLOAT, 0,
                             vertices);

    for(int i=0;i<meshes.size();i++){
        meshes.get(i).draw(gl);
    }

    // Disable the buffers

    gl.glDisableClientState(GL10.GL_VERTEX_ARRAY);

}

public void draw(GL10 gl){ // each Mesh's draw method, called from the loop above

    if(textureID>=0){

        // Enable Textures
        gl.glEnable(GL10.GL_TEXTURE_2D);

        // Get specific texture.
        gl.glBindTexture(GL10.GL_TEXTURE_2D, textureID);

        // Use UV coordinates.
        gl.glEnableClientState(GL10.GL_TEXTURE_COORD_ARRAY);

        // Pass in texture coordinates
        gl.glTexCoordPointer(2, GL10.GL_FLOAT, 0, textureCoordinates);

    } 

    // Pass in texture normals
    gl.glNormalPointer(GL10.GL_FLOAT, 0, normals);

    gl.glEnableClientState(GL10.GL_NORMAL_ARRAY);

    gl.glDrawElements(GL10.GL_TRIANGLES, numindices, GL10.GL_UNSIGNED_SHORT, indices);

    if(textureID>=0){
        // Disable buffers
        gl.glDisableClientState(GL10.GL_NORMAL_ARRAY);
        gl.glDisableClientState(GL10.GL_TEXTURE_COORD_ARRAY);
    }

}

I'd really appreciate any help with this. It is frustrating to be not quite able to load up the model from a file, and I'm really not sure what I'm doing wrong or missing here.

Answer

I have to admit to being a little confused by the framing of your code. Specific things I would expect to be an issue:


  • you decline to copy a texture coordinate to the final mesh list for the third vertex associated with any face; this should put all of your coordinates out of sync after the first two
  • your texture coordinate normalisation step is unnecessary — to the extent that I'm not sure why it's in there — and probably broken (what if xcoord is larger than ycoord on the first line, then smaller on the second?)
  • OBJ considers (0, 0) to be the top left of a texture, OpenGL considers it to be the bottom left, so unless you've set the texture matrix stack to invert texture coordinates in code not shown, you need to invert them yourself, e.g. textureCoordinatesMesh.add(1.0f - ycoord); (a sketch combining these fixes follows this list)

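Putting those bullets together, the inner face loop would look roughly like this (an untested sketch of the fixes, not drop-in code):

// Sketch of the face loop with the fixes above applied (untested):
// - a texture coordinate is copied for every vertex, including the third,
// - the normalisation block is dropped,
// - the y coordinate is flipped from OBJ's origin to OpenGL's.
for (int v = 1; v < linetokens.length; v++) {
    String[] vvtvn = linetokens[v].split("/");

    indices.add((short) (Short.parseShort(vvtvn[0]) - 1));

    int texturePosition = (Integer.parseInt(vvtvn[1]) - 1) * 2;
    textureCoordinatesMesh.add(textureCoordinates.get(texturePosition));
    textureCoordinatesMesh.add(1.0f - textureCoordinates.get(texturePosition + 1));

    int normalPosition = (Integer.parseInt(vvtvn[2]) - 1) * 3;
    vertexNormalsMesh.add(vertexNormals.get(normalPosition));
    vertexNormalsMesh.add(vertexNormals.get(normalPosition + 1));
    vertexNormalsMesh.add(vertexNormals.get(normalPosition + 2));
}

(This also moves the +1 and +2 in the normal lookups inside the get() calls; the original adds 1 and 2 to the normal's value rather than to its index, which looks like a separate slip.)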
除此之外,一般的OBJ的意见,我敢肯定你已经清楚地意识到,不涉及到这里的问题是,你应该预料到处理不提供不这样做法线和文件的文件供给要么法线或纹理坐标(您目前假设都是present)和OBJ可与顶点的任意数量的保持面,不只是三角形。但他们总是平面凸,所以你可以只画他们作为一个风扇或它们分解成三角形,好像它们是一个迷。

Besides that, generic OBJ comments that I'm sure you're already well aware of and don't relate to the problem here are that you should expect to handle files that don't supply normals and files that don't supply either normals or texture coordinates (you currently assume both are present), and OBJ can hold faces with an arbitrary number of vertices, not just triangles. But they're always planar and convex, so you can just draw them as a fan or break them into triangles as though they were a fan.
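For that last point, fan decomposition just means pairing the face's first vertex with each subsequent edge; a minimal sketch (faceVertices and emitTriangle are placeholder names, not from the code above):

// Minimal fan triangulation sketch (placeholder names, for illustration only):
// a convex face v0 v1 v2 ... v(n-1) becomes (v0,v1,v2), (v0,v2,v3), (v0,v3,v4), ...
for (int v = 2; v < faceVertices.length; v++) {
    emitTriangle(faceVertices[0], faceVertices[v - 1], faceVertices[v]);
}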
