glDrawArrays vs glDrawElements


Problem description


Ok so I'm still struggling to get this to work. The important parts of my code are:

def __init__(self, vertices, normals, triangles):
    self.bufferVertices = glGenBuffersARB(1)
    glBindBufferARB(GL_ARRAY_BUFFER_ARB, self.bufferVertices)
    glBufferDataARB(GL_ARRAY_BUFFER_ARB, ADT.arrayByteCount(vertices), ADT.voidDataPointer(vertices), GL_STATIC_DRAW_ARB)
    self.vertices = vertices
    self.bufferNormals = glGenBuffersARB(1)
    glBindBufferARB(GL_ARRAY_BUFFER_ARB, self.bufferNormals)
    glBufferDataARB(GL_ARRAY_BUFFER_ARB, ADT.arrayByteCount(normals), ADT.voidDataPointer(normals), GL_STATIC_DRAW_ARB)
    self.normals = normals
    self.bufferTriangles = glGenBuffersARB(1)

    glBindBufferARB(GL_ELEMENT_ARRAY_BUFFER_ARB, self.bufferTriangles)
    glBufferDataARB(GL_ELEMENT_ARRAY_BUFFER_ARB, ADT.arrayByteCount(triangles), ADT.voidDataPointer(triangles), GL_STATIC_DRAW_ARB)

    self.triangles = triangles
    glDisableClientState(GL_VERTEX_ARRAY)  # (Not sure if any of the following influence anything)
    glDisableClientState(GL_NORMAL_ARRAY)
    glBindBufferARB(GL_ARRAY_BUFFER_ARB, 0)
    glBindBufferARB(GL_ELEMENT_ARRAY_BUFFER_ARB, 0)

I don't think there is anything wrong here from what I've read so far about VBOs. So now I have my vertex, normals (not used yet) and triangle-index buffers. Now for the actual draw:

def draw(self, type):
    glDisableClientState(GL_VERTEX_ARRAY)  
    glDisableClientState(GL_NORMAL_ARRAY)
    glBindBufferARB(GL_ARRAY_BUFFER_ARB, 0)
    glBindBufferARB(GL_ELEMENT_ARRAY_BUFFER_ARB, 0)
    # Again, not sure whether the lines above have any use.
    glEnableClientState(GL_VERTEX_ARRAY)         
    glBindBufferARB(GL_ARRAY_BUFFER_ARB, self.bufferVertices)
    glVertexPointer(3, GL_FLOAT, 0, None)

    glEnableClientState(GL_NORMAL_ARRAY);
    glBindBufferARB(GL_ARRAY_BUFFER_ARB, self.bufferNormals)
    glNormalPointer(GL_FLOAT, 0, None)

    if type == GL_POINTS:    
        #glDrawArrays( GL_POINTS, 0, len(self.vertices) );    
        glDrawElements(type, len(self.vertices), GL_UNSIGNED_SHORT, 0)
    else:
        #glBindBufferARB(GL_ELEMENT_ARRAY_BUFFER_ARB, self.bufferTriangles)  # (If I uncomment this it doesn't seem to make any difference?!)
        #glDrawArrays( GL_TRIANGLES, 0, len(self.triangles) );  
        glDrawElements(GL_TRIANGLES, len(self.triangles), GL_UNSIGNED_SHORT, 0)  # (What does it draw now, since GL_ELEMENT_ARRAY_BUFFER_ARB is bound to 0?!)

Now, glDrawArrays works. But in the case where I have to draw my triangles, it doesn't draw the triangles I have defined in bufferTriangles (from what I've read this is normal, since glDrawArrays doesn't use indices? Or am I wrong here?). The problem is that if I try to use glDrawElements, everything crashes with:

Exception Type:  EXC_BAD_ACCESS (SIGSEGV)
Exception Codes: KERN_INVALID_ADDRESS at 0x000000003150ebbc
Crashed Thread:  0

Thread 0 Crashed:
0   com.apple.GeForce8xxxGLDriver   0x1a3e7050 gldGetTextureLevel + 743600
1   com.apple.GeForce8xxxGLDriver   0x1a3e7563 gldGetTextureLevel + 744899
2   GLEngine                        0x1a206eee gleDrawArraysOrElements_VBO_Exec + 1950

Now, what am I missing here? From what I can understand, I'm probably passing a bad pointer somewhere? Note that even if I try glDrawElements(type, 24, GL_UNSIGNED_INT, 0) it still crashes, even though the number of triangles defined is way, way larger, so I don't think it has anything to do with the size.
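(For intuition, the difference between the two draw calls can be sketched in plain Python; this is not real GL code — list indexing stands in for the driver's vertex fetch, and `verts` is made-up data:)

```python
def draw_arrays(vertices, first, count):
    # glDrawArrays: takes `count` consecutive vertices starting at `first`,
    # no index array involved.
    return vertices[first:first + count]

def draw_elements(vertices, indices):
    # glDrawElements: fetches each vertex through an index array,
    # so vertices can be reused and reordered.
    return [vertices[i] for i in indices]

verts = ['v0', 'v1', 'v2', 'v3']
print(draw_arrays(verts, 0, 3))        # ['v0', 'v1', 'v2']
print(draw_elements(verts, [0, 2, 1])) # ['v0', 'v2', 'v1']
```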

Regards, Bogdan

EDIT: Ok, so now I've done some extra checking, and here is my current situation: I've changed len(triangles) to ADT.byteCount; no solution yet. So I checked all the data I was getting, and it's like this: the vertices array contains ~60000 * 3 = 180000 entries of GL_FLOAT type, as does the normals array. Since there are only < 62535 vertices, I'm using unsigned shorts for the triangles, so len(triangles) is ~135000. I've also changed the call to glDrawElements(GL_TRIANGLES, len(self.triangles), GL_UNSIGNED_SHORT, 0). I've also checked that all the data in the triangles array is between 0 and 62534, as I was thinking maybe some out-of-range index slipped through. What else could be wrong here? Oh, and how does glDrawElements(GL_POINTS, ...) work? Does it also need some kind of indices?

EDIT2: I've updated the code above, and as noted there, glDrawElements now draws my GL_POINTS, but I'm not sure where it gets indices from? Or are they not needed in the case of GL_POINTS? As for GL_TRIANGLES, it works like this, with glBindBufferARB(GL_ELEMENT_ARRAY_BUFFER_ARB, self.bufferTriangles) commented out; but again, what kind of indices does it take here, now that the element buffer is bound to 0?! Another thing is that glDrawElements will not draw all the points that glDrawArrays does. To explain better:

glDrawArrays( GL_POINTS, 0, len(self.vertices) );

This draws all my points correctly, whereas:

glDrawElements(type, len(self.vertices), GL_UNSIGNED_SHORT, 0)

This visibly draws far fewer points than glDrawArrays. Now the funny thing is that if I pass a size like 10 * len(self.vertices) to glDrawElements, it will draw all the points (some maybe twice or more; can I check this?), but wouldn't it then be supposed to crash?

Regards

EDIT3

Some more precise info about the arrays:

vertices - an array of floats

len(vertices) = 180000, byteCount(vertices) = 720000

triangles - an array of numpy.uint16

len(triangles) = 353439, byteCount(triangles) = 706878, min(triangles) = 0, max(triangles) = 59999, so they should be pointing to valid vertices

The drawing is done:

glDrawElements(GL_TRIANGLES, len(self.triangles) , GL_UNSIGNED_SHORT, 0)

UPDATE

Ok, just when I thought I understood how this should work, I tried to skip the VBO for the elements and went with just:

glDrawElements(GL_TRIANGLES, len(self.triangles) , GL_UNSIGNED_SHORT, ADT.voidDataPointer(self.triangles))

Now not only does this work and draw all my triangles perfectly, but the FPS is better. Shouldn't the VBO be faster? And what could cause the above approach to work, but the following to crash:

glBindBufferARB(GL_ELEMENT_ARRAY_BUFFER_ARB, self.bufferTriangles)
glBufferDataARB(GL_ELEMENT_ARRAY_BUFFER_ARB, ADT.arrayByteCount(triangles), ADT.voidDataPointer(triangles), GL_STATIC_DRAW_ARB)
glDrawElements(GL_TRIANGLES, len(self.triangles) , GL_UNSIGNED_SHORT, 0)

Solution

I have no experience with Python GL, but I think I spotted something. You use len(self.triangles) in the call to glDrawElements, so I suppose that gives you the number of indices in the triangles array. But why, then, do you use len(triangles) as the size in glBufferData, and not ADT.arrayByteCount as in the other calls? Your buffer is simply too small: it contains len(triangles) bytes, although triangles contains unsigned shorts (2 bytes each). If triangles really contained bytes (which I doubt), you would have to use GL_UNSIGNED_BYTE in glDrawElements.
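The mismatch is easy to reproduce outside of GL. A minimal sketch using Python's stdlib `array` module as a stand-in for the numpy.uint16 array in the question (the data values are made up):

```python
from array import array

# Hypothetical index data: unsigned shorts ('H'), like numpy.uint16.
triangles = array('H', [0, 1, 2, 2, 1, 3])

index_count = len(triangles)                      # what glDrawElements expects: 6
byte_count = len(triangles) * triangles.itemsize  # what glBufferData expects: 12

print(index_count, byte_count)  # 6 12
```

Passing `index_count` where `byte_count` belongs allocates a buffer half the required size, and reading past its end is exactly the kind of access that can segfault in the driver.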

EDIT: According to your edits, I have some more answers. Of course glDrawElements(GL_POINTS, ...) needs indices too. It just uses every single index to draw a point, instead of every three indices for a triangle. It's just that for points you often don't need glDrawElements, as you don't reuse vertices anyway, but you still need indices for it. It doesn't magically become a glDrawArrays call under the hood.
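In other words, to draw every vertex once as a point via glDrawElements, you would still have to supply an explicit identity index list; a sketch in plain Python (`n_vertices` is a made-up count, not from the question):

```python
n_vertices = 5

# glDrawElements(GL_POINTS, ...) consumes one index per point, so drawing
# every vertex exactly once needs the indices 0 .. n_vertices-1 spelled out.
point_indices = list(range(n_vertices))

# glDrawArrays(GL_POINTS, 0, n_vertices) implies exactly this sequence
# without any index array.
print(point_indices)  # [0, 1, 2, 3, 4]
```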

And keep in mind that the vertices array contains floats, and glDrawArrays draws vertices, so you have to draw len(vertices)/3 vertices. Just remember: an element is an index (of a single vertex), not a triangle, and a vertex is 3 floats (or whatever you specified in glVertexPointer), not just one.

But if your triangles array really contains tuples of 3 indices (and therefore len(triangles) is the triangle count, not the index count), you would have to draw 3*len(triangles) elements (indices). And if your vertices array contains vectors and not just floats, then drawing len(vertices) vertices in the glDrawArrays call is correct. It would therefore be nice to see their declarations, to be sure.
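The count arithmetic above can be checked with a toy example; a sketch assuming a flat float list and a tuple-per-triangle list (both made up for illustration):

```python
# Flat layout: 3 floats per vertex -> 4 vertices here.
vertices = [0.0] * 12
vertex_count = len(vertices) // 3    # the count to pass to glDrawArrays

# Tuple layout: one (i, j, k) triple per triangle -> 2 triangles here.
triangles = [(0, 1, 2), (0, 2, 3)]
element_count = 3 * len(triangles)   # the count to pass to glDrawElements

print(vertex_count, element_count)   # 4 6
```

Which formula applies depends entirely on whether the arrays are flat or nested, which is why the declarations matter.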
