Greetings everyone,
After successfully doing indexed rendering from an interleaved (position/normal) vertex buffer, I've been trying to render using a vertex array object (VAO).
My initialization method looks like this (a code sketch of these steps follows the list):
- request 1 Vertex Array handle from OpenGL (glGenVertexArrays)
- bind the obtained handle: glBindVertexArray(vaoHandle)
-- request 2 buffer handles from OpenGL (glGenBuffers)
-- bind 1st buffer handle as array buffer: glBindBuffer(GL_ARRAY_BUFFER,vboHandle)
--- enable 3 vertex attribute arrays (0,1,2)
--- bind interleaved buffer data (glBufferData)
---- set position pointer: glVertexAttribPointer(0, 3, GL_FLOAT, false, 24, 0)
---- set normal pointer: glVertexAttribPointer(1, 3, GL_FLOAT, false, 24, 12)
-- bind 2nd buffer handle as element array buffer: glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, iboHandle)
---- bind index buffer data (glBufferData)
---- set index pointer: glVertexAttribPointer(2, 1, GL_UNSIGNED_INT, false, 0, 0)
--- disable 3 vertex attribute arrays (2,1,0)
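Here is a minimal C++-style sketch of those initialization steps, transcribed as literally as I can. It assumes a GL context with VAO support (3.0+ or ARB_vertex_array_object) and a loader such as GLEW; the vertexData/indexData pointers and byte-size parameters are hypothetical names for this sketch, not my actual variables:

#include <GL/glew.h>

GLuint vaoHandle = 0, vboHandle = 0, iboHandle = 0;

void initBuffers(const GLfloat* vertexData, GLsizeiptr vertexBytes,
                 const GLuint* indexData, GLsizeiptr indexBytes)
{
    // request 1 vertex array handle and bind it
    glGenVertexArrays(1, &vaoHandle);
    glBindVertexArray(vaoHandle);

    // request 2 buffer handles
    GLuint buffers[2];
    glGenBuffers(2, buffers);
    vboHandle = buffers[0];
    iboHandle = buffers[1];

    // bind 1st handle as array buffer
    glBindBuffer(GL_ARRAY_BUFFER, vboHandle);

    // enable vertex attribute arrays 0, 1, 2
    glEnableVertexAttribArray(0);
    glEnableVertexAttribArray(1);
    glEnableVertexAttribArray(2);

    // upload interleaved data, then set position/normal pointers (24-byte stride)
    glBufferData(GL_ARRAY_BUFFER, vertexBytes, vertexData, GL_STATIC_DRAW);
    glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 24, (void*)0);
    glVertexAttribPointer(1, 3, GL_FLOAT, GL_FALSE, 24, (void*)12);

    // bind 2nd handle as element array buffer, upload indices, set "index pointer"
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, iboHandle);
    glBufferData(GL_ELEMENT_ARRAY_BUFFER, indexBytes, indexData, GL_STATIC_DRAW);
    glVertexAttribPointer(2, 1, GL_UNSIGNED_INT, GL_FALSE, 0, (void*)0);

    // disable vertex attribute arrays 2, 1, 0
    glDisableVertexAttribArray(2);
    glDisableVertexAttribArray(1);
    glDisableVertexAttribArray(0);
}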
My render attempt, using the VAO, looks like this (again sketched in code after the list):
- bind vao: glBindVertexArray(vaoHandle)
- enable 3 vertex attribute arrays (0,1,2)
-- make a draw call: glDrawElements(GL_TRIANGLES, triangleCount, GL_UNSIGNED_INT, 0)
- disable 3 vertex attribute arrays (2,1,0)
- bind default vertex array: glBindVertexArray(0)
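And a matching sketch of that render attempt; triangleCount is the same value I pass in my real draw call, turned into a parameter here for the sketch:

void renderWithVao(GLsizei triangleCount)
{
    // bind the VAO and enable attribute arrays 0, 1, 2
    glBindVertexArray(vaoHandle);
    glEnableVertexAttribArray(0);
    glEnableVertexAttribArray(1);
    glEnableVertexAttribArray(2);

    // draw call
    glDrawElements(GL_TRIANGLES, triangleCount, GL_UNSIGNED_INT, (void*)0);

    // disable attribute arrays 2, 1, 0 and bind the default vertex array
    glDisableVertexAttribArray(2);
    glDisableVertexAttribArray(1);
    glDisableVertexAttribArray(0);
    glBindVertexArray(0);
}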
Note: In my successful rendering routine I rely on glEnableClientState/glDisableClientState, glVertexPointer, and glNormalPointer after binding the buffers (also sketched below).
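For comparison, roughly what that working routine looks like; the indexCount name is hypothetical, and the offsets assume the same 24-byte interleaved layout:

void renderWithClientState(GLsizei indexCount)
{
    // bind the same interleaved VBO and the IBO
    glBindBuffer(GL_ARRAY_BUFFER, vboHandle);
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, iboHandle);

    glEnableClientState(GL_VERTEX_ARRAY);
    glEnableClientState(GL_NORMAL_ARRAY);
    glVertexPointer(3, GL_FLOAT, 24, (void*)0);   // positions at offset 0
    glNormalPointer(GL_FLOAT, 24, (void*)12);     // normals at offset 12

    glDrawElements(GL_TRIANGLES, indexCount, GL_UNSIGNED_INT, (void*)0);

    glDisableClientState(GL_NORMAL_ARRAY);
    glDisableClientState(GL_VERTEX_ARRAY);
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, 0);
    glBindBuffer(GL_ARRAY_BUFFER, 0);
}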
I would appreciate any and all feedback and suggestions.
P.S. I apologize for lack of formatting.