Where is glVertexAttribIPointer that doesn't take a Buffer?

Where is glVertexAttribIPointer that doesn't take a Buffer?

Jeff Quigley
Hi, I was running
gl.glVertexAttribPointer(attributeLocation, 2, GL.GL_UNSIGNED_INT, false, stride, 0)
and it refused to work until I switched to GL.GL_FLOAT, which seems very weird; if anybody has an explanation for that, I'd be glad to hear it. But my main question is something else: while fiddling with this I learned about glVertexAttribIPointer(), which seemed like something I ought to try, since I was passing ints and that function seems to be specialized for ints. But the interfaces GL4 and GL3 seem to lack glVertexAttribIPointer(int, int, int, int, long). I'm using a GL3, which should have that method. What's the problem?

Re: Where is glVertexAttribIPointer that doesn't take a Buffer?

Sven Gothel
Administrator
On 09/03/2012 12:13 AM, Jeff Quigley [via jogamp] wrote:
> Hi, I was running
> gl.glVertexAttribPointer(attributeLocation, 2, GL.GL_UNSIGNED_INT, false,
> stride, 0)
> and it refused to work until I switched to GL.GL_FLOAT which seems very weird,
> and if anybody has an explanation for that then I'd be glad to hear it,

What is the phenomenon, i.e. how do you determine that it's refused?
A 'no VBO buffer bound' exception?

> but my
> main question is something else:  In the process of fiddling with this I
> learned about glVertexAttribIPointer() which seemed like something I ought to
> try, since I was passing ints and that function seems to be specialized for
> ints or something.  But the interfaces GL4 and GL3 seem to lack
> glVertexAttribIPointer(int, int, int, int, long).  I'm using a GL3, which
> should have that method.  What's the problem?

I see: the code generation for this method doesn't match the GL3 spec,
where both glVertexAttribIPointer(..)'s and glVertexAttribPointer(..)'s
'void *pointer' argument is an offset into a bound VBO (GL_ARRAY_BUFFER).

The GL2 spec (and the GL2/GL3 compatibility profile) defines
glVertexAttribPointer(..)'s 'void *pointer' argument either as a direct
data source, _or_ as an offset if a VBO (GL_ARRAY_BUFFER) is bound.
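(A minimal sketch of the offset-based pattern described above, assuming a GL3 core context and 2012-era JOGL package names; the class and method names are illustrative, not from the original mail. The glVertexAttribPointer overload taking a long offset exists, as Jeff's own call shows, while the commented-out glVertexAttribIPointer call is exactly the overload reported missing.)

import java.nio.IntBuffer;
import javax.media.opengl.GL;
import javax.media.opengl.GL3;
import com.jogamp.common.nio.Buffers;

public class IntAttributeSetup {
    public static void setup(GL3 gl, int attribLoc, int[] data, int strideBytes) {
        // Create and fill a VBO; in a core profile every *Pointer call below
        // takes a byte offset into this bound GL_ARRAY_BUFFER.
        int[] vbo = new int[1];
        gl.glGenBuffers(1, vbo, 0);
        gl.glBindBuffer(GL.GL_ARRAY_BUFFER, vbo[0]);
        IntBuffer buf = Buffers.newDirectIntBuffer(data);
        gl.glBufferData(GL.GL_ARRAY_BUFFER, data.length * Buffers.SIZEOF_INT, buf, GL.GL_STATIC_DRAW);

        // Existing overload: float fetch path, last argument is a byte offset (0 here).
        gl.glVertexAttribPointer(attribLoc, 2, GL.GL_FLOAT, false, strideBytes, 0L);

        // The analogous integer variant -- a long offset instead of a Buffer --
        // is the overload this thread reports missing from the GL3/GL4 interfaces:
        // gl.glVertexAttribIPointer(attribLoc, 2, GL3.GL_INT, strideBytes, 0L);

        gl.glEnableVertexAttribArray(attribLoc);
    }
}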

I will earmark this; maybe you can copy/paste this email into a new bug report?
Thank you.

+++

Interface GL2GL3 (build/jogl/gensrc/classes/javax/media/opengl/GL2GL3.java):
  /** Entry point to C language function: <code> void {@native glVertexAttribIPointer}(GLuint index, GLint size, GLenum type, GLsizei stride, const GLvoid *  pointer); </code> <br>Part of <code>GL_VERSION_3_0</code>
  public void glVertexAttribIPointer(int index, int size, int type, int stride, Buffer pointer);

Implementation, for all desktop profiles (build/jogl/gensrc/classes/jogamp/opengl/gl4/GL4bcImpl.java):
  /** Entry point to C language function: <code> void {@native glVertexAttribIPointer}(GLuint index, GLint size, GLenum type, GLsizei stride, const GLvoid *  pointer); </code> <br>Part of <code>GL_VERSION_3_0</code>
  public void glVertexAttribIPointer(int index, int size, int type, int stride, Buffer pointer)  { ..

+++

~Sven




Re: Where is glVertexAttribIPointer that doesn't take a Buffer?

Jeff Quigley
Okay, done.

I already found a workaround for my main problem, so I don't need you to answer my other question.  But if you are interested, then read on.  :)

>What is the phenomenon, i.e. how do you determine that it's refused?
>A 'no VBO buffer bound' exception?

Nothing crashes or reports errors. The GL accepts my int[] full of attribute data, and then in the vertex shader, the ivec2 that's supposed to take its two values from the array ends up with a weirdly altered version of those values. 0 in the array becomes 0 in the vector, but other values get their bits shifted around strangely. I tried putting the number 0b11110111011 in the array, and when it reached the shader its bits were something like 01000100111101110110000000000000--note the many extra 0's at the right end, and the two extra 1's near the left. I thought endianness might be the problem, but when I told my IntBuffer to use the opposite byte order the problem persisted.

I figured I just had some offsets wrong somewhere, but in that case, telling glVertexAttribPointer() that I was sending GL_FLOATs should not have worked, right? I'm mystified that I can send ints while telling it I'm sending floats and still end up with the correct ints in the shader. It only sees a Buffer, right, not an IntBuffer or a FloatBuffer? So I would expect that if I sent ints and said I was sending floats, the library would interpret the int bits as floats and produce some strange floats, which maybe would then get cast back to ints so they could fit into the ivec2... or else it should just crash? I don't understand why GL_FLOAT works, and I don't understand why GL_INT and GL_UNSIGNED_INT did those weird shiftings.

Re: Where is glVertexAttribIPointer that doesn't take a Buffer?

Jeff Quigley
OH WOW I just realized that 01000100111101110110000000000000 is the bits for the float that's equal to 11110111011!  My mind is blown.
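(A quick CPU-side check of that realization, using the value 1979 = 0b11110111011 from the earlier post; the class name is just for illustration.)

public class FloatBitsCheck {
    public static void main(String[] args) {
        int value = 0b11110111011;                       // 1979, the test value above
        int bits = Float.floatToIntBits((float) value);  // raw IEEE-754 bits of 1979.0f
        System.out.println(String.format("%32s", Integer.toBinaryString(bits)).replace(' ', '0'));
        // prints 01000100111101110110000000000000 -- the pattern observed in the shader
    }
}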

Re: Where is glVertexAttribIPointer that doesn't take a Buffer?

Jeff Quigley
To make it clearer:

// in the shader:
in ivec2 someAttribute;

// in Java:
int[] attributeData = [a lot of ints];
int attributeLocation = gl.glGetAttribLocation(shaderProgram, "someAttribute");
gl.glVertexAttribPointer(attributeLocation, 2, GL.GL_FLOAT, false, stride, 0);
gl.glBufferData(GL.GL_ARRAY_BUFFER, count, IntBuffer.wrap(attributeData), GL2.GL_STREAM_DRAW);


^^ This works correctly.  I provide ints, I say I'm providing floats, and I end up with the right ivec2.  If I say I'm providing ints:
gl.glVertexAttribPointer(attributeLocation, 2, GL.GL_UNSIGNED_INT, false, stride, 0);
or
gl.glVertexAttribPointer(attributeLocation, 2, GL3.GL_INT, false, stride, 0);

then it does not work correctly.  The ivec2 (I also tried uvec2, with the same result) is supposed to contain (x, y) but instead it apparently contains (floatBitsToInt(x), floatBitsToInt(y)).
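(For completeness, a plain-Java restatement of that observation; illustrative only. The forward direction matches what the shader reportedly receives with GL_INT/GL_UNSIGNED_INT, and the round trip shows why declaring GL_FLOAT happens to deliver the original ints unchanged.)

public class MappingCheck {
    public static void main(String[] args) {
        int x = 0b11110111011;                          // 1979, the example value
        // With GL_INT / GL_UNSIGNED_INT the shader reportedly sees the bits of (float) x:
        int seen = Float.floatToIntBits((float) x);
        System.out.println(Integer.toHexString(seen)); // 44f76000
        // The raw bits of x, read as a float and written back, survive unchanged --
        // consistent with the GL_FLOAT declaration passing the original ints through:
        System.out.println(Float.floatToIntBits(Float.intBitsToFloat(x)) == x); // true
    }
}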