Binding multiple textures using the jogamp Texture class
Posted by Zack on
URL: https://forum.jogamp.org/Binding-multiple-textures-using-the-jogamp-Texture-class-tp4032717.html
Hey all,
I'm working on an engine to render objects from a JSON file. It was originally exported from Blender, but it's been run through a few scripts and is in a custom format. However, I've run into a bit of a roadblock - I apologize for the lengthy post, but any pointers or advice would be greatly, greatly appreciated!
Essentially, I have a vertex buffer and an element index buffer, and I'm rendering with generic vertex attributes. I first run
gl.glGenBuffers(numberOfAttributeBuffers, attributeAddresses, offset);
and then set up each buffer like so:
{ // Bind Vertex Buffer
    verticesAddress = attributeAddresses[0];
    gl.glBindBuffer(GL2.GL_ARRAY_BUFFER, verticesAddress);
    gl.glBufferData(GL2.GL_ARRAY_BUFFER,
                    g.vertexBuffer.capacity() * Buffers.SIZEOF_FLOAT,
                    g.vertexBuffer,
                    GL2.GL_STATIC_DRAW);
    gl.glEnableVertexAttribArray(vertexSlot);
    gl.glVertexAttribPointer(vertexSlot,
                             numCoordsPerVertex,
                             GL2.GL_FLOAT,
                             normalized,
                             stride,
                             offset);
}

{ // Bind Index Array
    indexAddress = attributeAddresses[2];
    gl.glBindBuffer(GL2.GL_ELEMENT_ARRAY_BUFFER, indexAddress);
    gl.glBufferData(GL2.GL_ELEMENT_ARRAY_BUFFER,
                    g.indexBuffer.capacity() * Buffers.SIZEOF_INT,
                    g.indexBuffer,
                    GL2.GL_STATIC_DRAW);
}

{ // Bind UVs
    uvAddress = attributeAddresses[3];
    gl.glBindBuffer(GL2.GL_ARRAY_BUFFER, uvAddress);
    gl.glBufferData(GL2.GL_ARRAY_BUFFER,
                    g.uvBuffer.capacity() * Buffers.SIZEOF_FLOAT,
                    g.uvBuffer,
                    GL2.GL_STATIC_DRAW);
    gl.glEnableVertexAttribArray(uvSlot);
    gl.glVertexAttribPointer(uvSlot,
                             2,
                             GL2.GL_FLOAT,
                             normalized,
                             stride,
                             offset);
}
After compiling and linking the vertex and fragment shaders, I get the locations of each attribute with
vertexSlot = gl.glGetAttribLocation(program, "in_Vertex");
uvSlot = gl.glGetAttribLocation(program, "in_UV");
At this point, I am able to render the shape of the object just fine - my vertex/fragment shaders are just pass-throughs. My render loop is basically just this:
gl.glEnableVertexAttribArray(vertexSlot);
gl.glEnableVertexAttribArray(uvSlot);
gl.glDrawElements(GL2.GL_TRIANGLES,
                  g.indexBuffer.capacity(),
                  GL2.GL_UNSIGNED_INT,
                  0L); // byte offset into the bound GL_ELEMENT_ARRAY_BUFFER, not the client-side Buffer
The trouble I'm running into is mapping textures to the objects using UVs. Essentially, I have one UV array, where each UV is preceded by which texture it corresponds to. For example, the textures are named
texture0.jpg
texture1.jpg
texture2.jpg
texture3.jpg
and an entry in the UV array might be
2, .0834, .0271, ...
where the 2 denotes that it is a mapping for texture2, and the following floats are the UVs.
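To make the layout concrete, here is a small sketch of how I could split that combined array into one UV list per texture (this assumes each entry is a (textureIndex, u, v) triple, which is how I've described the format above; the class and method names are just placeholders):

```java
import java.util.ArrayList;
import java.util.List;

// Sketch: bucket the combined (textureIndex, u, v) array into one UV list per texture.
public class UvSplitter {
    public static List<List<Float>> split(float[] combined, int numTextures) {
        List<List<Float>> perTexture = new ArrayList<>();
        for (int i = 0; i < numTextures; i++) {
            perTexture.add(new ArrayList<>());
        }
        // Walk the array three values at a time: index, u, v.
        for (int i = 0; i + 2 < combined.length; i += 3) {
            int tex = (int) combined[i];              // which texture this UV maps to
            perTexture.get(tex).add(combined[i + 1]); // u
            perTexture.get(tex).add(combined[i + 2]); // v
        }
        return perTexture;
    }

    public static void main(String[] args) {
        float[] combined = {2f, 0.0834f, 0.0271f, 0f, 0.5f, 0.5f};
        System.out.println(split(combined, 4));
    }
}
```

Note that splitting the UVs per texture would also mean remapping the shared element index buffer, since the per-texture lists no longer line up with the original vertex order.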
Reading the textures in is no problem - I just make a Texture array and loop through it:
import com.jogamp.opengl.util.texture.Texture;

Texture[] textures = new Texture[4];
for (int k = 0; k < numTextures; k++) {
    String fileName = "texture" + k;
    textures[k] = JoglTools.getTextureFromFile(fileName, ".jpg");
    gl.glActiveTexture(GL2.GL_TEXTURE0 + k);
    textures[k].enable(gl);
    textures[k].bind(gl);
    textures[k].setTexParameteri(gl, GL2.GL_TEXTURE_WRAP_S, GL2.GL_CLAMP_TO_EDGE);
    textures[k].setTexParameteri(gl, GL2.GL_TEXTURE_WRAP_T, GL2.GL_CLAMP_TO_EDGE);
    textures[k].setTexParameteri(gl, GL2.GL_TEXTURE_MIN_FILTER, GL2.GL_LINEAR);
    // Mipmap filters such as GL_LINEAR_MIPMAP_LINEAR are only valid for MIN_FILTER;
    // MAG_FILTER must be GL_NEAREST or GL_LINEAR.
    textures[k].setTexParameteri(gl, GL2.GL_TEXTURE_MAG_FILTER, GL2.GL_LINEAR);
}
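One thing this loop doesn't do is tell the shader which texture unit each sampler reads from. If I went the multiple-uniform route, I believe the wiring would look roughly like this (a sketch only, assuming the program is already linked and a GL2 context is current; "sampler0" through "sampler3" are hypothetical uniform names):

```java
// Sketch: point each sampler uniform at the texture unit its texture was bound to.
// Assumes textures[k] was bound while GL_TEXTURE0 + k was active, as in the loop above.
gl.glUseProgram(program);
for (int k = 0; k < numTextures; k++) {
    int loc = gl.glGetUniformLocation(program, "sampler" + k); // hypothetical uniform names
    gl.glUniform1i(loc, k); // sampler k reads from texture unit GL_TEXTURE0 + k
}
```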
At this point, however, I'm not sure how I go about rendering these textures onto the object I'm drawing. It seems like I would need to handle the texture sampling in the fragment shader by passing the UVs and texture through. Mine currently looks like this:
"uniform sampler2D sampler;" +
"varying vec3 out_Color;" +
"varying vec2 out_UV;" +
"void main() {" +
"vec4 texture = texture2D(sampler, out_UV);" +
"gl_FragColor = vec4(out_Color, 0.85);" +
"}"
It seems like this would be easy with one texture - everything can be sent through in one pass. However, I am having a hard time figuring out how this would work with 4 textures. My primary question is - how does binding/enabling a texture interact with the shaders? If I only have one sampler2D uniform, will that simply take on the value of whichever texture is enabled/bound when draw is called?
Would it be better to create 4 separate uniforms - one for each texture? Or to instead separate the vertex, element index, and UV arrays into 4 separate buffers, and render them all individually?
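If it helps frame the question, the four-uniform version of the fragment shader I have in mind might look something like this (a sketch; the per-vertex texture index from my UV array would have to be passed down as a varying, which I haven't written yet):

```glsl
uniform sampler2D sampler0;
uniform sampler2D sampler1;
uniform sampler2D sampler2;
uniform sampler2D sampler3;
varying vec2 out_UV;
varying float out_TexIndex; // texture index from the UV array, forwarded by the vertex shader

void main() {
    vec4 texColor;
    // Select a sampler by comparing against the (interpolated) index.
    if      (out_TexIndex < 0.5) texColor = texture2D(sampler0, out_UV);
    else if (out_TexIndex < 1.5) texColor = texture2D(sampler1, out_UV);
    else if (out_TexIndex < 2.5) texColor = texture2D(sampler2, out_UV);
    else                         texColor = texture2D(sampler3, out_UV);
    gl_FragColor = texColor;
}
```

One caveat with this approach: a varying index only selects cleanly if all three vertices of a triangle use the same texture, since the index gets interpolated across the face; otherwise splitting into per-texture buffers and drawing each group separately is probably the safer option.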
Thanks for reading all the way through, and if you guys have any ideas, pointers, or even directions I might take to test things myself, I'd really appreciate the help!