
Re: Need Help Solving OpenGL/GLSL 4.4 Issues

Posted by xghost on Jul 27, 2014; 12:27am
URL: https://forum.jogamp.org/Need-Help-Solving-OpenGL-GLSL-4-4-Issues-tp4032557p4032660.html

jmaasing wrote
please let us know if you figure it out :-) I'm betting it's a simple oversight or some simple thing that is done in a different order.
So I finally had more time to spend on this and I've figured out where the problem was. I'll show the diff between what was not working and what is working, and then explain where this actually originated.

First, the diff:

diff --git a/src/ray/cg/demos/demo4/Demo4.java b/src/ray/cg/demos/demo4/Demo4.java
index a59fb6a..9b2257e 100644
--- a/src/ray/cg/demos/demo4/Demo4.java
+++ b/src/ray/cg/demos/demo4/Demo4.java
@@ -101,7 +101,12 @@ public class Demo4 extends DemoFrame {
                System.out.print("[info] Compiling vertex shader...");

                int vertexShader = gl.glCreateShader(GL4.GL_VERTEX_SHADER);
-               gl.glShaderSource(vertexShader, 1, vertexShaderSource, null);
+               int[] lengths = new int[vertexShaderSource.length];
+               for (int i = 0; i < lengths.length; i++)
+                       lengths[i] = vertexShaderSource[i].length();
+
+               // install the vertex shader source to the shader object
+               gl.glShaderSource(vertexShader, vertexShaderSource.length, vertexShaderSource, lengths, 0);
                gl.glCompileShader(vertexShader);
                checkCompilationErrors(drawable);
                gl.glGetShaderiv(vertexShader, GL4.GL_COMPILE_STATUS, compiledVertShaders, 0);
@@ -115,10 +120,15 @@ public class Demo4 extends DemoFrame {

                System.out.print("[info] Compiling fragment shader...");
                int fragmentShader = gl.glCreateShader(GL4.GL_FRAGMENT_SHADER);
-               gl.glShaderSource(fragmentShader, 1, fragmentShaderSource, null);
+               lengths = new int[fragmentShaderSource.length];
+               for (int i = 0; i < lengths.length; i++)
+                       lengths[i] = fragmentShaderSource[i].length();
+
+               // install the fragment shader source to the shader object
+               gl.glShaderSource(fragmentShader, fragmentShaderSource.length, fragmentShaderSource, lengths, 0);
                gl.glCompileShader(fragmentShader);
                checkCompilationErrors(drawable);
                gl.glGetShaderiv(fragmentShader, GL4.GL_COMPILE_STATUS, compiledFragShaders, 0);

In short, it seems the following approach was to blame: gl.glShaderSource(shader, 1, shaderSource, null); I had to replace that line with the ones you see above, using a different overload of the same glShaderSource function: glShaderSource(shader, shaderSource.length, shaderSource, lengths, 0);
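
To avoid repeating the length bookkeeping for each shader, the fix can be factored into a small helper. A minimal sketch, assuming it lives in the same class as the demo code (the setShaderSource name is mine, not part of JOGL):

private static void setShaderSource(GL4 gl, int shader, String[] source) {
        // one length entry per array element; with a count of 1, anything
        // past the first element never reaches the GL
        int[] lengths = new int[source.length];
        for (int i = 0; i < source.length; i++)
                lengths[i] = source[i].length();
        gl.glShaderSource(shader, source.length, source, lengths, 0);
}

Both shaders then reduce to a single call each, e.g. setShaderSource(gl, vertexShader, vertexShaderSource);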

This originated from the OpenGL SuperBible, 6th edition. On page 19, they have a working C++ example where glShaderSource(shader, 1, shaderSource, NULL) calls are used to provide the source for both the vertex and fragment shaders. Other JOGL-based notes I was looking at followed the same approach, based on the same reference, and, AFAIK, worked fine for the notes' author.
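
For comparison, the SB6 C++ call can get away with a count of 1 because the entire source lives in a single string, and the NULL length parameter tells GL to read each string up to its terminator. A hedged sketch of the same single-string style in JOGL, assuming the source array holds one line per element (String.join needs Java 8; a StringBuilder loop does the same on older JDKs):

// collapse the lines into one string so that a count of 1 is actually correct
String joined = String.join("\n", vertexShaderSource);
gl.glShaderSource(vertexShader, 1, new String[] { joined },
                new int[] { joined.length() }, 0);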

Clearly, the C++ samples do work, because I was able to build and run them just fine. The author of the other notes I was checking (a professor who teaches advanced computer graphics), who was using an older JOGL release and following the same approach from the same reference, had no problems with this either. (I don't know which release he was using by date, but it must've been available during the Fall of 2013, definitely not the same March 2014 version I'm currently using --see previous posts.)

I'm wondering whether something changed in more recent JOGL releases, such that this worked before but not now. (The professor was using Windows, probably a 32-bit version, but I'm not sure.)

I don't really have an explanation for this, but it seems that the previous approach should've worked just as well as the current one.

As the diff shows, changing that in my demo is enough to make it work as intended, even though the previous approach works correctly outside of JOGL and, apparently, in earlier JOGL releases.

I thought this behavioral difference might come from the platform-specific native implementations, but I tested the code on both 64-bit GNU/Linux and 64-bit Windows 7 systems and saw no difference.

One remaining difference is that the professor's machine was probably a 32-bit rather than a 64-bit system. (I've taken some courses with him in the past; a laptop he was using ran 32-bit Windows 7, and I saw some JOGL-based demos running on it a few months ago, but I don't know that he was necessarily using the same laptop in 2013.)

Anyway, if anyone has any insight into why I've encountered this difference in the first place, please feel free to share. If this somehow turns out to be a previously unknown issue somewhere (e.g. the native implementations, something JOGL depends on rather than JOGL itself, etc.), then please let me know too :)
 
Thanks.

[EDIT 1: Using the original overload as gl.glShaderSource(shader, shaderSource.length, shaderSource, null); also makes it work correctly in JOGL, though it's still unknown why the JOGL behavior differs from the C++ code, where the author makes the same call passing a 1 instead and it still works.]
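
[A diagnostic sketch, not part of the demo: reading back what the driver actually stored via glGetShaderSource would show whether a given variant truncates the source. The 8192-byte buffer size is an arbitrary guess:]

byte[] buf = new byte[8192];
int[] len = new int[1];
gl.glGetShaderSource(vertexShader, buf.length, len, 0, buf, 0);
System.out.println("[debug] driver-side source (" + len[0] + " chars):");
System.out.println(new String(buf, 0, len[0]));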

[EDIT 2: See picture from SB6 example.]