Quick note on the setup: this is on a Mac with OS X 10.9.4, using the NetBeans 8.0 IDE.
Instead of doing the manual compile described in the comments, I created this as a Maven-compiled project. The dependencies are straight off the wiki, updated to version 2.2.0:

    <dependency>
        <groupId>org.jogamp.gluegen</groupId>
        <artifactId>gluegen-rt-main</artifactId>
        <version>2.2.0</version>
    </dependency>
    <dependency>
        <groupId>org.jogamp.jogl</groupId>
        <artifactId>jogl-all-main</artifactId>
        <version>2.2.0</version>
    </dependency>

It compiles without any issues. When run, it brings up the window but fails with the following output:

    =========Start Output===============
    Chosen GLCapabilities: GLCaps[rgba 8/8/8/8, trans-rgba 0x0/0/0/0, accum-rgba 0/0/0/0, dp/st/ms 16/0/0, dbl, mono, hw, GLProfile[GL2ES2/GL4.hw], on-scr[.]]
    INIT GL IS: jogamp.opengl.gl4.GL4bcImpl
    GL_VENDOR: NVIDIA Corporation
    GL_RENDERER: NVIDIA GeForce GT 650M OpenGL Engine
    GL_VERSION: 4.1 NVIDIA-8.26.26 310.40.45f01
    GL3 core detected: explicit add #version 130 to shaders
    Error compiling the vertex shader: ERROR: 0:1: '' : version '130' is not supported
    ERROR: 0:2: '' : #version required and missing.
    Exception in thread "main-AWTAnimator" com.jogamp.opengl.util.AnimatorBase$UncaughtAnimatorException: javax.media.opengl.GLException: Caught ThreadDeath: null on thread main-AWTAnimator
        at com.jogamp.opengl.util.AWTAnimatorImpl.display(AWTAnimatorImpl.java:84)
        at com.jogamp.opengl.util.AnimatorBase.display(AnimatorBase.java:446)
    ===========Truncated Output===============

The exception is a separate problem and doesn't bother me -- the init() method seems to have called System.exit() while the Animator thread was still running, which can be fixed with a little extra code. The real problem is that init() adds #version 130 to the shaders when that version isn't supported here. Any suggestions on how to fix this?
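On the ThreadDeath side issue: the "little extra code" is essentially stopping the animator from a thread other than the rendering thread before exiting. Here is a minimal sketch of that pattern; `Stoppable` is a stand-in interface for JOGL's `com.jogamp.opengl.util.Animator` so the sketch compiles without the JOGL jars, and the exact wiring into the demo is an assumption.

```java
// Sketch: shut the animator down off the rendering thread before exiting.
// Calling System.exit() inside init()/display() kills the main-AWTAnimator
// thread mid-frame, which is what produces the ThreadDeath exception.
// 'Stoppable' is a hypothetical stand-in for JOGL's Animator here.
class CleanShutdown {
    interface Stoppable {              // stand-in for GLAnimatorControl.stop()
        boolean stop();
    }

    /** Stop the animator on a fresh thread, then run the exit action. */
    static void shutdown(final Stoppable animator, final Runnable exitAction) {
        new Thread(new Runnable() {
            @Override public void run() {
                animator.stop();       // blocks until the animator thread ends
                exitAction.run();      // e.g. a runnable that calls System.exit(0)
            }
        }, "shutdown").start();
    }
}
```

With a real JOGL `Animator`, the demo's `System.exit(0)` call would be replaced by `CleanShutdown.shutdown(animator, exitRunnable)` so the render loop winds down before the JVM exits.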
I wonder what shader it tries to compile. The shader in question is probably missing a #version declaration, so JOGL helpfully tries to add one based on what it thinks is correct, which of course it isn't.
This is the code from here: http://jogamp.org/git/?p=jogl-demos.git;a=blob;f=src/demos/es2/RawGL2ES2demo.java;hb=HEAD
The shader starts out like this (I left the line numbers in):

    109 private String vertexShaderString =
    110 // For GLSL 1 and 1.1 code i highly recomend to not include a
    111 // GLSL ES language #version line, GLSL ES section 3.4
    112 // Many GPU drivers refuse to compile the shader if #version is different from
    113 // the drivers internal GLSL version.
    114 //
    115 // This demo use GLSL version 1.1 (the implicit version)
    116
    117 "#if __VERSION__ >= 130\n" +       // GLSL 130+ uses in and out
    118 "  #define attribute in\n" +       // instead of attribute and varying
    119 "  #define varying out\n" +        // used by OpenGL 3 core and later.
    120 "#endif\n" +
    121
    122 "#ifdef GL_ES \n" +
    123 "precision mediump float; \n" +    // Precision Qualifiers
    124 "precision mediump int; \n" +      // GLSL ES section 4.5.2
    125 "#endif \n" +
    126
    127 "uniform mat4 uniform_Projection; \n" +   // Incomming data used by
    128 "attribute vec4 attribute_Position; \n" + // the vertex shader
    129 "attribute vec4 attribute_Color; \n" +    // uniform and attributes
    130
    131 "varying vec4 varying_Color; \n" + // Outgoing varying data
    132                                    // sent to the fragment shader
    133 "void main(void) \n" +
    134 "{ \n" +
    135 "  varying_Color = attribute_Color; \n" +
    136 "  gl_Position = uniform_Projection * attribute_Position; \n" +
    137 "} ";
    138

Then, before it is used, there is this code in the init() method:

    344 if(gl.isGL3core()){
    345     System.out.println("GL3 core detected: explicit add #version 130 to shaders");
    346     vertexShaderString = "#version 130\n"+vertexShaderString;
    347     fragmentShaderString = "#version 130\n"+fragmentShaderString;
    348 }

Since the Mac has a GL3 core context, the if statement resolves to true. Taking this logic out just ends in a different shader compile failure, which says a version is needed. I guess the question is: which version?
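For context, the reason #version 130 fails here: a core profile on OS X only accepts GLSL 150 and up (plus 330/400/410 on newer GL versions), while 130/140 belong to GL 3.0/3.1 contexts that OS X never exposes as core. A standalone sketch of the version-to-directive mapping, to make the cutoffs visible (the mapping itself is my assumption about the platform, not something taken from the demo):

```java
// Sketch: map an OpenGL context version to a GLSL #version directive,
// instead of hard-coding "#version 130" as the demo does. The OS X
// core-profile behavior (150 minimum, then NN0 tracking the GL version
// from 3.3 on) is the assumption being illustrated.
class GlslVersion {
    /** major/minor are the GL context version, e.g. 4 and 1 for GL 4.1. */
    static String directive(int major, int minor, boolean coreProfile) {
        if (!coreProfile) {
            return "";                       // legacy/compat: implicit GLSL 1.10/1.20
        }
        if (major == 3) {
            switch (minor) {
                case 0:  return "#version 130\n";   // GL 3.0 (not core on OS X)
                case 1:  return "#version 140\n";   // GL 3.1 (not core on OS X)
                case 2:  return "#version 150\n";   // GL 3.2 core, the OS X minimum
                default: return "#version 330\n";   // GL 3.3: GLSL starts tracking GL
            }
        }
        return "#version " + major + minor + "0\n"; // e.g. GL 4.1 -> "#version 410"
    }
}
```

So on the GT 650M above (GL 4.1 core), a directive of 410 (or 330/150, which 4.1 also accepts) would compile, while 130 is rejected.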
Administrator
In reply to this post by jmaasing
On 09/18/2014 08:27 AM, jmaasing [via jogamp] wrote:
> I wonder what shader it tries to compile. The shader in question is probably
> missing a #version declaration, so JOGL helpfully tries to add it based on
> what it thinks is correct, which of course it isn't.

JOGL's path to determine the GLSL version, gl.getContext().getGLSLVersionString(), is correct.

Producing the VersionNumber ctxGLSLVersion:
http://jogamp.org/git/?p=jogl.git;a=blob;f=src/jogl/classes/jogamp/opengl/GLContextImpl.java;h=a528c60be38f03aaff27c5bc4321ed22044462c7;hb=HEAD#l1171
http://jogamp.org/git/?p=jogl.git;a=blob;f=src/jogl/classes/javax/media/opengl/GLContext.java;h=6334c35d0f5c65608c608e980702358a08ded9c8;hb=HEAD#l857

Producing the GLSL version directive:
http://jogamp.org/git/?p=jogl.git;a=blob;f=src/jogl/classes/javax/media/opengl/GLContext.java;h=6334c35d0f5c65608c608e980702358a08ded9c8;hb=HEAD#l839

Our unit tests, which probe all profiles with RedSquareES2, GearsES2, etc. using this string, pass on all platforms.

~Sven
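Since JOGL already computes the right directive via getGLSLVersionString(), the demo's hard-coded "#version 130" block could simply prepend that string instead. A small standalone sketch of the prepend step (the guard against a duplicate #version line is my addition; getGLSLVersionString() itself is the real JOGL API per the links above):

```java
// Sketch: prepend the context-reported GLSL directive to a shader source
// instead of hard-coding "#version 130". In real JOGL code the directive
// would come from gl.getContext().getGLSLVersionString(); this helper only
// shows the string handling, so it runs without the JOGL jars.
class ShaderVersioning {
    /** Prepend the directive unless the source already declares a version. */
    static String prependDirective(String directive, String shaderSource) {
        if (shaderSource.startsWith("#version")) {
            return shaderSource;   // drivers reject duplicate #version lines
        }
        return directive + shaderSource;
    }
}
```

In the demo's init(), the isGL3core() block would then reduce to something like `vertexShaderString = ShaderVersioning.prependDirective(gl.getContext().getGLSLVersionString(), vertexShaderString);` and the same for the fragment shader.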
Oh, I didn't explain properly. JOGL will use the correct string, for sure. I meant that the shader code may not be adapted to GLSL 1.30, and that the shader may have failed to compile because of a #version mismatch.
I got the demo to run by modifying the code as follows:
    if(gl.isGL3core()){
        // System.out.println("GL3 core detected: explicit add #version 130 to shaders");
        // vertexShaderString = "#version 130\n"+vertexShaderString;
        // fragmentShaderString = "#version 130\n"+fragmentShaderString;
        System.out.println("GL3 core detected: explicit add #version 100 to shaders");
        vertexShaderString = "#version 100\n"+vertexShaderString;
        fragmentShaderString = "#version 100\n"+fragmentShaderString;
    }

So the GLSL compiler in the setup mentioned above seems to understand "#version 100". If I learn anything more I'll post it, but the original code should probably get a fix to cover this hardware. By the way, when the program terminates it still throws "thread death" exceptions.
I'm seeing #version 130 unsupported errors in my own program on my Mac.
I can't use #version 100 because it's too low for what I'm trying to do in the shader. I tried #version 330 instead and got:

    ERROR: 0:2: 'attribute' : syntax error: syntax error

So I'm guessing I'll have to update my shader code to the 330 syntax, but shouldn't version 130 work in GL4? Or is there something I'm misunderstanding?
I see a lot of places saying that GLSL version 130 is not supported on OS X with the legacy profile. That makes me wonder whether JOGL is failing to initialize a core profile here, despite the bugfix for it.
Well, fortunately I did manage to get it working with #version 330 after I refactored away the deprecated GLSL features. So feel free to ignore my questions about #version 130.
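For anyone hitting the same 'attribute' syntax error: in GLSL 330 core, attribute becomes in and varying becomes out (in the vertex stage). A hedged sketch of what the demo's vertex shader looks like after that port; the identifier names follow RawGL2ES2demo, but the port itself is mine, not from the demo source.

```java
// Sketch: the demo's vertex shader rewritten for GLSL 330 core.
// 'attribute' -> 'in', 'varying' -> 'out'; the #if __VERSION__ preprocessor
// tricks are no longer needed once the version is fixed at 330.
class Gl330Shaders {
    static final String VERTEX_SHADER_330 =
        "#version 330\n" +
        "uniform mat4 uniform_Projection;\n" +
        "in vec4 attribute_Position;\n" +    // was: attribute vec4 attribute_Position;
        "in vec4 attribute_Color;\n" +       // was: attribute vec4 attribute_Color;
        "out vec4 varying_Color;\n" +        // was: varying vec4 varying_Color;
        "void main(void)\n" +
        "{\n" +
        "    varying_Color = attribute_Color;\n" +
        "    gl_Position = uniform_Projection * attribute_Position;\n" +
        "}\n";
}
```

The fragment shader needs the matching changes: its varying becomes an in, and gl_FragColor is replaced by a user-declared `out vec4` variable.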