What's the correct way to handle OpenGL extensions in shader source?


Pete
Hello again,

I've got a fragment shader which declares an extension at the top of the file and I'm having trouble loading it.

Fragment shader:

#extension GL_OES_EGL_image_external : require
uniform samplerExternalOES texture;
uniform ..
etc

And I load it in like so:

.. load vertex shader code ..

ShaderCode fragmentCode = ShaderCode.create( gl, GL2ES2.GL_FRAGMENT_SHADER, gl.getClass(), "/", null, filepath, true );
fragmentCode.defaultShaderCustomization( gl, false, true );

new ShaderProgram .. add code .. link & check status
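
Spelled out, the loading sequence looks roughly like this - vertexPath is a placeholder for my vertex shader file, and the verbose output just goes to System.err:

    // load and customize the two shader stages
    ShaderCode vertexCode = ShaderCode.create( gl, GL2ES2.GL_VERTEX_SHADER,
            gl.getClass(), "/", null, vertexPath, true );
    ShaderCode fragmentCode = ShaderCode.create( gl, GL2ES2.GL_FRAGMENT_SHADER,
            gl.getClass(), "/", null, filepath, true );
    fragmentCode.defaultShaderCustomization( gl, false, true );

    // assemble, compile and link the program
    ShaderProgram program = new ShaderProgram();
    program.add( gl, vertexCode, System.err );    // compiles the vertex shader
    program.add( gl, fragmentCode, System.err );  // compiles the fragment shader
    if ( !program.link( gl, System.err ) ) {
        throw new GLException( "Error Linking Program: " + program );
    }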

When calling ShaderProgram.link I get the following exception:

(18373): Shader status invalid: 0:6: P0001: Extension directive must occur before any non-preprocessor tokens
(18373): 0:8: L0001: Typename expected, found 'samplerExternalOES'
(18373): exception encountered processing task on thread 'pool-1-thread-3'
(18373): com.jogamp.opengl.GLException: Error Linking Program: ShaderProgram[id=4, linked=false, inUse=false, program: 10,
(18373):    ShaderCode[id=8, type=FRAGMENT_SHADER, valid=false, shader:  11, source]
(18373):    ShaderCode[id=7, type=VERTEX_SHADER, valid=false, shader:  0, source]]

So it looks to me as if calling fragmentCode.defaultShaderCustomization( gl, false, true ) has inserted the precision statements ahead of the extension directive, so the directive is no longer at the top of the source and the compilation fails.

This particular fragment shader is only loaded on Android devices. Android needs the default shader precision declared for the compiler to be happy, but most of my Linux machines reject it, so I add it dynamically at load time when I detect that the device requires it.
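
Concretely, the runtime decision is along these lines - the isGLES2() test is a stand-in for my actual device check:

    // only ask JOGL to inject default precision where the profile needs it (GLES)
    boolean needsDefaultPrecision = gl.isGLES2();
    fragmentCode.defaultShaderCustomization( gl, false /* no #version prelude */,
            needsDefaultPrecision );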

How should I handle my extension definition in my fragment source to avoid this? Have I gone about this incorrectly?

Thanks for your time.

Re: What's the correct way to handle OpenGL extensions in shader source?

Pete
Indeed, the extension directive has been preceded by the default precision modifiers - here is the source, which is now dumped whenever a link exception occurs:

W/System.err(18750): --------------------------------------------------------------
W/System.err(18750): Shader #0/1 name 11
W/System.err(18750): --------------------------------------------------------------
W/System.err(18750):    0: // Segment 0/1:
W/System.err(18750):    1:
W/System.err(18750):    2: precision highp float;
W/System.err(18750):    3: precision highp int;
W/System.err(18750):    4: /*precision lowp sampler2D;*/
W/System.err(18750):    5: /*precision lowp samplerCube;*/
W/System.err(18750):    6: #extension GL_OES_EGL_image_external : require
W/System.err(18750):    7:
W/System.err(18750):    8: uniform samplerExternalOES texture;
..
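
For the GLES compiler to accept it, the directives would have to come out in this order instead - #extension before any non-preprocessor tokens such as the precision statements:

    #extension GL_OES_EGL_image_external : require
    precision highp float;
    precision highp int;
    uniform samplerExternalOES texture;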

Re: What's the correct way to handle OpenGL extensions in shader source?

elect
In reply to this post by Pete
Hi Pete,

I am used to putting them right after the GLSL version, like here,

and I never call defaultShaderCustomization()

I don't know if this helps.

P.S.: we are looking for help with documentation about JOGL and Android - could you help by writing a wiki page or something similar about a hello-triangle sample?

Re: What's the correct way to handle OpenGL extensions in shader source?

Pete
I've confirmed it's unhappy because the precision specifiers are placed ahead of the extension directive - the #extension directive has to precede any non-preprocessor tokens. A GLSL #version line would be fine, as in your example, because it is also a preprocessor directive; but notice that in your example the precision specifiers come after the #extension lines, whereas mine are being added to the head of the file by ShaderCode.defaultShaderCustomization(..).

For now I'm working around it so I can press on, but it'd be nice to know if there is a convention I can follow. None of my other shaders contain preprocessor directives or precision modifiers - that keeps them generic and cross-platform across devices - and I add the GLSL version and/or shader precision at runtime, based on the device/OS I'm running on, when I load the shader in.

The introduction of an extension directive has somewhat derailed this approach - unless there is an API mechanism that can help me out here?
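
For what it's worth, the kind of fix-up I mean looks something like this sketch, done on the source string before it is handed to JOGL - every name in it is illustrative:

    // hoist every #extension line to the top, then add precision after them,
    // so the order becomes: #extension directives, precision, everything else
    static String customizeForGLES( String source ) {
        StringBuilder directives = new StringBuilder();
        StringBuilder body = new StringBuilder();
        for ( String line : source.split( "\n" ) ) {
            if ( line.trim().startsWith( "#extension" ) ) {
                directives.append( line ).append( '\n' );
            } else {
                body.append( line ).append( '\n' );
            }
        }
        // precision must follow the extension directives but precede all other tokens
        directives.append( "precision highp float;\n" );
        directives.append( "precision highp int;\n" );
        return directives.append( body ).toString();
    }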

P.S. As to your P.S., I'd be happy to contribute anything of worth that I can - it'd be nice to give something back :)

Re: What's the correct way to handle OpenGL extensions in shader source?

elect
I am not aware of any convention/methodology for that.

In the worst case it is a bug; we can always push a fix for that.. but hopefully others more experienced than me will answer you on this.

I guess what we are really missing is a simple, clear wiki page that explains how to set up a minimal Android environment using different IDEs (we could start with the one you are using) and then a hello triangle.. what do you think? :)

Re: What's the correct way to handle OpenGL extensions in shader source?

Pete
My starting point for JOGL on Android was this wiki page: https://jogamp.org/wiki/index.php/Maven_And_Android. Fortunately I was already running an Eclipse/Maven setup at the time; I've since moved on to NetBeans, but the setup is all the same under Maven.

One issue I've had, though, is that I've had to stick with JOGL v2.3.1, as upgrading to v2.3.2 meant my Android app no longer had the native libs within it - time restrictions have meant I've not been able to look into the cause of that yet.