Four separate laptops, all running Ubuntu 11.04.

One with an ATI card, running GLX 1.4, OpenGL 3.3, GLSL 3.3, runs as expected.

One with an Nvidia card, running GLX 1.4, OpenGL 3.3, GLSL 3.3, shows no effects of the shader.

Another with an Nvidia card, running GLX 1.4, OpenGL 4.1, GLSL 4.1, runs as expected; HOWEVER, there is an identical laptop that displays the same symptoms as the non-working machine above, with no effects.

Briefly, what I'm trying to do is take images, find the parts that are blue enough to be considered ocean, and brighten those pixels, because the oceans are too dark. The package I'm using (NASA WorldWind) is written in Java, using JOGL, so I'm using JOGL to apply a custom fragment and vertex shader to create this effect.

There are a couple of strange things happening that I believe may be signs of what is causing the lack of shading.

First, if the fragment shader is changed to simply color all pixels red unconditionally, the working machines display a red sphere. The machines that don't work flash the red sphere for one draw, and then the map reverts back to the original images (the whole globe can be seen, and the oceans are dark).

Second, if the Java package is changed so that the rendering loop doesn't call glUseProgram(0), the working machines render strangely but still close to correct, while the machines that are not working show a translucent sphere in whatever color I set gl_FragColor to in the shader. This is made even stranger by the fact that the discarded pixels are ignored and shaded that color anyway.

Third, most of the debugging information I've gotten from turning JOGL debugging on is trash, but on the ATI machine I can see it retrieving a power-of-two texture quite often, while on the Nvidia machine that doesn't have an identical pair, the power-of-two texture comes up only once or twice at the beginning of the run.

The fragment shader receives a uniform sampler2D from the JOGL code and samples it with texture2D to retrieve the color of each pixel.

Fragment Shader:

    uniform sampler2D tile_image;
    uniform float brightness;
    const vec3 coef = vec3(0.2125, 0.7154, 0.0721);
    uniform float saturation;
    uniform vec4 hueToAdjust;

    vec4 shadeTile(vec4 tile_val);

    //Fragment shader. Colors every coordinate that is mostly blue to a lighter blue.
    void main (void)
    {
        if (gl_TexCoord[0].s < 0.0 || gl_TexCoord[0].s > 1.0) discard;
        if (gl_TexCoord[0].t < 0.0 || gl_TexCoord[0].t > 1.0) discard;
        vec4 tile_val = texture2D(tile_image, gl_TexCoord[0].st);

        //this if statement catches the majority of the ocean areas
        if (tile_val.b >= (tile_val.g + tile_val.r))
        {
            tile_val = shadeTile(tile_val);
        }

        gl_FragColor = vec4(tile_val.rgba);
    }

    vec4 shadeTile(vec4 tile_val)
    {
        //saturation
        vec4 intensity = vec4(dot(tile_val.rgb, coef));
        tile_val = mix(intensity, tile_val, saturation);

        //contrast
        tile_val = brightness * (1.0 - saturation) + tile_val;

        //hue adjust
        tile_val.rgba *= hueToAdjust;

        //brightness
        tile_val.rgba *= brightness;

        return tile_val;
    }

Vertex Shader:

    void main(void)
    {
        gl_TexCoord[0] = gl_TextureMatrix[0] * gl_MultiTexCoord0;
        gl_TexCoord[1] = gl_TextureMatrix[0] * gl_MultiTexCoord1;
        gl_Position = ftransform();
    }

Render Loop:

    public void render(DrawContext dc)
    {
        if (d_doRender)
        {
            GL gl = dc.getGL();
            if (d_glsl != null && d_glsl.isShaderSupported())
            {
                d_glsl.useShaders();
                d_glsl.startShader();
                gl.glUniform1i(d_glsl.getUniformLocation("tile_image"), 0);
                gl.glUniform1f(d_glsl.getUniformLocation("saturation"), d_saturation);
                gl.glUniform4f(d_glsl.getUniformLocation("hueToAdjust"),
                               d_hueToAdjust[0], d_hueToAdjust[1],
                               d_hueToAdjust[2], d_hueToAdjust[3]);
                gl.glUniform1f(d_glsl.getUniformLocation("brightness"), d_brightness);
            }
            super.render(dc);
            if (d_glsl != null && d_glsl.isShaderSupported())
            {
                d_glsl.endShader();
            }
        }
    }

It is also of note that the program and shader info logs give no information on the machines that are not working, while on the ATI machine they say that the shaders were successfully compiled to run on this hardware.
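One way to double-check those info logs is to query them directly with raw GL calls instead of relying on the wrapper's reporting. The snippet below is only a rough sketch: it is meant to sit in the same class as the render loop above (so the JOGL GL type is already available), and dumpShaderLogs, program, and shader are placeholder names for whatever ids the shader wrapper actually holds.

    //Hypothetical debugging helper (not part of the WorldWind code above):
    //print compile/link status, the info logs, and any pending GL error.
    private void dumpShaderLogs(GL gl, int program, int shader)
    {
        int[] status = new int[1];
        int[] logLength = new int[1];
        byte[] log = new byte[4096];

        //compile status and log for the fragment/vertex shader object
        gl.glGetShaderiv(shader, GL.GL_COMPILE_STATUS, status, 0);
        gl.glGetShaderInfoLog(shader, log.length, logLength, 0, log, 0);
        System.out.println("compile status " + status[0] + ": "
                + new String(log, 0, logLength[0]));

        //link status and log for the program object
        gl.glGetProgramiv(program, GL.GL_LINK_STATUS, status, 0);
        gl.glGetProgramInfoLog(program, log.length, logLength, 0, log, 0);
        System.out.println("link status " + status[0] + ": "
                + new String(log, 0, logLength[0]));

        //a silently failed glUseProgram or glUniform call would show up here
        int err = gl.glGetError();
        if (err != GL.GL_NO_ERROR)
            System.out.println("GL error: 0x" + Integer.toHexString(err));
    }

If the failing Nvidia machines report a link failure or a pending GL error here, that would at least narrow down where the program is being rejected.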
On 04/02/2012 06:59 PM, Dwight [via jogamp] wrote:
> Four separate laptops, all running Ubuntu 11.04.
>
> One with an ATI card (GLX 1.4, OpenGL 3.3, GLSL 3.3) runs as expected.
>
> One with an Nvidia card (GLX 1.4, OpenGL 3.3, GLSL 3.3) shows no effects of the shader.
>
> Another with an Nvidia card (GLX 1.4, OpenGL 4.1, GLSL 4.1) runs as expected; however, an identical laptop displays the same symptoms, with no effects.

You should post the complete driver version, as well as a complete unit test case, so we can reproduce it.

> Briefly, what I'm trying to do is take images, find the parts that are blue enough to be considered ocean, and brighten those pixels because the oceans are too dark. The package I'm using (NASA WorldWind) is written in Java, using JOGL, so I'm using JOGL to apply a custom fragment and vertex shader to create this effect.

BTW, good news: Xerxes has started to port WWJ to JOGL2. He and I will work together on getting this done in a while.

After you have sent a small unit test and I have more time at hand, I will read your email in more detail. Sorry for the premature reply.

~Sven
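For the record, the GL and GLSL version strings that identify the driver actually in use can be printed from the GL thread. A minimal sketch, assuming the same dc.getGL() handle as in the render loop quoted above:

    GL gl = dc.getGL();
    //these identify the vendor, card, driver and GLSL version actually in use
    System.out.println("GL_VENDOR:   " + gl.glGetString(GL.GL_VENDOR));
    System.out.println("GL_RENDERER: " + gl.glGetString(GL.GL_RENDERER));
    System.out.println("GL_VERSION:  " + gl.glGetString(GL.GL_VERSION));
    System.out.println("GLSL:        " + gl.glGetString(GL.GL_SHADING_LANGUAGE_VERSION));

On Nvidia drivers, the GL_VERSION string usually includes the driver release number as well.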
The driver version of the laptop running OpenGL 3.3 is 295.33, on an Nvidia Quadro NVS 140M.

The driver version of the laptops running OpenGL 4.1 is 270.41.06; both have NVS 4200M cards.

As far as a unit test goes, I don't know how I could write one for this.
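One possible starting point for such a test is a standalone JOGL program, completely outside WorldWind, that compiles a trivial "everything red" shader and draws a single quad with it, since the red-sphere experiment described above already separates the working machines from the failing ones. The sketch below is only an assumption of what that could look like: it uses the JOGL 1.x javax.media.opengl API (matching the render-loop style above, and differing from JOGL2), and the class name ShaderSmokeTest is made up.

    import java.awt.Frame;
    import java.awt.event.WindowAdapter;
    import java.awt.event.WindowEvent;
    import javax.media.opengl.GL;
    import javax.media.opengl.GLAutoDrawable;
    import javax.media.opengl.GLCanvas;
    import javax.media.opengl.GLEventListener;

    //Standalone smoke test: if this quad does not stay red on the failing
    //machines either, the problem is not specific to WorldWind.
    public class ShaderSmokeTest implements GLEventListener
    {
        private int d_program;

        public void init(GLAutoDrawable drawable)
        {
            GL gl = drawable.getGL();

            //trivial shaders: pass-through vertex, unconditional red fragment
            String vsrc = "void main(void) { gl_Position = ftransform(); }";
            String fsrc = "void main(void) { gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0); }";

            int vs = gl.glCreateShader(GL.GL_VERTEX_SHADER);
            gl.glShaderSource(vs, 1, new String[] { vsrc }, new int[] { vsrc.length() }, 0);
            gl.glCompileShader(vs);

            int fs = gl.glCreateShader(GL.GL_FRAGMENT_SHADER);
            gl.glShaderSource(fs, 1, new String[] { fsrc }, new int[] { fsrc.length() }, 0);
            gl.glCompileShader(fs);

            d_program = gl.glCreateProgram();
            gl.glAttachShader(d_program, vs);
            gl.glAttachShader(d_program, fs);
            gl.glLinkProgram(d_program);
        }

        public void display(GLAutoDrawable drawable)
        {
            GL gl = drawable.getGL();
            gl.glClear(GL.GL_COLOR_BUFFER_BIT);
            gl.glUseProgram(d_program);

            //one quad covering most of the default [-1,1] clip space
            gl.glBegin(GL.GL_QUADS);
            gl.glVertex2f(-0.8f, -0.8f);
            gl.glVertex2f( 0.8f, -0.8f);
            gl.glVertex2f( 0.8f,  0.8f);
            gl.glVertex2f(-0.8f,  0.8f);
            gl.glEnd();

            gl.glUseProgram(0);
        }

        public void reshape(GLAutoDrawable drawable, int x, int y, int w, int h) { }

        public void displayChanged(GLAutoDrawable drawable, boolean modeChanged, boolean deviceChanged) { }

        public static void main(String[] args)
        {
            GLCanvas canvas = new GLCanvas();
            canvas.addGLEventListener(new ShaderSmokeTest());
            Frame frame = new Frame("shader smoke test");
            frame.add(canvas);
            frame.setSize(400, 400);
            frame.addWindowListener(new WindowAdapter() {
                public void windowClosing(WindowEvent e) { System.exit(0); }
            });
            frame.setVisible(true);
        }
    }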