How to skip GDI renderer

How to skip GDI renderer

gouessej
Administrator
Hi

I use the following example I posted on Wikipedia:
http://en.wikipedia.org/wiki/Java_OpenGL#Code_example

When I replace the AWT GLCanvas by a GLJPanel, the NVIDIA driver is used under Windows, but if I use the AWT GLCanvas, it uses the "GDI renderer". Whether I use GLProfile.getMaxFixedFunc(true) or GLProfile.getDefault(), I get the same result. I tried several flags of GLCapabilities but didn't succeed in skipping the GDI renderer. How can I do that? Shouldn't GLProfile.getMaxFixedFunc(true) allow me to favor the hardware implementation? Is this a bug in JOGL, or should I override DefaultGLCapabilitiesChooser?
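
Roughly, the setup looks like this (a minimal sketch in the spirit of the wiki example; the class name and window setup are just illustrative):

  import javax.media.opengl.GLCapabilities;
  import javax.media.opengl.GLProfile;
  import javax.media.opengl.awt.GLCanvas;
  import javax.swing.JFrame;

  public class GdiRendererTest {
      public static void main(String[] args) {
          // favor a hardware-rasterized fixed-function profile
          GLProfile profile = GLProfile.getMaxFixedFunc(true);
          GLCapabilities caps = new GLCapabilities(profile);
          caps.setHardwareAccelerated(true);
          // replacing GLCanvas by javax.media.opengl.awt.GLJPanel here is what
          // changes the chosen renderer on my machine
          GLCanvas canvas = new GLCanvas(caps);
          JFrame frame = new JFrame("GDI renderer test");
          frame.getContentPane().add(canvas);
          frame.setSize(640, 480);
          frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
          frame.setVisible(true);
      }
  }
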
Julien Gouesse | Personal blog | Website

Re: How to skip GDI renderer

Sven Gothel
Administrator
On 01/22/2013 03:23 PM, gouessej [via jogamp] wrote:
> Hi
>
> I use the following example I posted on Wikipedia:
> http://en.wikipedia.org/wiki/Java_OpenGL#Code_example
>
> When I replace the AWT GLCanvas by a GLJPanel, the NVIDIA driver is used under
> Windows, but if I use the AWT GLCanvas, it uses the "GDI renderer".

> Whether I use
> GLProfile.getMaxFixedFunc(true) or GLProfile.getDefault(), I get the same
> result. I tried several flags of GLCapabilities but didn't succeed in
> skipping the GDI renderer. How can I do that? Shouldn't
> GLProfile.getMaxFixedFunc(true) allow me to favor the hardware implementation?
> Is this a bug in JOGL, or should I override DefaultGLCapabilitiesChooser?

I tested the wiki demo and added GL version info:

(09:40:28 PM) sgothel: GL_VENDOR: NVIDIA Corporation
(09:40:28 PM) sgothel: GL_RENDERER: GeForce GTX 460/PCIe/SSE2
(09:40:28 PM) sgothel: GL_VERSION: 4.2.0
(09:40:28 PM) sgothel: GL GLSL: true, has-compiler: true, version 4.20 NVIDIA via Cg compiler, 4.20.0
(09:40:28 PM) sgothel: GL FBO: basic true, full true
(09:40:28 PM) sgothel: GL Profile: GLProfile[GL4bc/GL4bc.hw]
(09:40:28 PM) sgothel: GL Renderer Quirks:[NoDoubleBufferedBitmap]
(09:40:28 PM) sgothel: GL:jogamp.opengl.gl4.GL4bcImpl@4d3c7378, 4.2 (Compatibility profile, arb, ES2 compatible, FBO, hardware) - 4.2.0
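
(The GL_VENDOR / GL_RENDERER / GL_VERSION lines can be queried from within GLEventListener.init(), roughly like this:)

  // inside GLEventListener.init(GLAutoDrawable drawable)
  GL gl = drawable.getGL();
  System.err.println("GL_VENDOR: "   + gl.glGetString(GL.GL_VENDOR));
  System.err.println("GL_RENDERER: " + gl.glGetString(GL.GL_RENDERER));
  System.err.println("GL_VERSION: "  + gl.glGetString(GL.GL_VERSION));
  System.err.println("GL Profile: "  + drawable.getGLProfile());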

I also used the gluegen_624-joal_389-jogl_896-jocl_735 7z archive (inflated)
and ran etc\test.bat:
                GL4bc   true [4.2 (Compatibility profile, arb, ES2 compatible, FBO, hardware)]
                GL4     true [4.2 (Core profile, arb, ES2 compatible, FBO, hardware)]
                GL3bc   true [4.2 (Compatibility profile, arb, ES2 compatible, FBO, hardware)]
                GL3     true [4.2 (Core profile, arb, ES2 compatible, FBO, hardware)]
                GL2     true [4.2 (Compatibility profile, arb, ES2 compatible, FBO, hardware)]
                GL2ES1  true
                GLES1   false
                GL2ES2  true
                GLES2   false

(Details .. see attachment)

+++

It is possible to select SW via GLProfile.get(..)

  <http://jogamp.org/deployment/jogamp-next/javadoc/jogl/javadoc/javax/media/opengl/GLProfile.html#get%28javax.media.nativewindow.AbstractGraphicsDevice,%20java.lang.String[],%20boolean%29>
  <http://jogamp.org/deployment/jogamp-next/javadoc/jogl/javadoc/javax/media/opengl/GLProfile.html#get%28java.lang.String[],%20boolean%29>

_iff_ any of the detected profile mappings is actually a SW one, i.e. when no HW profile is available.
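
E.g. (a sketch; the profile list here is just an example):

  // javax.media.opengl.GLProfile / GLCapabilities
  // pick the first available mapping from an explicit list; the boolean selects
  // whether the hardware rasterizer is favored (true = prefer HW; with false a
  // SW mapping may be picked if one exists)
  GLProfile profile = GLProfile.get(
      new String[] { GLProfile.GL4bc, GLProfile.GL3bc, GLProfile.GL2 }, true);
  GLCapabilities caps = new GLCapabilities(profile);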

I have only seen SW being selected while HW is available _if_ a SW-only pixel format (GLCaps) is used,
like BITMAP (offscreen, !FBO && !PBuffer).

+++

So .. something seems to be odd with the NV driver installation,
or maybe you are using a remote desktop solution?
In case of the latter: Windows RDC always uses GDI (SW GL),
so please switch to VNC here (TurboVNC, for example).

~Sven


test-win64-nv.log (106K) attached

Re: How to skip GDI renderer

gouessej
Administrator
Sorry, it happens only when I request a GLCapabilities object with depth bits = 32, even though I set hardwareAccelerated to true. Is that the expected behaviour?
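
I.e. something like (a sketch):

  // javax.media.opengl.* / javax.media.opengl.awt.GLCanvas
  GLCapabilities caps = new GLCapabilities(GLProfile.getMaxFixedFunc(true));
  caps.setHardwareAccelerated(true);
  caps.setDepthBits(32); // with 32 the GDI renderer is chosen, with 24 the NVIDIA driver is used
  GLCanvas canvas = new GLCanvas(caps);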
Julien Gouesse | Personal blog | Website

Re: How to skip GDI renderer

Sven Gothel
Administrator
On 01/22/2013 09:56 PM, gouessej [via jogamp] wrote:
> Sorry, it happens only when I request a GLCapabilities object with depth bits =
> 32, even though I set hardwareAccelerated to true. Is that the expected behaviour?

Yes, if the GL driver doesn't support it.

I.e. the only 32-bit depth buffer I see in the log is a software one:
  GLCaps[wgl vid 0x63 arb: rgba 0x8/8/8/0, opaque, accum-rgba 16/16/16/0, dp/st/ms: 32/8/0, one, mono  , sw, GLProfile[GL4bc/GL4bc.hw], on-scr[fbo, bitmap]]

IMHO a 16- or 24-bit depth buffer is the standard; more is hardly ever needed. If it is, there might be support via FBO.
But needing such a huge z-buffer range would imply that your huge-world renderer
uses immediate mode without any culling etc. (-> deferred rendering).
The latter is not a good idea and is hardly used.
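
A 24-bit request plus a check of what was actually granted could look like this (a sketch, assuming the usual GLEventListener setup):

  GLCapabilities reqCaps = new GLCapabilities(GLProfile.getMaxFixedFunc(true));
  reqCaps.setDepthBits(24); // standard depth size, offered by the HW pixel formats
  GLCanvas canvas = new GLCanvas(reqCaps);
  canvas.addGLEventListener(new GLEventListener() {
      public void init(GLAutoDrawable drawable) {
          // what the chooser actually picked (hw/sw, real depth bits, ...)
          System.err.println(drawable.getChosenGLCapabilities());
      }
      public void display(GLAutoDrawable drawable) { }
      public void reshape(GLAutoDrawable drawable, int x, int y, int w, int h) { }
      public void dispose(GLAutoDrawable drawable) { }
  });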

~Sven


