Re: How to enable quad-buffered stereo rendering
Posted by Michael Bien on Jul 04, 2010; 6:22pm
URL: https://forum.jogamp.org/How-to-enable-quad-buffered-stereo-rendering-tp938684p942730.html
Could you elaborate a bit on quad-buffered stereo?
Is quad buffer = plain old double buffer + stereo?
Sure, we would like to support it somehow, but the first step is to
figure out whether plain stereo is something different from the
quad-buffered thing.
In case stereo != quad-buffered stereo:
Does NVIDIA provide sample code in C? Context creation would be
interesting for us. In the best case it's just a bug in the
Capabilities-choosing code, and you would be served by simply setting
stereo to true.
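To make sure we mean the same thing, this is the kind of request I have
in mind (a minimal, untested sketch against the JOGL2 API; quad buffer
understood as double buffer + stereo):

    import javax.media.opengl.GLCapabilities;
    import javax.media.opengl.GLProfile;
    import javax.media.opengl.awt.GLCanvas;

    // Request a quad-buffered visual: double-buffered (front/back)
    // plus stereo (left/right), i.e. four color buffers in total.
    GLCapabilities caps = new GLCapabilities(GLProfile.getDefault());
    caps.setDoubleBuffered(true); // the default, spelled out here
    caps.setStereo(true);

    // The chooser may still fall back to a mono visual if it does not
    // rank the stereo flag - which would be the suspected bug.
    GLCanvas canvas = new GLCanvas(caps);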
regards,
michael
On 07/02/2010 04:52 PM, michael.nischt [via jogamp] wrote:
Hi there,
I have a 120 Hz screen and the NVIDIA 3D Vision kit plugged into my
computer running Ubuntu (Linux-amd64) with a GeForce Quadro FX 4800, and I
want to use quad-buffered stereo rendering in JOGL2. Unfortunately, I
don't see how to enable it.
I set GLCapabilities.setStereo(true), but the flag is false in the
chosen caps of the GLDrawable - a GLCanvas in my case. In fact, using
a GLCapabilitiesChooser, none of the available modes has the stereo flag
set to true. Accordingly, a later query via glGetBooleanv(GL_STEREO, ..)
returns GL_FALSE.
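Roughly, my setup looks like this (a trimmed-down sketch; the init
callback only prints the chosen capabilities and the driver's answer):

    // imports from javax.media.opengl: GLCapabilities, GLProfile, GL2,
    // GLEventListener, GLAutoDrawable; javax.media.opengl.awt.GLCanvas
    GLCapabilities caps = new GLCapabilities(GLProfile.getDefault());
    caps.setStereo(true);

    GLCanvas canvas = new GLCanvas(caps);
    canvas.addGLEventListener(new GLEventListener() {
        public void init(GLAutoDrawable drawable) {
            // What did the capabilities selection actually give us?
            System.out.println("chosen: " + drawable.getChosenGLCapabilities());
            // Ask the driver directly whether left/right buffers exist.
            GL2 gl = drawable.getGL().getGL2();
            byte[] stereo = new byte[1];
            gl.glGetBooleanv(GL2.GL_STEREO, stereo, 0);
            System.out.println("GL_STEREO = " + (stereo[0] != 0)); // false here
        }
        public void display(GLAutoDrawable drawable) { }
        public void reshape(GLAutoDrawable d, int x, int y, int w, int h) { }
        public void dispose(GLAutoDrawable drawable) { }
    });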
However, my system configuration is perfectly capable of it. For
example, these programs written in PyOpenGL work perfectly fine:
http://www.geeks3d.com/20090814/stereoscopic-camera-in-opengl-using-pyopengl/
I added the same glGetBooleanv(GL_STEREO, ..) query to these scripts, and
the returned value is GL_TRUE.
Can anyone help me enable stereo with JOGL2?
Thanks and Best,
Michael
--
http://michael-bien.com/