Hello everyone.
I'm currently developing my application on a laptop which uses NVIDIA Optimus GPU switching between a GeForce GT 640M LE and an Intel HD Graphics 4000 card under Windows Server 2012. I just don't know how to set up JOGL so that it uses the NVIDIA GPU. My application always defaults to the Intel card:

[main] INFO org.jgl.opengl.GLScheduledEventListener - OpenGL vendor: [Intel]
[main] INFO org.jgl.opengl.GLScheduledEventListener - OpenGL renderer: [Intel(R) HD Graphics 4000]
[main] INFO org.jgl.opengl.GLScheduledEventListener - OpenGL version: [4.0.0 - Build 9.17.10.2932]
[main] INFO org.jgl.opengl.GLScheduledEventListener - OpenGL Shading language version: [4.00 - Build 9.17.10.2932]

Doing some research, I found out that one could use an NVIDIA-specific extension called WGL_NV_gpu_affinity, but I'm not sure whether JOGL's WGL code is aware of such an extension.

http://developer.download.nvidia.com/opengl/specs/WGL_nv_gpu_affinity.txt

Is it even remotely possible to perform this device selection?

Thank you for your time and help!
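For anyone hitting the same issue: those vendor/renderer strings come straight from the active OpenGL context, so logging them is the quickest way to see which GPU a JOGL context actually landed on. Below is a minimal sketch of such a probe, assuming JOGL 2 with the javax.media.opengl package names in use at the time (newer releases use com.jogamp.opengl); the GpuProbe class name is made up for illustration and is not the poster's org.jgl.opengl code.

    import javax.media.opengl.GL;
    import javax.media.opengl.GLAutoDrawable;
    import javax.media.opengl.GLCapabilities;
    import javax.media.opengl.GLEventListener;
    import javax.media.opengl.GLProfile;
    import com.jogamp.newt.opengl.GLWindow;

    public class GpuProbe implements GLEventListener {

        public static void main(String[] args) {
            GLWindow window = GLWindow.create(new GLCapabilities(GLProfile.getDefault()));
            window.addGLEventListener(new GpuProbe());
            window.setSize(64, 64);
            window.setVisible(true);
            window.display();  // forces context creation; init() runs here
            window.destroy();
        }

        @Override
        public void init(GLAutoDrawable drawable) {
            GL gl = drawable.getGL();
            // These strings identify the GPU that actually owns the context.
            System.out.println("Vendor:   " + gl.glGetString(GL.GL_VENDOR));
            System.out.println("Renderer: " + gl.glGetString(GL.GL_RENDERER));
            System.out.println("Version:  " + gl.glGetString(GL.GL_VERSION));
        }

        @Override public void display(GLAutoDrawable drawable) { }
        @Override public void reshape(GLAutoDrawable drawable, int x, int y, int w, int h) { }
        @Override public void dispose(GLAutoDrawable drawable) { }
    }

If the renderer line still reports the Intel HD Graphics 4000 after the driver-side configuration discussed below, the driver simply never routed the context to the discrete GPU.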
Administrator
Hi
There is a GUI provided by NVIDIA to do that under Windows; just choose the "performance" mode. Martin had the same problem and posted a screen capture, but I can't find it :s
Julien Gouesse | Personal blog | Website
Thanks for the tip.
Sadly, my code still defaults to the Intel card. Here's a checklist of how I set things up on my laptop under Windows:

1. Power mode is set to High Performance.
2. The NVIDIA control panel is configured globally to prefer maximum performance and to use the NVIDIA GPU over the Intel one.
3. java.exe, javaw.exe and eclipse.exe have been added to the Applications list, specifying that the high performance graphics processor should be used.
4. eclipse.exe is started via right click -> "Run with graphics processor", selecting the NVIDIA GPU.

But... nothing :P. OpenGL just doesn't seem to care about that configuration. I've noticed that most DirectX games do make use of the NVIDIA GPU (I can see that in the tray menu icon). That's what led me to believe that I may have to explicitly instruct my application to somehow use that NVIDIA GPU affinity extension. I just don't know how :P.

Anyway, thanks again for your time and help!
Administrator
In reply to this post by Jesus Zazueta
On 03/25/2013 03:50 PM, datSilencer [via jogamp] wrote:
> Doing some research I found out that one could use an NVIDIA specific
> extension called WGL_NV_gpu_affinity but I'm not sure if JOGL's WGL code is
> aware of such extension.
>
> http://developer.download.nvidia.com/opengl/specs/WGL_nv_gpu_affinity.txt
>
> Is it even remotely possible to perform this device selection?

http://stackoverflow.com/questions/6036292/select-a-graphic-device-in-windows-opengl

The above mentions some doubts due to Quadro requirements, but it links to a PDF document from NVIDIA. Maybe you can read it and share your findings; then we could add such an implementation to the AbstractGraphicsDevice's unitID. At least the abstraction for this feature is already provided.

~Sven
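For reference, the unitID abstraction Sven mentions already exists on the device object JOGL creates contexts for; it just isn't wired to WGL_NV_gpu_affinity. Here is a minimal sketch that only reads the existing fields, assuming JOGL 2's javax.media.nativewindow package names; the UnitIdProbe class name is made up for illustration.

    import javax.media.nativewindow.AbstractGraphicsDevice;
    import javax.media.opengl.GLProfile;

    public class UnitIdProbe {
        public static void main(String[] args) {
            // The default device JOGL would use for context creation.
            AbstractGraphicsDevice dev = GLProfile.getDefaultDevice();
            System.out.println("Type:       " + dev.getType());       // native windowing system type
            System.out.println("Connection: " + dev.getConnection()); // display connection name
            System.out.println("Unit ID:    " + dev.getUnitID());     // AbstractGraphicsDevice.DEFAULT_UNIT for now
        }
    }

A GPU-affinity implementation would presumably map a non-default unitID to one of the GPU handles returned by wglEnumGpusNV, but that mapping is exactly the part that is not implemented yet.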
In reply to this post by Jesus Zazueta
I encountered a similar problem. I have tried the following, but none of them makes Optimus work for JOGL:
- Adding java/javaw in the NVIDIA control panel
- Exporting NvOptimusEnablement (via a custom JVM launcher)
- Linking nvapi, OpenCL and CUDA

It only happens to JOGL, not to JavaFX or OpenGL Extensions Viewer. JavaFX demos running on the same version/platform of Java 8 do utilize Optimus. Is there anyone else who experiences the same problem?
Administrator
Hi
JavaFX doesn't use OpenGL by default under Windows unless you build its ES2 pipeline and force its use yourself.
Julien Gouesse | Personal blog | Website
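For completeness, the pipeline JavaFX/Prism picks can be checked and requested through system properties; whether the es2 (OpenGL) pipeline is actually available on Windows depends on how that JavaFX build was produced, as Julien says. A minimal sketch, with a made-up MinimalFx class, that prefers es2 and logs the decision:

    import javafx.application.Application;
    import javafx.scene.Scene;
    import javafx.scene.layout.StackPane;
    import javafx.stage.Stage;

    public class MinimalFx extends Application {

        public static void main(String[] args) {
            // Prism reads these at toolkit start-up, so set them before launch()
            // (or pass -Dprism.order=es2,d3d,sw -Dprism.verbose=true on the command line).
            System.setProperty("prism.order", "es2,d3d,sw"); // preferred pipeline order
            System.setProperty("prism.verbose", "true");     // log which pipeline was chosen
            launch(args);
        }

        @Override
        public void start(Stage stage) {
            stage.setScene(new Scene(new StackPane(), 200, 100));
            stage.show();
        }
    }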
I was wondering if java.exe is blocked from using NVIDIA Optimus, but it's not. Also, I can force JOGL to use Optimus if I start a JavaFX app first and then create the JOGL canvas from it, but the performance seems the same as with the Intel HD.
EDIT: I confirmed that it does work if I start JavaFX before JOGL in the same app. My app is based on Eclipse RCP, so I have to open a view containing a JavaFX scene (with no content) to activate NVIDIA Optimus, then open the windows containing the JOGL canvas, which will then be rendered through Optimus while the JavaFX view is visible (closing the JavaFX scene turns off Optimus immediately). The performance is correct, though vertical sync seems to be enforced inside the app, hence the low FPS.
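Regarding the enforced vertical sync: JOGL lets you request a swap interval of 0 on the current context, although the driver (and Optimus) may still override it. A minimal sketch of where such a call would go, using an illustrative NoVsyncListener rather than the poster's actual listener:

    import javax.media.opengl.GLAutoDrawable;
    import javax.media.opengl.GLEventListener;

    public class NoVsyncListener implements GLEventListener {

        @Override
        public void init(GLAutoDrawable drawable) {
            // 0 = swap immediately (no vsync). Must run on the thread holding
            // the current GL context, which init() guarantees.
            drawable.getGL().setSwapInterval(0);
        }

        @Override public void display(GLAutoDrawable drawable) { }
        @Override public void reshape(GLAutoDrawable drawable, int x, int y, int w, int h) { }
        @Override public void dispose(GLAutoDrawable drawable) { }
    }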
A new workaround: a custom launcher written in .NET 4 that calls realtech-vr OpenGL Extension Viewer's infogl.dll before entering Java's main().
Weird, I know. It works, but calling the DLL directly from JNI/JNA doesn't. The sample code is at http://stackoverflow.com/questions/25367851/activate-nvidia-optimus-under-java-opengl
The official solution:
http://docs.nvidia.com/gameworks/content/technologies/desktop/optimus.htm

"Global Variable NvOptimusEnablement (new in Driver Release 302)

Starting with the Release 302 drivers, application developers can direct the Optimus driver at runtime to use the High Performance Graphics to render any application -- even those applications for which there is no existing application profile. They can do this by exporting a global variable named NvOptimusEnablement. The Optimus driver looks for the existence and value of the export. Only the LSB of the DWORD matters at this time. A value of 0x00000001 indicates that rendering should be performed using High Performance Graphics. A value of 0x00000000 indicates that this method should be ignored.

Example Usage:

    extern "C" {
        __declspec(dllexport) DWORD NvOptimusEnablement = 0x00000001;
    }
"

I have posted a bug report, "Force high performance GPU for Nvidia Optimus systems":
https://jogamp.org/bugzilla/show_bug.cgi?id=1113