I've looked everywhere I can think of and still can't figure out how to choose a graphics device. My current issue is with running my application on a Dell laptop that has two graphics cards (integrated Intel and discrete Nvidia). If I run it with the laptop hooked up to an external monitor or docking station, it uses the Nvidia device; if I run it using only the laptop's built-in display, it uses the integrated Intel device. Is this something that can be overridden? The application uses a custom-built framework that needs to behave the same (or at least similarly) on Windows, Linux, and Mac when rendered in AWT, SWT, or NEWT.
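For reference, this is roughly how I verify which card a context actually landed on: querying GL_VENDOR and GL_RENDERER from inside a GLEventListener. This is a minimal sketch, assuming a JOGL 2 build (hence the com.jogamp.opengl package names); the class name RendererProbe is just for illustration:

    import com.jogamp.opengl.GL;
    import com.jogamp.opengl.GLAutoDrawable;
    import com.jogamp.opengl.GLEventListener;

    // Diagnostic listener: prints which driver/GPU the GL context
    // was actually created on, once the context exists.
    public class RendererProbe implements GLEventListener {
        @Override
        public void init(GLAutoDrawable drawable) {
            GL gl = drawable.getGL();
            System.out.println("GL_VENDOR:   " + gl.glGetString(GL.GL_VENDOR));
            System.out.println("GL_RENDERER: " + gl.glGetString(GL.GL_RENDERER));
        }

        @Override public void display(GLAutoDrawable drawable) {}
        @Override public void reshape(GLAutoDrawable drawable,
                                      int x, int y, int w, int h) {}
        @Override public void dispose(GLAutoDrawable drawable) {}
    }

With the external monitor attached, GL_RENDERER reports the Nvidia card; on the built-in display alone, it reports the Intel one.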
I've tried setting the preferred device in the Nvidia control panel, but that doesn't seem to have any effect. I haven't spent much time going down this route, though, because ultimately that kind of per-machine workaround falls short of a solution for a deployed application (at least in my case).