LWJGL with NVIDIA Optimus cards

Started by datSilencer, December 30, 2012, 06:11:27

Previous topic - Next topic

datSilencer

Hello everyone.

I hope I'm not the only one in this situation, but I'm currently developing on a laptop which has dual graphics adapters managed by the so-called "NVIDIA Optimus" technology. One is an Intel HD Graphics 4000 GPU and the other is an NVIDIA GT640M GPU.

What I'd like to ask is whether it is possible to select, at run time, the adapter on which an LWJGL application will run. For example, in my test classes I display the results returned by glGetString(), and my code always defaults to the Intel HD 4000 GPU.

Can I somehow indicate to LWJGL that I'd like to use the NVIDIA GPU for rendering?

I thank you for your time and help! Cheers!

spasi

As far as I can tell, this requires support for WGL_NV_gpu_affinity, which we will only be able to support in LWJGL 3.0 (currently in development). Read this for a possible workaround.

datSilencer

Hmmm... I guess I'll have to figure out how to disable the machine's integrated graphics while I wait for LWJGL 3.0.  8)

Thank you again for your time and help!