LWJGL Forum

Programming => Lightweight Java Gaming Library => Topic started by: datSilencer on December 30, 2012, 06:11:27

Title: LWJGL with NVIDIA Optimus cards
Post by: datSilencer on December 30, 2012, 06:11:27
Hello everyone.

I hope I'm not the only one in this situation, but I'm currently developing on a laptop which has dual graphics adapters managed by the so-called "NVIDIA Optimus" technology. One is an Intel HD Graphics 4000 GPU and the other is an NVIDIA GT640M GPU.

What I'd like to ask is whether it is possible to select, at run time, the adapter on which an LWJGL application will run. For example, in my test classes I display the results returned by glGetString(), and my code always defaults to the Intel HD 4000 GPU.

Can I somehow indicate to LWJGL that I'd like to use the NVIDIA GPU for rendering?

I thank you for your time and help! Cheers!
Title: Re: LWJGL with NVIDIA Optimus cards
Post by: spasi on December 30, 2012, 13:37:40
Afaict this requires support for WGL_NV_gpu_affinity (http://www.opengl.org/registry/specs/NV/gpu_affinity.txt), which we can only support in LWJGL 3.0 (currently in development). Read this (http://www.gamedev.net/topic/631488-running-your-opengl-program-on-an-nvidia-optimus-enabled-laptop/) for a possible workaround.
Title: Re: LWJGL with NVIDIA Optimus cards
Post by: datSilencer on December 30, 2012, 23:03:59
Hmmm... I guess I'll have to figure out how to disable the machine's integrated graphics then, while I wait for LWJGL 3.0.  8)

Thank you again for your time and help!