I'm not a developer who uses LWJGL, but I registered specifically to report an issue that has been long-standing with Minecraft on Linux on NVIDIA graphics cards: specifically, the bug described in this thread. I figured it'd be better to create a new thread about it, since that seems to be proper etiquette.
This happens both on vanilla Minecraft and with the "OptiFine" graphics optimization mod, which installs LWJGL 2.9.4-nightly-20150209. With the OptiFine version, the LWJGL output is as follows:
... 12 more
However, the old thread ends with the issue being resolved by the original poster installing xrandr. That did not fix it for me, so here are some details about my system and the issue:
- I use a hybrid laptop: Intel HD 3000 and an NVIDIA GT 540M.
- The NVIDIA GPU is selected in the PRIME profile settings, meaning it renders basically everything and the Intel GPU just passes the output through.
- Proprietary NVIDIA drivers.
- There is a custom xorg.conf in place, autogenerated by nvidia-settings, but removing or changing it makes no difference to this bug.
- The issue only occurs when I connect an external monitor: the exception is thrown at the point where LWJGL tries to enumerate the available displays, and that enumeration apparently fails or returns zero modes.
- Other people online with different hardware report the same issue, as long as the combination of an NVIDIA laptop GPU plus an external monitor is involved.
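To help narrow down whether this is an X server problem or something specific to LWJGL's display enumeration, here is a small stand-alone cross-check I'd suggest (the class name ScreenCheck is just my own, and this is only a diagnostic sketch, not part of LWJGL): it asks plain AWT, rather than LWJGL, to enumerate the screens. If AWT sees both monitors while LWJGL's enumeration fails, the bug is more likely in how LWJGL queries XRandR than in the X configuration itself.

```java
import java.awt.GraphicsDevice;
import java.awt.GraphicsEnvironment;

// Diagnostic: enumerate displays via AWT instead of LWJGL.
public class ScreenCheck {
    public static void main(String[] args) {
        if (GraphicsEnvironment.isHeadless()) {
            // No X display available to this JVM at all.
            System.out.println("Headless JVM - no displays to enumerate");
            return;
        }
        GraphicsEnvironment ge =
            GraphicsEnvironment.getLocalGraphicsEnvironment();
        // Print each screen the JVM can see, with its current mode.
        for (GraphicsDevice dev : ge.getScreenDevices()) {
            System.out.println(dev.getIDstring() + ": "
                + dev.getDisplayMode().getWidth() + "x"
                + dev.getDisplayMode().getHeight());
        }
    }
}
```

On my setup I'd expect this to list both the laptop panel and the external monitor; if it does while Minecraft still crashes, that would point the finger at LWJGL.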
For reference, here is my xrandr output with the external monitor connected:
Screen 0: minimum 8 x 8, current 1920 x 1080, maximum 16384 x 16384
HDMI-0 disconnected (normal left inverted right x axis y axis)
VGA-1 connected 1920x1080+0+0 531mm x 299mm
1280x1024 75.02 60.02
1024x768 75.08 60.00
800x600 75.00 60.32
640x480 75.00 60.00
Again, this has been a long-standing issue for me, but has it been fixed at any point since then (February 2015, that is)? Or is this still a valid bug?