LWJGL 2.9.2

Started by spasi, December 28, 2014, 17:40:05


spasi

The LWJGL team is proud to present the latest and final release of "legacy" LWJGL:

LWJGL 2.9.2

OpenGL:

  • Add: OpenGL 4.5 and new extensions
  • Add: Force high performance GPU for Nvidia Optimus systems
  • Fix: Clear(Named)BufferSubData requires an explicit size argument
  • Fix: Removed duplicate constants


OSX:

  • Fix: More CALayer/Display.getParent() bugs
  • Fix: Viewport issue with retina displays when going fullscreen
  • Fix: Use NSOpenGLProfileVersion3_2Core for any OpenGL 3.2+ version

Linux:

  • Fix: Compatibility for systems with multiple monitors
  • Fix: Several Xrandr issues
  • Fix: Create virtual key up events for all pressed keys when the display loses focus

Windows:

  • Fix: WM_WINDOWPOSCHANGED coordinate bug
  • Fix: Create virtual key up events for all pressed keys when the display loses focus

Build:

  • Add: Support for OpenBSD
  • Add: Ported code generator from apt to javac annotation processing
  • Fix: Refactored OS X build script to support newer SDK versions

Download here: https://sourceforge.net/projects/java-game-lib/files/
Full changelog: http://legacy.lwjgl.org/changelogs/2.9.2-changelog.txt

Notice: We'd like to remind people to include the copyright, conditions and disclaimer statement for LWJGL in their products, as required by the license. Though we are not about to claim foul in any way, it would be nice to see a link back to lwjgl.org in the credits or documentation at the very minimum.

kappa

Ruben01 just confirmed that LWJGL 2.9.2 is now also available on Maven Central. Normal instructions apply as set out here (just replace the 2.8.0 parts with 2.9.2).

badlogic

Edit: ignore me, I forgot to update a JAR. Everything works as expected.

Thanks for this! Just been testing it, and I'm afraid fullscreen on Mac OS X retina devices still fails. I only see about 1/4th of what I'm supposed to see (and yes, matrices/viewport etc. are all set correctly). So, is there any special flag I need to set?

Thanks!

Hanksha

Quote from: spasi on December 28, 2014, 17:40:05
Add: Force high performance GPU for Nvidia Optimus systems

Wow, finally! I was afraid it would never get fixed. That's really good news. Thanks!

spasi

Hey Hanksha,

I haven't had any confirmation yet that it actually works. In fact, it looks like the "NvOptimusEnablement" export will not be detected in a DLL; it must be exported by the executable that launched the process (i.e. java.exe).
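
For reference, the symbol the Optimus driver looks for is a single exported global set to 1. Here is a minimal, illustrative C++ sketch of what a native launcher .exe would declare (LWJGL cannot do this from a DLL, which is exactly the problem described above):

Code:
#include <windows.h>

// Exported with value 1 from the .exe itself, this tells the Optimus driver
// to select the high-performance GPU for the whole process. The same export
// placed in a DLL (e.g. lwjgl.dll) is ignored by the driver.
extern "C" {
    __declspec(dllexport) DWORD NvOptimusEnablement = 0x00000001;
}

int main() {
    // ... launch the actual application here; the export above affects this process.
    return 0;
}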

Hanksha

Oh I see, false hope then.
I'm not so familiar with exporting variables; do you have a good link explaining the process in detail?

Will it be possible given that I am wrapping the jar in an executable?

spasi

Details here.

Wrapping the jars won't be enough. The executable must export the variable and also launch the JVM using JNI (in the same process).
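
To make that concrete, here is a rough sketch of such a launcher, assuming the game jar is mygame.jar and its entry point is com.example.Main (both placeholders); error handling is omitted and you would link against (or dynamically load) jvm.dll from a bundled JRE:

Code:
#include <windows.h>
#include <jni.h>

// Exported from the launcher .exe so the Optimus driver picks the dedicated GPU.
extern "C" {
    __declspec(dllexport) DWORD NvOptimusEnablement = 0x00000001;
}

int main() {
    // Classpath of the wrapped jar (placeholder path).
    JavaVMOption options[1];
    options[0].optionString = const_cast<char*>("-Djava.class.path=mygame.jar");

    JavaVMInitArgs vmArgs;
    vmArgs.version = JNI_VERSION_1_6;
    vmArgs.nOptions = 1;
    vmArgs.options = options;
    vmArgs.ignoreUnrecognized = JNI_FALSE;

    // Create the JVM inside this process, so the exported symbol applies to it.
    JavaVM* jvm = nullptr;
    JNIEnv* env = nullptr;
    if (JNI_CreateJavaVM(&jvm, reinterpret_cast<void**>(&env), &vmArgs) != JNI_OK) {
        return 1;
    }

    // Call com.example.Main.main(new String[0]) and wait for it to return.
    jclass mainClass = env->FindClass("com/example/Main");
    jmethodID mainMethod = env->GetStaticMethodID(mainClass, "main", "([Ljava/lang/String;)V");
    jobjectArray args = env->NewObjectArray(0, env->FindClass("java/lang/String"), nullptr);
    env->CallStaticVoidMethod(mainClass, mainMethod, args);

    jvm->DestroyJavaVM();
    return 0;
}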

Hanksha

Ok thanks, I'll try that out.

Kai

Hello,

My finding is that Nvidia Optimus does not work reliably now, and never did in the past. At least for me. :)

Even with a plain native application verifiably exporting that symbol.
However, I used GLFW for that test, which does nothing particularly different from LWJGL 2.9 when creating the OpenGL context, and it does export the NvOptimusEnablement flag with value 1.

My specs were:
  Laptop: Lenovo W530
  Dedicated: Nvidia Quadro K2000M (driver version 341.21)
  Integrated: Intel HD Graphics 4000
  OS: Windows 7 64-bit

One weird thing I noticed is that even explicitly setting the native application's .exe to use the dedicated card in the Optimus graphics settings of the Nvidia control panel did not work.
It was still using the integrated card.

Another weird thing, at least on my laptop: if I plug in an external monitor, the dedicated card IS ALWAYS used, regardless of any Optimus settings. So, without an external monitor: always integrated; with one: always dedicated. That might also have been a quirk of my notebook, though. I also set the BIOS flag to "Use Optimus Technology" instead of "use dedicated" or "use integrated", and the OS did recognize both cards, so that should have worked.

To sum it up: Optimus totally sucks! :)

Regards,
Kai