Porting to LWJGL3

Started by BrickFarmer, December 13, 2014, 17:15:32


BrickFarmer

I've started porting my simple Rift example over to LWJGL3.  I'm new to LWJGL and still trying to find the equivalents of certain functions, so apologies if I've set off in the wrong direction!  My OpenGL is quite rusty, but I want to get this OpenGL 2.1 example working before I try some 'modern' OpenGL and start messing around with shaders etc. :)

Anyway the part I'm tripping over for now is with FrameBuffers.  I get:

	at org.lwjgl.opengl.GL30.nglGenFramebuffers(GL30.java:2364)
	at org.lwjgl.opengl.GL30.glGenFramebuffers(GL30.java:2393)
	at com.sunshineapps.riftexample.thirdparty.FrameBuffer.<init>(FrameBuffer.java:40)
	at com.sunshineapps.riftexample.RiftClient0440.init(RiftClient0440.java:320)


Looking at the code, it seems to have brought in some static imports for GL30.  Since I'm currently on my 4-year-old Mac, I'm limited to GL 2.1.  However, looking at Apple's capabilities table for my NVIDIA GeForce GT 330M, framebuffers should be there.

https://developer.apple.com/opengl/capabilities/GLInfo_1085.html

So is the NullPointerException being triggered by something else?  Since I don't yet have the source and javadoc being pulled in by Maven, I'm not quite sure what I'm doing wrong here.

Also, is my usage of 'createFromCurrent' correct please?  i.e. can I call GL from anywhere, assuming that I'm always on the same thread?  So a keypress handler would set a boolean for the loop() to check, that sort of thing?
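Something like this is what I have in mind (a plain-Java sketch with no LWJGL calls, just to show the flag pattern; the names and the simulated key press are made up):

```java
import java.util.concurrent.atomic.AtomicBoolean;

public class QuitFlagExample {
    // Set from the key handler, read from the render loop.
    private static final AtomicBoolean quitRequested = new AtomicBoolean(false);

    // What a key callback would do (hypothetical; 256 is GLFW_KEY_ESCAPE).
    static void onKeyPressed(int key) {
        if (key == 256) {
            quitRequested.set(true);
        }
    }

    // Stand-in for loop(): runs frames until the flag flips.
    static int runLoop() {
        quitRequested.set(false);
        int frames = 0;
        while (!quitRequested.get()) {
            frames++;                  // renderFrame() would go here
            if (frames == 3) {
                onKeyPressed(256);     // simulate ESC on the 3rd frame
            }
        }
        return frames;
    }

    public static void main(String[] args) {
        System.out.println("rendered " + runLoop() + " frames before quit");
    }
}
```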

I'm quite excited to get something running with LWJGL3, so please have some patience with me if I'm trying to run before I can walk.  I did have a scout around for LWJGL3 examples, but most code still seems to be v2-based.

update: I've put it online now:

https://github.com/WhiteHexagon/example-jovr-lwjgl3-rift



Oculus Rift CV1, MBP 2016 - 2.9 i7 - Radeon Pro 460  OSX 10.12.4,  Win7 - i5 4670K - GTX1070.
Oculus Rift VR Experiments: https://github.com/WhiteHexagon

spasi

There are two ways to use framebuffers: EXT_framebuffer_object and ARB_framebuffer_object/GL30. It looks like your drivers do not support the ARB version, so you'll have to use the EXT one. LWJGL provides the ContextCapabilities class, which you can use to discover which extensions are available:

ContextCapabilities caps = GL.getCapabilities();
if ( caps.OpenGL30 ) {
	// use GL30
} else if ( caps.GL_ARB_framebuffer_object ) {
	// use ARBFramebufferObject
} else if ( caps.GL_EXT_framebuffer_object ) {
	// use EXTFramebufferObject
} else {
	throw new UnsupportedOperationException();
}


You can use GL.getCapabilities() after GLContext.createFromCurrent(). Btw, I noticed in your code that you're calling GLContext.createFromCurrent() twice, you only need to do it once.

BrickFarmer

Thanks, that is now kinda working :) The code runs in debug mode when no Rift is attached.  I had some flickering, but then I remembered that the buffer swap is done by the Rift SDK.  My next job is to work out what has happened to my black/white chequered floor.

btw Is the preferred approach to use static imports for LWJGL?  I want to start out on the right foot.

quew8

There isn't really a preferred approach; it's completely a matter of personal opinion.

I do use them; that coding style is how the OpenGL API is written anyway, hence the "gl" prefix on all the functions.
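To show the mechanics with plain JDK classes (the idea is the same with LWJGL, e.g. `import static org.lwjgl.opengl.GL11.*;` so you can write `glClear(...)` instead of `GL11.glClear(...)`):

```java
import static java.lang.Math.*;

public class StaticImportDemo {
    // Unqualified calls thanks to the static import -- same shape as
    // the C-style OpenGL API, where everything is already gl-prefixed.
    static double demo() {
        return max(sqrt(16.0), abs(-3.0)); // instead of Math.max(Math.sqrt(...), ...)
    }

    public static void main(String[] args) {
        System.out.println(demo());
    }
}
```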

spasi

Yes, using static imports is highly recommended.

BrickFarmer

So I've static'ed my code, yawn! It does look cleaner though :) and I tried to tidy up a few other things, but I have a couple more questions please.

It seems like glfwSwapInterval is redundant, since the Oculus SDK is doing the frame buffer swap?

Also, glfwDefaultWindowHints isn't needed since I'm full screen.

Moving forward, I have my own 'loop' utility for specifying a target FPS.  I'm wondering how to deal with threads and the GL context in this case.  The utility is based on newSingleThreadScheduledExecutor, which means my 'init' is going to be on a different thread (probably the EDT) to my rendering.  What's the best practice here?  I think even the thread an executor provides can change over time, i.e. if the thread fails, the guarantee is to create another one, from what I recall.
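The utility is roughly this shape (a simplified sketch, not my actual code; names are made up):

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class FpsLoopSketch {
    // Schedule a frame task at a fixed rate on a single worker thread.
    // Note: the GL context would have to be made current on this thread.
    static ScheduledExecutorService startLoop(int targetFps, Runnable frame) {
        ScheduledExecutorService exec = Executors.newSingleThreadScheduledExecutor();
        long periodNanos = 1_000_000_000L / targetFps;
        exec.scheduleAtFixedRate(frame, 0, periodNanos, TimeUnit.NANOSECONDS);
        return exec;
    }

    public static void main(String[] args) throws InterruptedException {
        CountDownLatch frames = new CountDownLatch(5);
        ScheduledExecutorService exec = startLoop(60, frames::countDown);
        frames.await();            // wait for 5 frames at ~60 FPS
        exec.shutdownNow();
        System.out.println("5 frames done");
    }
}
```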

Anyway, the performance is looking good so far! At least on my old Mac, and I will do some testing on Windows later.

Strange thing is that my lighting is varying with head movement at the moment  ???

spasi

Keep in mind that GLFW and LWJGL itself are just libraries, not frameworks. Nothing forces you to use any of the provided methods, feel free to experiment and use whatever works for your application. If the Oculus SDK handles frame swapping, then you can just drop glfwSwapBuffers etc.

You can do rendering in a secondary thread, but almost all GLFW functions must run in the main thread, including window creation and event handling. It can get a bit complicated, but you can use message passing between threads to make your life easier. Also, on OS X, make sure to run the JVM with -XstartOnFirstThread.
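For example, the main thread can push messages onto a queue that the render thread drains once per frame (a plain-Java sketch with no GLFW calls; all names here are illustrative):

```java
import java.util.concurrent.ConcurrentLinkedQueue;

public class MessagePassingSketch {
    // Messages posted by the main (event) thread, drained by the render thread.
    static final ConcurrentLinkedQueue<Runnable> renderTasks = new ConcurrentLinkedQueue<>();

    // Called from the main thread, e.g. inside a key callback.
    static void postToRenderThread(Runnable task) {
        renderTasks.add(task);
    }

    // Called once per frame on the render thread, before drawing.
    static void drainTasks() {
        Runnable task;
        while ((task = renderTasks.poll()) != null) {
            task.run();
        }
    }

    public static void main(String[] args) throws InterruptedException {
        StringBuilder log = new StringBuilder();
        Thread render = new Thread(() -> {
            for (int frame = 0; frame < 3; frame++) {
                drainTasks();          // apply pending messages, then render
                log.append("frame ").append(frame).append('\n');
            }
        });
        postToRenderThread(() -> log.append("toggle fullscreen\n")); // the "event"
        render.start();                // Thread.start gives the happens-before edge
        render.join();
        System.out.print(log);
    }
}
```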