I am confused with OpenGL versions and signatures

Started by Steyrix, January 13, 2022, 18:43:24

Steyrix

Hello there, please excuse me in advance, since I am really new to this forum and English is not my native language.

I have recently discovered LWJGL; I was using JOGL before. Now I am trying to move my JOGL-based engine onto LWJGL's OpenGL bindings, and I am getting really confused.
First of all, I can't figure out which version of the GL class I should use. Since most of my functionality was written against the GL 3-4 API, I guess I will be OK with using org.lwjgl.opengl.GL46?
Moreover, some GL methods' signatures seem unfamiliar to me, even though I have used these functions before. For example, glGenBuffers() does not have a size parameter. Does this mean that
the size is calculated automatically, out of the box?

I will be thankful for any help or clarification.

KaiHH

LWJGL has a class for every OpenGL version, including only the methods that got introduced in that version of OpenGL.
And every newer OpenGL class inherits from the previous version, such that when you use e.g. GL46C you can still use OpenGL functions through that class that were introduced in previous OpenGL core versions.

Usually, when you write an OpenGL application, you decide beforehand which minimum OpenGL version you want to target/support, i.e. what the client running your application must support at minimum. Let that be OpenGL 3.3 core, for example, which nowadays is a very safe bet.

Then, if you only have one code path in your application, which uses OpenGL 3.3 core features only, you can simply do
import static org.lwjgl.opengl.GL33C.*;

at the beginning of every compilation unit that contains OpenGL calls.
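
A minimal sketch of that setup, assuming GLFW is used for window/context creation (the class name, window size and title are arbitrary placeholders):

import static org.lwjgl.glfw.GLFW.*;
import static org.lwjgl.opengl.GL33C.*;
import org.lwjgl.opengl.GL;

public class Gl33App {
    public static void main(String[] args) {
        glfwInit();
        // Request exactly the minimum version you decided to target: 3.3 core.
        glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
        glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 3);
        glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
        long window = glfwCreateWindow(800, 600, "GL 3.3 core", 0L, 0L);
        glfwMakeContextCurrent(window);
        GL.createCapabilities(); // makes the OpenGL bindings usable on this thread

        while (!glfwWindowShouldClose(window)) {
            // Only functions exposed by GL33C (and the versions it inherits from) resolve here.
            glClearColor(0.2f, 0.3f, 0.3f, 1.0f);
            glClear(GL_COLOR_BUFFER_BIT);
            glfwSwapBuffers(window);
            glfwPollEvents();
        }
        glfwTerminate();
    }
}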

Having explicit classes for specific OpenGL versions also prevents you from scattering your code with GL 3.3, 4.1, 4.3 and possibly 4.6 function calls, which would make your application potentially unusable for clients that do not support certain higher OpenGL versions and the functions therein (like all macOS machines, which only support up to OpenGL 4.1 core, but not higher).
So, explicit version classes allow you to make an informed and compile-time-enforceable decision about which OpenGL functions (from which version) you can and want to use, without accidentally making use of a higher-versioned and potentially unsupported (depending on the minimum context version you request via GLFW) OpenGL version.
If you begin to type the name of an OpenGL function and your IDE cannot auto-complete/find it, then it is not supported in the OpenGL version you are currently targeting/importing, and you need to reconsider your approach.
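
A hedged illustration (assuming `vertexData` is a FloatBuffer you have filled elsewhere): a call introduced in a higher version than the class you import simply will not resolve:

// With: import static org.lwjgl.opengl.GL33C.*;
int vbo = glGenBuffers();
glBindBuffer(GL_ARRAY_BUFFER, vbo);
glBufferData(GL_ARRAY_BUFFER, vertexData, GL_STATIC_DRAW); // resolves: introduced in GL 1.5, inherited by GL33C
// glBufferStorage(GL_ARRAY_BUFFER, vertexData, 0);        // does not compile: introduced in GL 4.4, lives in GL44C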

As for glGenBuffers (or other methods taking a NIO Buffer as parameter): when the C API takes a pointer together with a length/count parameter that indicates the number of elements read from or written to that pointer, LWJGL infers that count automatically, using the NIO Buffer's `remaining()` to provide the argument value for the count/length parameter in the native C call.
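
A small sketch of what that looks like in practice (the count of 3 is arbitrary, and MemoryStack is just one convenient way to get an off-heap IntBuffer):

// requires: import java.nio.IntBuffer; import org.lwjgl.system.MemoryStack;
// C signature:     void glGenBuffers(GLsizei n, GLuint *buffers)
// LWJGL signature: void glGenBuffers(IntBuffer buffers)  -- n is taken from buffers.remaining()
try (MemoryStack stack = MemoryStack.stackPush()) {
    IntBuffer ids = stack.mallocInt(3); // remaining() == 3, so the native call receives n = 3
    glGenBuffers(ids);
    int firstBuffer = ids.get(0);
}

There is also a convenience overload, `int glGenBuffers()`, which generates and returns a single buffer name; that is often all you need.
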
If you'd rather call the _direct_ native function, you can also do that via the methods with an `n` prefix, like `nglGenBuffers()`. But that is more cumbersome, since you will be operating with native pointer addresses as `long` values.
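
For completeness, a sketch of the raw variant under the same assumptions, where you pass the count and the pointer address yourself:

// requires: import java.nio.IntBuffer; import org.lwjgl.system.MemoryUtil;
IntBuffer ids = MemoryUtil.memAllocInt(3);
nglGenBuffers(3, MemoryUtil.memAddress(ids)); // you supply n and the raw address explicitly
int firstBuffer = ids.get(0);
MemoryUtil.memFree(ids);
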
You get used to the auto-sized NIO Buffers and the count/length-parameter inference pretty quickly when working with LWJGL.
It's just a style of binding that LWJGL adheres to.