false instead of GL_FALSE (glVertexAttribPointer) but not everywhere

Started by storm8ring3r, October 22, 2015, 08:38:59


storm8ring3r

Hi
I was just wondering: why is it that the LWJGL bindings change only some arguments from the GL constants GL_TRUE/GL_FALSE to Java's true and false? Calls like

glGetShaderi(shaderID, GL_COMPILE_STATUS) == GL_FALSE

still return values that have to be compared against the GL constants. Was there another design consideration besides consistency? It doesn't seem to make sense to me to introduce boolean types if it only complicates matters.

Kai

In your example of glGetShaderi, the reason is that there is no such OpenGL function.
glGetShaderi is a convenience method for glGetShaderiv(), which has an out-parameter (the last parameter) that can produce/hold arbitrary integers.
LWJGL 3 just makes this more convenient by returning the produced integer directly as the method's return value, instead of forcing users to allocate a buffer to hold it.
It also cannot be boolean here, because the function accepts other parameter names besides GL_COMPILE_STATUS, such as GL_INFO_LOG_LENGTH, which make it produce arbitrary integers other than 0 or 1.
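
To make that concrete, here is a minimal sketch (assuming a current OpenGL context created through LWJGL 3; the shader source is just a placeholder) of why both queries have to go through the same int-returning method:

import static org.lwjgl.opengl.GL20.*;

static void checkCompileStatus() {
    int shaderID = glCreateShader(GL_VERTEX_SHADER);
    glShaderSource(shaderID, "#version 110\nvoid main() { gl_Position = vec4(0.0); }");
    glCompileShader(shaderID);

    // GL_COMPILE_STATUS happens to produce only 0 or 1 ...
    if (glGetShaderi(shaderID, GL_COMPILE_STATUS) == GL_FALSE) {
        // ... but GL_INFO_LOG_LENGTH produces an arbitrary integer, so the
        // return type of glGetShaderi has to stay int for both queries.
        int logLength = glGetShaderi(shaderID, GL_INFO_LOG_LENGTH);
        System.err.println(glGetShaderInfoLog(shaderID, logLength));
    }
}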

Other functions that take truly boolean parameters, such as the 'transpose' parameter of glUniformMatrix4fv or the 'normalized' parameter of glVertexAttribPointer, take Java booleans.
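
For instance, a minimal sketch (assuming a current OpenGL context; 'program' and the uniform name 'u_mvp' are placeholders):

import java.nio.FloatBuffer;
import org.lwjgl.BufferUtils;
import static org.lwjgl.opengl.GL20.*;

static void booleanParameters(int program) {
    // 'normalized' is a plain Java boolean in the LWJGL binding:
    glVertexAttribPointer(0, 3, GL_FLOAT, false, 0, 0L);

    // ... and so is the 'transpose' parameter of glUniformMatrix4fv:
    FloatBuffer matrixBuffer = BufferUtils.createFloatBuffer(16);
    int location = glGetUniformLocation(program, "u_mvp");
    glUniformMatrix4fv(location, false, matrixBuffer);
}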

Luckily for LWJGL, the OpenGL function signatures use a dedicated typedef, GLboolean, for int-ish parameters that are really only 0 or 1.

Cornix

It would, in theory, be possible to wrap all returned booleans in a getShaderB (or similar) method that returns a regular Java boolean but only allows certain inputs. You could even go a step further and wrap all OpenGL enums in Java enums. But at some point you have to draw the line between convenience and staying true to the original API: between what is useful to a bigger user base and what is unnecessary sugar coating.
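
A hypothetical sketch of such a wrapper (getShaderB is not part of LWJGL; GL_COMPILE_STATUS and GL_DELETE_STATUS are the shader pnames specified to produce only GL_TRUE/GL_FALSE):

import static org.lwjgl.opengl.GL20.*;

static boolean getShaderB(int shader, int pname) {
    // Reject pnames that produce arbitrary integers (e.g. GL_INFO_LOG_LENGTH).
    if (pname != GL_COMPILE_STATUS && pname != GL_DELETE_STATUS) {
        throw new IllegalArgumentException("pname does not produce a GLboolean");
    }
    return glGetShaderi(shader, pname) == GL_TRUE;
}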

spasi

Quote from: storm8ring3r on October 22, 2015, 08:38:59
Was there another design consideration besides consistency? It doesn't seem to make sense to me to introduce boolean types if it only complicates matters.

The first reason is that it keeps compatibility with LWJGL 2.

The second reason is that LWJGL 2 was correct to use boolean where the native type is GLboolean (or similar in other APIs). The native type is:

typedef unsigned char GLboolean;


which means that function arguments (or return values) cannot be 4-byte integers; they really have to be 1 byte for the JNI invocation to work. This maps nicely to JNI's jboolean:

typedef unsigned char   jboolean;


JNI does the conversion from Java boolean to unsigned char (and vice versa) automatically. So there's no higher-level abstraction here; if anything, the implementation is simpler, not more complicated.
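
As a rough illustration (the class and method names below are made up for this sketch, not LWJGL's actual internals), the mapping looks like this from the Java side:

// JNI receives 'normalized' as jboolean, i.e. an unsigned char, which
// matches GLboolean byte-for-byte, so no manual conversion is needed.
class GLBindingSketch {
    static native void nglVertexAttribPointer(
        int index, int size, int type, boolean normalized, int stride, long pointer);
}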