GL_MAX_TEXTURE_SIZE

Started by rgrzywinski, October 02, 2005, 15:09:05


rgrzywinski

Why do I need 16 elements when I want to retrieve the maximum texture size?  For example:
final IntBuffer maxTextureSize = BufferUtils.createIntBuffer(16/*CHECK:  LWJGL problem?*/);
GL11.glGetInteger(GL11.GL_MAX_TEXTURE_SIZE, maxTextureSize);

The code for glGetInteger() is:
public static void glGetInteger(int pname, IntBuffer params) {
  long function_pointer = GLContext.getCapabilities().GL11_glGetIntegerv_pointer;
  BufferChecks.checkFunctionAddress(function_pointer);
  BufferChecks.checkBuffer(params, 16);
  nglGetIntegerv(pname, params, params.position(), function_pointer);
}

It feels like someone went a little overboard correcting for a GL_MODELVIEW_MATRIX "bug".

princec

Because we're not sure exactly how many ints will be returned from glGetIntegerv - it could be 1, it could be 16. We could have made it complicated, clever and big and looked the size up from a table for each get constant... or we can just require a buffer at least big enough to hold the largest possible number of ints.

Cas :)
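The rule Cas describes can be sketched in plain NIO. This is a hedged illustration only: the buffer allocation is an assumption about what BufferUtils.createIntBuffer(16) does internally, the real GL11.glGetInteger call needs a live GL context and is shown as a comment, and the returned value is faked with a stand-in:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.IntBuffer;

public class MaxTextureSizeExample {
    // Sketch of the query; the actual GL11.glGetInteger call needs a live
    // GL context, so a stand-in value is written where the driver would write.
    public static int readMaxTextureSize() {
        // Roughly what BufferUtils.createIntBuffer(16) produces (an assumption
        // about its internals): a direct, natively ordered buffer of 16 ints,
        // the largest number of values any glGetIntegerv query can return.
        IntBuffer maxTextureSize = ByteBuffer.allocateDirect(16 * Integer.BYTES)
                .order(ByteOrder.nativeOrder())
                .asIntBuffer();

        // GL11.glGetInteger(GL11.GL_MAX_TEXTURE_SIZE, maxTextureSize);
        maxTextureSize.put(0, 2048); // stand-in for the driver's answer

        // GL_MAX_TEXTURE_SIZE yields a single value, so only element 0 matters.
        return maxTextureSize.get(0);
    }

    public static void main(String[] args) {
        System.out.println("GL_MAX_TEXTURE_SIZE = " + readMaxTextureSize());
    }
}
```

Even though only one int comes back, the buffer must still have 16 elements remaining or checkBuffer will reject it.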

rgrzywinski

As I've said dozens of times in the past... why couldn't that have been documented in the code itself, or on the method?  Developers know these things when they write the code.  What's stopping them from commenting it?

Matzon

The problem is that this would then have to be documented for each method. We could add it to the exception thrown: "Note, the minimum buffer size for any buffer is 16 elements, regardless of the actual needed capacity".

Of course this should have been documented in the LWJGL Developer guide  :roll:
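The check Matzon describes, with his suggested message folded into the exception, could look something like the following. This is a hypothetical sketch, not the actual LWJGL BufferChecks source:

```java
import java.nio.Buffer;
import java.nio.IntBuffer;

public class BufferChecksSketch {
    // Hypothetical variant of BufferChecks.checkBuffer carrying the note
    // Matzon suggests in the exception message; not the real LWJGL code.
    public static void checkBuffer(Buffer buf, int minSize) {
        if (buf.remaining() < minSize) {
            throw new IllegalArgumentException(
                "Number of remaining buffer elements is " + buf.remaining()
                + ", must be at least " + minSize
                + ". Note, the minimum buffer size for any buffer is "
                + minSize + " elements, regardless of the actual needed capacity.");
        }
    }

    public static void main(String[] args) {
        try {
            checkBuffer(IntBuffer.allocate(1), 16); // too small: throws
        } catch (IllegalArgumentException e) {
            System.out.println(e.getMessage());
        }
        checkBuffer(IntBuffer.allocate(16), 16); // big enough: passes silently
    }
}
```

A message like this turns a bare size check into something self-documenting, which is cheaper than annotating every glGet* method individually.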