Rendering a VertexArray using an offset in the indexBuffer

Started by Bvsemmer, January 10, 2006, 17:36:33


Bvsemmer

Hi,

I'm trying to render a Vertex Array "dynamically" by using different offsets in the IndexBuffer.  I'm having problems getting this to work in LWJGL.

In C++ this is very easy (and I'd like to do the same as this code):
glDrawElements(GL_TRIANGLES, numOfIndicesToUse, GL_UNSIGNED_INT, &(Indexbuffer[offsetInIndexBuffer]) );


I browsed the LWJGL API and noticed there are two kinds of glDrawElements functions here:
- One with 2 arguments (the type of primitive to render and a buffer), but there's no way to give an offset here.
- One with 4 arguments (the type of primitive, the number of indices to use, the type of the indices, and an offset, I think?), but there's no way here to specify a Buffer (where the indices are stored).

Ideally I would need a function that combines the two above (specify a buffer with the possibility to set an offset and a number of indices to use).  I'm probably overlooking something, but I can't find it.

Can someone shed some light on this please?  I would be much obliged.

spasi

The first function takes a Java Buffer (e.g. an IntBuffer) as input for the indices. By modifying the Buffer's position and limit you can achieve what you're describing: only what's between position and limit will be sent to the driver.
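A minimal sketch of this position/limit approach, using plain java.nio so it stands alone (in LWJGL you would normally create the buffer with BufferUtils.createIntBuffer; the offset and count values are just made up for illustration):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.IntBuffer;

public class IndexWindow {

    // Build a direct IntBuffer holding the indices 0..n-1.
    // (In LWJGL you would normally use BufferUtils.createIntBuffer instead.)
    static IntBuffer makeIndices(int n) {
        IntBuffer buf = ByteBuffer.allocateDirect(n * 4)
                .order(ByteOrder.nativeOrder())
                .asIntBuffer();
        for (int i = 0; i < n; i++) {
            buf.put(i);
        }
        buf.flip();
        return buf;
    }

    // Restrict the buffer to the window [offset, offset + count):
    // only this range is seen by the 2-argument glDrawElements.
    static void selectRange(IntBuffer indices, int offset, int count) {
        indices.position(offset);
        indices.limit(offset + count);
    }

    public static void main(String[] args) {
        IntBuffer indices = makeIndices(12);
        selectRange(indices, 3, 6); // hypothetical offset/count for illustration

        // The equivalent of the C++ call
        //   glDrawElements(GL_TRIANGLES, 6, GL_UNSIGNED_INT, &Indexbuffer[3]);
        // would now simply be:
        //   GL11.glDrawElements(GL11.GL_TRIANGLES, indices);
        System.out.println(indices.remaining());              // 6 indices will be sent
        System.out.println(indices.get(indices.position()));  // the first one is 3
    }
}
```

Remember to reset position and limit (e.g. with clear() or rewind()/limit()) before reusing the buffer with a different range.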

The second function is used when VBO is enabled. The offset parameter specifies the byte offset into the currently bound index buffer object. See the ARB_vertex_buffer_object extension specification for details.