How to use glDrawElements without VAOs

Started by Cornix, July 06, 2013, 11:11:55

Cornix

Hi.
I am trying to draw a simple triangle using one VBO for the vertex data and one VBO for the index data.

My code looks like this:

Initializing the VBOs:
public VBO(float[] vertices, byte[] indices) {
		first_index_id = 0;
		number_of_indices = indices.length;
		number_of_vertices = vertices.length;
		
		vertex_pointer = 0;
		vertex_stride = 5 * SIZE_OF_FLOAT_IN_BYTES; // SIZE_OF_FLOAT_IN_BYTES == 4
		tex_coord_pointer = 3 * SIZE_OF_FLOAT_IN_BYTES;
		tex_coord_stride = 5 * SIZE_OF_FLOAT_IN_BYTES;
		
		FloatBuffer buf = ByteBuffer.allocateDirect(number_of_vertices * SIZE_OF_FLOAT_IN_BYTES).order(ByteOrder.nativeOrder()).asFloatBuffer();
		buf.put(vertices);
		buf.flip();
		
		data_vbo_id = GL15.glGenBuffers();
		GL15.glBindBuffer(GL15.GL_ARRAY_BUFFER, data_vbo_id);
		GL15.glBufferData(GL15.GL_ARRAY_BUFFER, buf, GL15.GL_STATIC_DRAW);
		GL15.glBindBuffer(GL15.GL_ARRAY_BUFFER, 0);
		
		ByteBuffer id_buf = ByteBuffer.allocateDirect(number_of_indices).order(ByteOrder.nativeOrder());
		id_buf.put(indices);
		id_buf.flip();
		
		index_vbo_id = GL15.glGenBuffers();
		GL15.glBindBuffer(GL15.GL_ELEMENT_ARRAY_BUFFER, index_vbo_id);
		GL15.glBufferData(GL15.GL_ELEMENT_ARRAY_BUFFER, id_buf, GL15.GL_STATIC_DRAW);
		GL15.glBindBuffer(GL15.GL_ELEMENT_ARRAY_BUFFER, 0);
	}


private static final VBO default_vbo = new VBO(new float[] {
			0, 0, 0,
			0, 0,
			0, 1, 0,
			0, 1,
			1, 1, 0,
			1, 1,
	}, new byte[] {
			0, 1, 2,
	});
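
In other words, each vertex in that array takes up five floats, interleaved position and texture coordinate (this is just spelling out what the strides and offsets above mean):

	// one interleaved vertex = 5 floats = 20 bytes
	//   x, y, z  -> read by glVertexPointer(3, GL_FLOAT, 20, 0)
	//   u, v     -> read by glTexCoordPointer(2, GL_FLOAT, 20, 12)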


When I want to draw the triangle, I currently do this:
public void draw() {
		GL15.glBindBuffer(GL15.GL_ARRAY_BUFFER, data_vbo_id);
		GL15.glBindBuffer(GL15.GL_ELEMENT_ARRAY_BUFFER, index_vbo_id);
		
		GL11.glEnableClientState(GL11.GL_VERTEX_ARRAY);
		GL11.glEnableClientState(GL11.GL_TEXTURE_COORD_ARRAY);
		
		GL11.glVertexPointer(3, GL11.GL_FLOAT, vertex_stride, vertex_pointer);
		GL11.glTexCoordPointer(2, GL11.GL_FLOAT, tex_coord_stride, tex_coord_pointer);
		GL11.glDrawArrays(GL11.GL_TRIANGLES, first_index_id, number_of_indices);
		
		GL11.glDisableClientState(GL11.GL_VERTEX_ARRAY);
		GL11.glDisableClientState(GL11.GL_TEXTURE_COORD_ARRAY);
		
		GL15.glBindBuffer(GL15.GL_ARRAY_BUFFER, 0);
		GL15.glBindBuffer(GL15.GL_ELEMENT_ARRAY_BUFFER, 0);
	}

But the problem is that the indices are ignored entirely; no matter how I set them, it always shows the same triangle.
So I googled a bit and found out that you can use glDrawElements(...) instead, and I guess that is what I need to call to make the indices work.
Then I did this:
public void draw() {
		GL15.glBindBuffer(GL15.GL_ARRAY_BUFFER, data_vbo_id);
		GL15.glBindBuffer(GL15.GL_ELEMENT_ARRAY_BUFFER, index_vbo_id);
		
		GL11.glEnableClientState(GL11.GL_VERTEX_ARRAY);
		GL11.glEnableClientState(GL11.GL_TEXTURE_COORD_ARRAY);
		
		GL11.glVertexPointer(3, GL11.GL_FLOAT, vertex_stride, vertex_pointer);
		GL11.glTexCoordPointer(2, GL11.GL_FLOAT, tex_coord_stride, tex_coord_pointer);
		GL11.glDrawElements(GL11.GL_TRIANGLES, number_of_indices, GL11.GL_BYTE, first_index_id);
		
		GL11.glDisableClientState(GL11.GL_VERTEX_ARRAY);
		GL11.glDisableClientState(GL11.GL_TEXTURE_COORD_ARRAY);
		
		GL15.glBindBuffer(GL15.GL_ARRAY_BUFFER, 0);
		GL15.glBindBuffer(GL15.GL_ELEMENT_ARRAY_BUFFER, 0);
	}

But now I see nothing at all. I guess I am using those methods the wrong way.

I would really appreciate your help!
Thanks in advance.

Morin

You'll have to specify GL_UNSIGNED_BYTE as the data type for the element array buffer, not GL_BYTE.

Your code actually results in an "Invalid enum" OpenGL error and a corresponding LWJGL exception on my system. If it doesn't do that on your system, it might be worth investigating why, because the error message helped me find the solution very quickly. (With GL_UNSIGNED_BYTE the triangle is drawn.)
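
For reference, your second draw() method would then look roughly like this (same field names as in your post, only the index type changed and a couple of comments added):

public void draw() {
		GL15.glBindBuffer(GL15.GL_ARRAY_BUFFER, data_vbo_id);
		GL15.glBindBuffer(GL15.GL_ELEMENT_ARRAY_BUFFER, index_vbo_id);
		
		GL11.glEnableClientState(GL11.GL_VERTEX_ARRAY);
		GL11.glEnableClientState(GL11.GL_TEXTURE_COORD_ARRAY);
		
		GL11.glVertexPointer(3, GL11.GL_FLOAT, vertex_stride, vertex_pointer);
		GL11.glTexCoordPointer(2, GL11.GL_FLOAT, tex_coord_stride, tex_coord_pointer);
		
		// glDrawElements only accepts unsigned index types:
		// GL_UNSIGNED_BYTE, GL_UNSIGNED_SHORT or GL_UNSIGNED_INT.
		// The last argument is the byte offset into the bound GL_ELEMENT_ARRAY_BUFFER.
		GL11.glDrawElements(GL11.GL_TRIANGLES, number_of_indices, GL11.GL_UNSIGNED_BYTE, first_index_id);
		
		GL11.glDisableClientState(GL11.GL_VERTEX_ARRAY);
		GL11.glDisableClientState(GL11.GL_TEXTURE_COORD_ARRAY);
		
		GL15.glBindBuffer(GL15.GL_ARRAY_BUFFER, 0);
		GL15.glBindBuffer(GL15.GL_ELEMENT_ARRAY_BUFFER, 0);
	}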

Cornix

Wow. It worked. Thank you soooo much.

This is crazy, I don't even understand why GL_BYTE was causing the problem here.

My train of thought was that the bytes in Java are always signed, so I thought using UNSIGNED_BYTE would give the wrong values. Seems I was wrong.

Morin

Quote from: Cornix on July 06, 2013, 20:30:27
Wow. It worked. Thank you soooo much.

This is crazy, I don't even understand why GL_BYTE was causing the problem here.

My train of thought was that the bytes in Java are always signed, so I thought using UNSIGNED_BYTE would give the wrong values. Seems I was wrong.

Glad I could help.

OpenGL demands that only unsigned types be used in glDrawElements(). That is kind of sensible IMHO, because negative indices would point outside the buffer. So, as far as OpenGL is concerned, the bytes have to be unsigned.

Now, OpenGL wasn't designed with Java in mind, so it also depends on how LWJGL bridges the gap. It does this by using a byte buffer (a buffer in the NIO sense, not in the OpenGL sense) that is filled by the application and then passed to OpenGL. The important part is: the elements of that buffer are 8-bit units *without any type information*. The Java side interprets them as signed bytes. The OpenGL side interprets them as signed or unsigned bytes, depending on whether you pass GL_BYTE or GL_UNSIGNED_BYTE (ignoring for now the fact that glDrawElements() doesn't accept signed bytes).

So when you fill a buffer with signed bytes in Java and then pass it to OpenGL with GL_UNSIGNED_BYTE, each element is an 8-bit unit that was filled under the interpretation of a signed byte and is then *re-interpreted* as an unsigned byte. If you know a bit of C programming, this has the same effect as an (unsigned char) cast.

Numerically, this maps Java bytes in the range 0..127 to OpenGL unsigned bytes in the range 0..127, and Java bytes in the range -128..-1 to OpenGL unsigned bytes in the range 128..255. I usually think of this as "manipulating unsigned bytes in my Java application in a very clumsy way". Typically I don't use the byte type for exactly this reason, except when storing large amounts of data, where saving those extra bytes matters.
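
If you want to see this mapping directly in Java (just a tiny standalone snippet, nothing LWJGL-specific), masking with 0xFF gives you the unsigned value that OpenGL ends up reading from the same bit pattern:

byte index = (byte) 200;           // Java stores this as -56 (signed)
int asUnsigned = index & 0xFF;     // 200 -- the value GL_UNSIGNED_BYTE sees
System.out.println(index + " -> " + asUnsigned);   // prints: -56 -> 200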

Cornix

My chain of thought was that maybe the signed bytes in Java, in the range 1 - 127, might be interpreted as 129 - 255 by OpenGL when cast to unsigned bytes.