I'm working on a model loader, so I can choose whether to store my vertex data in big endian or little endian. My code works fine as long as I save in little endian, which is my PC's native byte order. However, if I save as big endian it doesn't work, no matter whether I set the ByteBuffer's byte order to big or little endian; it's as if glBufferDataARB totally ignores the byte order of the buffer. Even when I save my model in little endian and then set the buffer's order to big endian, glBufferDataARB still ignores the setting :/
Help, please :)
I'm pretty sure OpenGL always uses the native byte ordering. Setting the order on a ByteBuffer only affects how values you put into it (or read out of it) with the built-in putFloat/putInt/getFloat/... methods are encoded; it doesn't change how OpenGL reads the raw bytes.
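A minimal sketch of the difference (the class name is just for illustration):

import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class OrderDemo {
    public static void main(String[] args) {
        ByteBuffer be = ByteBuffer.allocate(4).order(ByteOrder.BIG_ENDIAN);
        ByteBuffer le = ByteBuffer.allocate(4).order(ByteOrder.LITTLE_ENDIAN);
        be.putFloat(1.0f); // encoded as 3F 80 00 00
        le.putFloat(1.0f); // encoded as 00 00 80 3F
        for (int i = 0; i < 4; i++) {
            System.out.printf("byte %d: big endian %02X, little endian %02X%n",
                    i, be.get(i) & 0xFF, le.get(i) & 0xFF);
        }
        // The order flag only controls how putFloat encodes the value. glBufferDataARB
        // copies the raw bytes and never looks at the flag, so on a little-endian PC
        // the big-endian bytes are read back on the GPU as a different float.
    }
}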
There is no reason to support any endianness other than the native one in performance-sensitive/critical code; supporting it requires lots of bit fiddling. If performance is not one of your worries, you can simply do:
import java.nio.ByteOrder;
import java.nio.FloatBuffer;
import org.lwjgl.BufferUtils;

public static FloatBuffer ensureDirectNativeEndian(FloatBuffer fb) {
    // Already a direct buffer in the platform's native order: OpenGL can read it as-is.
    if (fb.isDirect() && fb.order() == ByteOrder.nativeOrder()) {
        return fb;
    }
    // put(FloatBuffer) decodes each float using fb's order and re-encodes it in the
    // copy's native order, so this one call does all the bit fiddling for you.
    FloatBuffer copy = BufferUtils.createFloatBuffer(fb.remaining());
    copy.put(fb);
    copy.flip();
    return copy;
}
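And then, as a hypothetical usage sketch (the file path and the Java 7 Files API are my assumptions, not your loader's actual code):

import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;
import java.nio.file.Files;
import java.nio.file.Paths;

// Loads big-endian vertex data from disk and returns a buffer that is safe to
// pass to glBufferDataARB (direct, native byte order).
public static FloatBuffer loadBigEndianVertices(String path) throws IOException {
    byte[] raw = Files.readAllBytes(Paths.get(path));
    FloatBuffer fileView = ByteBuffer.wrap(raw)
            .order(ByteOrder.BIG_ENDIAN)   // matches how the file was saved
            .asFloatBuffer();
    return ensureDirectNativeEndian(fileView);
}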