I'm seeing some strange issues with the LWJGL port of the renderer for Xith3D.
This image is meant to be a cube:
(http://www.xith.org/tutes/filestore/xith3d-lwjgl.jpg)
It should look like this:
(http://xith.org/demo/HelloXith3D_ss.jpg)
Any ideas?
It looks like we've got something reversed??? The outline of the cube is correct.
Will.
Looks like someone forgot to ask for depth buffer bits.
- elias
Things that could be causing that (see the sketch after this list):
- No depth buffer bits requested
- Depth test not turned on
- Depth test has incorrect function selected
- Polygon winding order of vertices incorrect
- Vertices specified in wrong order
- Incorrect face culling
- Vertices put into buffer in wrong order
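For reference, here's a rough sketch of the GL-side state the last few items refer to, using LWJGL's GL11 bindings (the helper names are made up, and the depth function and culling choices are just the common conventions, not necessarily what Xith3D uses):

import org.lwjgl.opengl.GL11;

public class DepthAndCullingSetup {
    // Call once after the GL context has been created.
    static void initGLState() {
        // Depth test: off by default in OpenGL, so it must be enabled explicitly.
        GL11.glEnable(GL11.GL_DEPTH_TEST);
        GL11.glDepthFunc(GL11.GL_LEQUAL);  // GL_LESS is the GL default

        // Winding order and culling: the front-face convention has to match
        // the order the vertices were put into the buffer.
        GL11.glFrontFace(GL11.GL_CCW);     // counter-clockwise = front (the GL default)
        GL11.glEnable(GL11.GL_CULL_FACE);
        GL11.glCullFace(GL11.GL_BACK);
    }

    // Each frame, clear the depth buffer along with the colour buffer.
    static void clearBuffers() {
        GL11.glClear(GL11.GL_COLOR_BUFFER_BIT | GL11.GL_DEPTH_BUFFER_BIT);
    }
}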
I seem to recall some craziness with winding order in Xith3D - can't quite remember though...
Thanks! It was the depth bits.
I'm glad it wasn't one of the other options; I had hoped something that major wouldn't have changed since your initial port.
It was fixed by changing the call to Display.create() to:
Display.create(new PixelFormat(0, 16, 0));
I read this thread: http://lwjgl.org/forum/viewtopic.php?t=656&highlight=depth+buffer+bits (which referenced this one: http://www.gamedev.net/community/forums/topic.asp?topic_id=258086) for details on how to accomplish this.
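For anyone else who hits this, here's a rough standalone sketch of the fix in context (assuming the LWJGL 2-style windowing API; the class name, window title and 640x480 mode are just placeholders):

import org.lwjgl.LWJGLException;
import org.lwjgl.opengl.Display;
import org.lwjgl.opengl.DisplayMode;
import org.lwjgl.opengl.PixelFormat;

public class DepthBitsExample {
    public static void main(String[] args) throws LWJGLException {
        Display.setDisplayMode(new DisplayMode(640, 480));
        Display.setTitle("depth buffer test");
        // Request a minimum of 16 depth bits (alpha and stencil left at 0).
        Display.create(new PixelFormat(0, 16, 0));

        while (!Display.isCloseRequested()) {
            // ... render the scene here ...
            Display.update();
        }
        Display.destroy();
    }
}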
Some further questions:
1. Is it always necessary to specify a custom PixelFormat?
2. What is the default behaviour? I can't see any mention of it in the docs: http://www.lwjgl.org/javadoc/org/lwjgl/opengl/Display.html#create()
3. If the depth bits are set by default, is this a bug in the OSX port?
4. If the depth bits are not set by default, what use is that method? (Wouldn't everyone need the depth bits set?)
5. Should I set the other pixel format bits for alpha and stencil?
6. Do most people have unique pixel format requirements, or is there a common configuration?
Thanks for your help.
Will.
Quote from: "willdenniss"
1. Is it always necessary to specify a custom PixelFormat?
2. What is the default behaviour? I can't see any mention of it in the docs: http://www.lwjgl.org/javadoc/org/lwjgl/opengl/Display.html#create()
Yes, if you want to specify minimum pixel capabilities. The defaults are no requirements (0 for the alpha, depth and stencil bit depths).
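In other words, something like this sketch (assuming the no-arg create() simply uses a default PixelFormat):

import org.lwjgl.LWJGLException;
import org.lwjgl.opengl.Display;
import org.lwjgl.opengl.PixelFormat;

public class DefaultFormat {
    public static void main(String[] args) throws LWJGLException {
        // No minimum alpha/depth/stencil bits are requested here, so the
        // driver may legally return a context with a 0-bit depth buffer.
        Display.create();
        // Equivalent: Display.create(new PixelFormat());
        Display.destroy();
    }
}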
Quote from: "willdenniss"
3. If the depth bits are set by default, is this a bug in the OSX port?
4. If the depth bits are not set by default, what use is that method? (Wouldn't everyone need the depth bits set?)
There's probably not a bug in the OSX port. More likely, your particular OpenGL driver chooses to give you no depth buffer, since you (indirectly, through the defaults) specified a 0-bit depth requirement.
Quote from: "willdenniss"
5. Should I set the other pixel format bits for alpha and stencil?
6. Do most people have unique pixel format requirements, or is there a common configuration?
Yes, if you need them. Most people need some sort of depth buffer, but whether the alpha and stencil bits are needed varies.
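A common pattern (only a sketch; use whatever bit depths your renderer actually needs) is to try a richer format first and fall back if the driver can't supply it:

import org.lwjgl.LWJGLException;
import org.lwjgl.opengl.Display;
import org.lwjgl.opengl.PixelFormat;

public class FormatFallback {
    public static void main(String[] args) throws LWJGLException {
        try {
            // Preferred: alpha and stencil bits on top of a 24-bit depth buffer.
            Display.create(new PixelFormat(8, 24, 8));
        } catch (LWJGLException e) {
            // Fall back to a plain 16-bit depth buffer.
            Display.create(new PixelFormat(0, 16, 0));
        }
        // ... render ...
        Display.destroy();
    }
}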
- elias
I've just committed a change that makes the default no-arg PixelFormat constructor specify a minimum of 8 bits depth buffer.
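If I'm reading the change right, that means a plain default creation now behaves like this sketch:

import org.lwjgl.LWJGLException;
import org.lwjgl.opengl.Display;
import org.lwjgl.opengl.PixelFormat;

public class NewDefault {
    public static void main(String[] args) throws LWJGLException {
        // With the new default, this now requests a minimum 8-bit depth
        // buffer instead of no depth requirement at all.
        Display.create(new PixelFormat());
        Display.destroy();
    }
}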
- elias
Quote from: "elias"I've just committed a change that makes the default no-arg PixelFormat constructor specify a minimum of 8 bits depth buffer.
- elias
Thanks for your answers and the change.
It might be helpful to mention the PixelFormat stuff in the docs for create().
Will.
It would indeed be a good idea if we went through all the Javadoc comments and made sure they were very comprehensive. We're perilously close to 1.0 now.
Cas :)