LWJGL Forum

Programming => Lightweight Java Gaming Library => Topic started by: willdenniss on November 27, 2004, 22:04:12

Title: strange problem with LWJGL Xith3D renderer
Post by: willdenniss on November 27, 2004, 22:04:12
I'm seeing some strange issues with the LWJGL port of the renderer for Xith3D.

This image is meant to be a cube:
(http://www.xith.org/tutes/filestore/xith3d-lwjgl.jpg)

It should look like this:
(http://xith.org/demo/HelloXith3D_ss.jpg)

Any ideas?

It looks like we've got something reversed???  The outline of the cube is correct.

Will.
Title: strange problem with LWJGL Xith3D renderer
Post by: elias on November 28, 2004, 04:00:17
Looks like someone forgot to ask for depth buffer bits.

- elias
Title: strange problem with LWJGL Xith3D renderer
Post by: cfmdobbie on November 28, 2004, 06:33:28
There are a few things that could be causing that.

I seem to recall some craziness with winding order in Xith3D - can't quite remember, though...
Title: strange problem with LWJGL Xith3D renderer
Post by: willdenniss on November 28, 2004, 19:33:53
Thanks!  It was the depth bits.

I'm glad it wasn't one of the other options; I had hoped something so major would not have changed since your initial port.

It was fixed by changing the call to Display.create() to:

Display.create(new PixelFormat(0, 16, 0)); // 0 alpha, 16 depth, 0 stencil bits


I read this thread: http://lwjgl.org/forum/viewtopic.php?t=656&highlight=depth+buffer+bits (which referenced this one: http://www.gamedev.net/community/forums/topic.asp?topic_id=258086) for details on how to accomplish this.
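
For anyone who hits the same thing, here's a minimal sketch of the full init sequence. It's untested as posted, and the window setup and loop structure are just illustrative (they may differ slightly between LWJGL versions), but the PixelFormat line is the part that matters:

import org.lwjgl.LWJGLException;
import org.lwjgl.opengl.Display;
import org.lwjgl.opengl.DisplayMode;
import org.lwjgl.opengl.GL11;
import org.lwjgl.opengl.PixelFormat;

public class DepthBufferInit {
    public static void main(String[] args) throws LWJGLException {
        Display.setDisplayMode(new DisplayMode(640, 480));
        // Ask for a minimum of 0 alpha, 16 depth and 0 stencil bits.
        Display.create(new PixelFormat(0, 16, 0));
        GL11.glEnable(GL11.GL_DEPTH_TEST);

        while (!Display.isCloseRequested()) {
            // Clear the depth buffer along with the colour buffer every frame.
            GL11.glClear(GL11.GL_COLOR_BUFFER_BIT | GL11.GL_DEPTH_BUFFER_BIT);
            // ... render the scene here ...
            Display.update();
        }
        Display.destroy();
    }
}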

Some further questions:

1. Is it always necessary to specify a custom PixelFormat?
2. What is the default behaviour? I can't see any mention of it in the docs (http://www.lwjgl.org/javadoc/org/lwjgl/opengl/Display.html#create()).
3. If by default the depth bits are set, is this a bug in the OSX port?
4. If by default the depth bits are not set, what use is that method (wouldn't everyone need the depth bits set)?
5. Should I set the other pixel format bits for alpha and stencil?
6. Do most people have unique pixel format requirements, or is there a common configuration?
Title: strange problem with LWJGL Xith3D renderer
Post by: elias on November 29, 2004, 03:15:16
Quote from: "willdenniss"
1. Is it always necessary to specify a custom PixelFormat?
2. What is the default behaviour? I can't see any mention of it in the docs (http://www.lwjgl.org/javadoc/org/lwjgl/opengl/Display.html#create()).
Yes, if you want to specify minimum pixel capabilities. The defaults impose no requirements (0 for the alpha, depth and stencil bit depths).
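
In other words, unless I'm misreading the source, these two calls are currently equivalent:

Display.create();
Display.create(new PixelFormat()); // no minimum alpha, depth or stencil bits required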

Quote from: "willdenniss"
3. If by default the depth bits are set, is this a bug in the OSX port?
4. If by default the depth bits are not set, what use is that method (wouldn't everyone need the depth bits set)?
There's probably not a bug in the OSX port. More likely, your particular OpenGL driver chooses to give you no depth buffer, since you (indirectly, through the defaults) specified a required depth bit depth of 0.
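
If you want to see what the driver actually handed you, you can query it right after Display.create() - something like this (the singular glGetInteger convenience method in GL11 returns the first value directly):

System.out.println("alpha bits:   " + GL11.glGetInteger(GL11.GL_ALPHA_BITS));
System.out.println("depth bits:   " + GL11.glGetInteger(GL11.GL_DEPTH_BITS));
System.out.println("stencil bits: " + GL11.glGetInteger(GL11.GL_STENCIL_BITS));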

Quote from: "willdenniss"
5. Should I set the other pixel format bits for alpha and stencil?
6. Do most people have unique pixel format requirements, or is there a common configuration?
Yes, if you need them. Most people need some sort of depth buffer, but it varies whether alpha and stencil bits are needed.
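
For illustration, a request like the following is fairly common when alpha and stencil are wanted - the numbers are minimums, not guarantees, and whether the driver can honour them depends on the hardware:

Display.create(new PixelFormat(8, 24, 8)); // 8 alpha, 24 depth, 8 stencil bits minimum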

- elias
Title: strange problem with LWJGL Xith3D renderer
Post by: elias on November 29, 2004, 03:16:50
I've just committed a change that makes the default no-arg PixelFormat constructor specify a minimum of 8 bits depth buffer.

- elias
Title: strange problem with LWJGL Xith3D renderer
Post by: willdenniss on November 29, 2004, 06:15:57
Quote from: "elias"I've just committed a change that makes the default no-arg PixelFormat constructor specify a minimum of 8 bits depth buffer.

- elias

Thanks for your answers and the change.

It might be helpful to mention the PixelFormat stuff in the docs for create().

Will.
Title: strange problem with LWJGL Xith3D renderer
Post by: princec on November 29, 2004, 07:20:50
It would indeed be a good idea if we went through all the Javadoc comments and made sure they were very comprehensive. We're perilously close to 1.0 now.

Cas :)