Depth buffer weirdness

Started by middy, July 27, 2004, 11:09:53


middy

Well, I had some trouble with depth testing (or so I found out). In LWJGL, one has to set a minimum bits-per-pixel for the depth buffer at window creation time.

Window.create(GAME_TITLE, modes[mode].bpp, 0, 2, 0, 0);

I had not done that and got some weirdness on my home computer (GeForce3 Ti 200): vertices behind others were drawn in front.

But on my laptop (Mobility Radeon 9700) it worked fine, with no faults, even with the minimum bits per pixel set to 0. So is that minimum bits-per-pixel value graphics-card dependent?
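For context, that symptom is what you would expect when the context ends up with no usable depth buffer: the standard depth-test setup has no effect, so geometry drawn later appears in front regardless of distance. A minimal sketch of that setup, assuming the LWJGL GL11 binding (not code from the original post):

import org.lwjgl.opengl.GL11;

public class DepthTestSetup {
    // Standard depth-test setup; it only helps if the context
    // was actually created with a non-zero-bit depth buffer.
    public static void enableDepthTest() {
        GL11.glEnable(GL11.GL_DEPTH_TEST);
        GL11.glDepthFunc(GL11.GL_LEQUAL);
        // Remember to clear the depth buffer each frame as well:
        // GL11.glClear(GL11.GL_COLOR_BUFFER_BIT | GL11.GL_DEPTH_BUFFER_BIT);
    }
}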

Anyway, here is the original thread:
http://www.gamedev.net/community/forums/topic.asp?topic_id=258086

spasi

You can use 16 or 24 bits as a minimum. Pretty much any card out there supports those depth precisions; it's just a matter of the performance/precision trade-off you need.
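For reference, in the later LWJGL 2 API the minimum depth precision is requested through a PixelFormat passed to Display.create, rather than directly in Window.create as in the thread. A minimal sketch under that assumption (PixelFormat's three-argument constructor takes alpha, depth, and stencil bits):

import org.lwjgl.LWJGLException;
import org.lwjgl.opengl.Display;
import org.lwjgl.opengl.DisplayMode;
import org.lwjgl.opengl.PixelFormat;

public class DepthRequestSketch {
    public static void main(String[] args) throws LWJGLException {
        Display.setDisplayMode(new DisplayMode(800, 600));
        // Request at least a 24-bit depth buffer (alpha = 8, depth = 24, stencil = 0).
        // The driver treats these values as minimums and may grant higher precision.
        Display.create(new PixelFormat(8, 24, 0));

        // ... render loop ...

        Display.destroy();
    }
}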

middy

Well, the problem is not the value.

The problem is that even when it is set to 0, it isn't a problem on some cards; apparently they have a default value?

spasi

IIRC, the bits specified at display creation are hints for the minimum your app needs. If the driver/card doesn't support that minimum (why would the Radeon not support 0 bits for depth?), it may give you a higher one.
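One way to see what the driver actually granted (not from this thread, just a quick check you could add after window creation) is to query GL_DEPTH_BITS. A sketch, assuming the LWJGL GL11 binding:

import org.lwjgl.opengl.GL11;

public class DepthBitsCheck {
    // Call this after the OpenGL context has been created.
    public static void printDepthBits() {
        int depthBits = GL11.glGetInteger(GL11.GL_DEPTH_BITS);
        System.out.println("Depth buffer precision actually granted: " + depthBits + " bits");
    }
}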

middy

Yep, it makes no sense... A value of 16... damn, my app drops 100 FPS when I do that (with no visual difference)  :)