
Depth buffer weirdness


Offline middy

Depth buffer weirdness
« on: July 27, 2004, 11:09:53 »
Well, I had some trouble with depth testing (or so I found out). In LWJGL you have to set a minimum number of bits per pixel for the depth buffer at window-creation time.

Window.create(GAME_TITLE, modes[mode].bpp, 0, 2, 0, 0);

I had not done that and got some weirdness on my home computer (GeForce3 Ti 200): vertices behind others were being drawn in front.

But on my laptop (Radeon 9700 Mobile) it worked fine, without those faults, with the minimum bits per pixel set to 0. So is that minimum bits per pixel graphics-card dependent?

Anyway, here is the original thread:
http://www.gamedev.net/community/forums/topic.asp?topic_id=258086

Offline spasi

Depth buffer weirdness
« Reply #1 on: July 27, 2004, 11:15:06 »
You can use 16 or 24 bits as a minimum. Pretty much any card out there supports such depth precisions. It's just a matter of the performance/precision you need.
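For reference, asking for 16 depth bits in the call quoted in the first post would look roughly like this. This is only a sketch: the argument names below are assumptions based on middy's description (the Window.create signature varies between LWJGL versions), so check the javadoc for your version.

Window.create(GAME_TITLE,      // window title
              modes[mode].bpp, // colour bits per pixel
              0,               // minimum alpha bits (assumed)
              16,              // minimum depth-buffer bits (16 or 24)
              0,               // minimum stencil bits (assumed)
              0);              // samples (assumed)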


Offline middy

Think you got me wrong
« Reply #2 on: July 27, 2004, 11:19:49 »
Well, the problem is not the value.

The problem is that even when it is set to 0, it isn't a problem on some cards; apparently they have a default value?

Offline spasi

Depth buffer weirdness
« Reply #3 on: July 27, 2004, 11:25:30 »
IIRC, the bits specified in display creation are hints to the minimum your app needs. If the driver/card doesn't support that minimum (why would the radeon not support 0 bits for depth?), it may give you a higher one.
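One way to see what the driver actually gave you is to query OpenGL after the window has been created. A minimal sketch (in the 2004-era LWJGL API the getter may only exist in an IntBuffer form rather than returning an int, so adjust to your version):

import org.lwjgl.opengl.GL11;

// Ask the driver how many depth-buffer bits it actually allocated.
int depthBits = GL11.glGetInteger(GL11.GL_DEPTH_BITS);
System.out.println("Depth buffer bits: " + depthBits);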


Offline middy

beats me
« Reply #4 on: July 27, 2004, 11:32:41 »
Yep, it makes no sense... A value of 16... damn, my app drops 100 FPS when I do that (with no visual difference) :)