LWJGL Forum

Programming => OpenGL => Topic started by: bitsNbytes on October 10, 2007, 16:28:15

Title: Differences between Windows and Linux?
Post by: bitsNbytes on October 10, 2007, 16:28:15
Hi forum,

I'm currently developing some 3D applications in Java with LWJGL under Windows. Recently I tried them out on Linux. Everything worked fine except the stencil function: the stencil test always passes, no matter whether I set the stencil function to GL_EQUAL, 0 or GL_NOTEQUAL, 0 (but the stencil value can't be zero and not zero at the same time?!).
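For reference, the relevant calls look roughly like this (a simplified sketch; the all-ones mask is just illustrative, not the exact value from my code):

Code:
// enable stencil testing, then set one of the two functions mentioned above
GL11.glEnable(GL11.GL_STENCIL_TEST);
GL11.glStencilFunc(GL11.GL_EQUAL, 0, ~0);    // pass only where stencil == 0
// ...or alternatively:
GL11.glStencilFunc(GL11.GL_NOTEQUAL, 0, ~0); // pass only where stencil != 0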
I wonder if there could be a difference between the Windows and Linux bindings. I really don't think it is a Linux problem, because there is no error, I have the EXT_stencil_wrap extension available, and stencil shadows seem to work in Warsow / Doom 3.

Do I have to change my code (the Red Book doesn't distinguish between Windows and the X Window System on Linux)?
Could it really be the system (and if so, why)?

BTW: I'm using LWJGL version 0.99 on both OSes.

Thanks for any help!

(and excuse my bad English :))
Title: Re: Differences between Windows and Linux?
Post by: kappa on October 10, 2007, 18:13:09
LWJGL 0.99 :o , damn that's old (almost 2 years). It could be a bug that has been fixed since, so I'd recommend you upgrade to the latest version. It shouldn't be too difficult, as I don't think many API changes have been made since then.
Title: Re: Differences between Windows and Linux?
Post by: bitsNbytes on October 11, 2007, 11:00:19
Hmm, I didn't know it was so old,

but I still have the same problem (with 1.1.2).

Perhaps I have to create the display explicitly with a stencil buffer, but how?

Thank you anyway!
Title: Re: Differences between Windows and Linux?
Post by: bitsNbytes on October 11, 2007, 11:29:04
Perhaps I've found the problem:

I can only create a 24-bits-per-pixel display on Linux (32-bit on Windows). And I read somewhere that the last 8 bits of a 32-bit z-buffer are used for the stencil buffer. So if I only have a 24-bit z-buffer, this means I have no stencil bits, right?
Could that be the problem?
But even if it were: how could I solve that problem? Could I change the depth buffer to 16 bits? Or would I lose too much precision?

How can it work in other applications?
Title: Re: Differences between Windows and Linux?
Post by: Fool Running on October 15, 2007, 15:10:08
Quote: "Perhaps I have to create the display explicitly with a stencil buffer, but how?"
Yes, you should create the display with a stencil buffer if you are going to use one. I'm surprised it was working at all if you didn't supply one.
Look at the PixelFormat() constructors; you can pass in the number of stencil bits, and then pass the PixelFormat to Display.create().
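For example, something like this should do it (just a sketch; I'm assuming the PixelFormat(alpha, depth, stencil) constructor here, check the javadocs for the other variants):

Code:
import org.lwjgl.LWJGLException;
import org.lwjgl.opengl.Display;
import org.lwjgl.opengl.DisplayMode;
import org.lwjgl.opengl.PixelFormat;

public class StencilDisplay {
    public static void main(String[] args) throws LWJGLException {
        Display.setDisplayMode(new DisplayMode(800, 600));
        // request 0 alpha bits, a 24-bit depth buffer and 8 stencil bits
        Display.create(new PixelFormat(0, 24, 8));
        // ... game loop ...
        Display.destroy();
    }
}

Don't forget to include GL_STENCIL_BUFFER_BIT in your glClear() call as well.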
Quote: "I can only create a 24-bits-per-pixel display on Linux (32-bit on Windows). And I read somewhere that the last 8 bits of a 32-bit z-buffer are used for the stencil buffer. So if I only have a 24-bit z-buffer, this means I have no stencil bits, right?"
From what I understand, 32bpp on Windows is the same as 24bpp on Linux; Windows just labels it differently. In 3D graphics the other 8 bits are used for alpha values, not for stencil values, so this shouldn't be your problem.
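If you want to see what the driver actually gave you on each system, you can query the buffer sizes after creating the display (assuming the glGetInteger convenience method that returns an int):

Code:
// the color, depth and stencil buffers are reported separately
System.out.println("r/g/b/a bits: "
        + GL11.glGetInteger(GL11.GL_RED_BITS) + "/"
        + GL11.glGetInteger(GL11.GL_GREEN_BITS) + "/"
        + GL11.glGetInteger(GL11.GL_BLUE_BITS) + "/"
        + GL11.glGetInteger(GL11.GL_ALPHA_BITS));
System.out.println("depth bits:   " + GL11.glGetInteger(GL11.GL_DEPTH_BITS));
System.out.println("stencil bits: " + GL11.glGetInteger(GL11.GL_STENCIL_BITS));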

Hope that helps ;D
Title: Re: Differences between Windows and Linux?
Post by: bitsNbytes on October 17, 2007, 17:52:12
Hey,

I'm going to have a look at it tonight :)

and I guess that will help! Thank you!