Lightweight Java Gaming Library
June 23, 2003, 19:29:44
With LWJGL 0.6,
I search through the available display modes for one with a width of 800, a height of 600, 16 bpp, and any frequency (don't care).
After setting the display mode, I print out the info of the current display, and then I go to create a GL object with the current bpp.
When I ask for 16-bit colour depth I actually get: 800,600,24,75
and an error, because GL wants a higher bit depth.
Huh, OK, so I ask for 800,600,32,-, and I get: 800,600,24,85
and no error while creating a GL object. Hmm.
Is it impossible to get 16 bit?
Are my video drivers screwy?
And why is there a big black border around the screen in fullscreen mode?
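For reference, the mode search described above amounts to something like the following. This is a minimal self-contained sketch of the selection logic only; `Mode` and `ModePicker.pick` are hypothetical stand-ins for illustration, not the actual LWJGL 0.6 API, whose class and field names differ.

```java
import java.util.List;

// Hypothetical stand-in for a display mode; the real LWJGL 0.6
// class and field names may differ.
final class Mode {
    final int width, height, bpp, freq;

    Mode(int width, int height, int bpp, int freq) {
        this.width = width;
        this.height = height;
        this.bpp = bpp;
        this.freq = freq;
    }

    @Override
    public String toString() {
        return width + "," + height + "," + bpp + "," + freq;
    }
}

final class ModePicker {
    // Return the first mode matching width, height and bpp exactly;
    // a negative requested frequency means "don't care" (mirroring the
    // 800,600,16,- request above). Returns null if nothing matches.
    static Mode pick(List<Mode> available, int w, int h, int bpp, int freq) {
        for (Mode m : available) {
            if (m.width == w && m.height == h && m.bpp == bpp
                    && (freq < 0 || m.freq == freq)) {
                return m;
            }
        }
        return null;
    }
}
```

If a search like this returns no match for 16 bpp, the driver simply isn't reporting any 16-bit 800x600 mode, which would be one explanation for ending up on 24 bpp instead.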
Reply #1 on: June 23, 2003, 19:49:26
I'll make a stab in the dark here and say you're running Linux? If that's the case, any mode you ask for has to be set up in your XF86Config.
The fact that you're getting a display with a lower bit depth than what you ask for is a bit disturbing. And I don't know about any black borders - I've never seen one.
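Concretely, that means the depth and resolution both have to be listed in the "Screen" section of the X server config, usually /etc/X11/XF86Config, roughly like this (a sketch only; the exact file path, identifiers, and device/monitor lines depend on your distribution and setup):

```
Section "Screen"
    Identifier   "Screen0"
    DefaultDepth 16
    SubSection "Display"
        Depth 16
        Modes "800x600" "640x480"
    EndSubSection
EndSection
```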
Reply #2 on: June 23, 2003, 22:26:37
I'll take a guess that you may be on a laptop? Laptop screens support a single native resolution, nothing bigger or smaller. When you request a smaller resolution than they support, they can either stretch the display to fit the screen, or center the display and put a black border around it.
You should be able to change that behaviour in your BIOS, or in a hardware util provided by your manufacturer.
Reply #3 on: June 24, 2003, 12:29:22
I don't think he's running Linux - note the frequencies returned. The Linux lwjgl always returns 0 Hz.
Play Tribal Trouble
Reply #4 on: June 24, 2003, 17:45:59
Huh, I didn't know that. My own software never checks the frequency. The 24-bit colour is what made me suspect Linux. My Windows box only lists "High Color" (16 bpp) and "True Color" (32 bpp) as viable options, so I wasn't sure how one might set a Windows box to 24 bpp. However, my Linux box (RH 9) seems to be set to 24 bpp by default.
Reply #5 on: June 24, 2003, 18:51:07
Yes, it seems that graphics cards under Windows are a little inconsistent regarding bit depth; I have seen cards report a 24-bit depth in the control center.
Reply #6 on: June 25, 2003, 09:21:06
I think Matrox are the culprits.
The software engineers in that company should be strung up by the toenails and spanked with the biggest Nvidia card available.
Puppygames - Play Revenge of the Titans here!
Reply #7 on: June 25, 2003, 18:47:12
Hmm... sorry, actually I use WinXP with a classic GF2 MX. I should have told you that.
I use driver version 29.42 because newer versions tend to work far from perfectly on my card. I'm wondering if I can blame all this on the actual hardware implementation (PowerColor), though, because another XP PC nearby with a GF2 MX 400 (a slightly different card, from a different manufacturer) and the same driver set doesn't have these troubles.
I also sometimes see a lot of rendering glitches in other OpenGL programs (games, mostly) compared to the other PC,
but... that's probably just a lack of processing power.
I'm going to download different drivers though (again, sigh) (18.8 MB, argh)
and check for other display modes that do this.
Hope you guys have an idea about this.