In the NeHe examples provided with examples.jar there is some weird stuff happening (rendering stuff, mainly) in the demos that use a cube...
If you rotate the cube (using any arrow key) you'll see that it doesn't reflect reality: the back of the cube overlaps the front even though it should be hidden by it... among other things.
I was wondering if anybody else saw this...
I dunno, it might be related to the z-depth or something :roll:
Weird, I ran the same demos in Windows and the scaling of the cube worked fine!
Has anybody else experienced any Linux-related problems when running the NeHe demos (the ones with cubes in them, like Blending, TextureMapping, etc.)?
I'll check more into it later today
edit: I'll try and get a screen capture (if I can learn how to do it) ;)
What distro and Desktop Environment (KDE? GNOME? Bluecurve?)
If KDE, look at KSnapshot.
Also, look here: http://www.troubleshooters.com/linux/scrshot.htm
There are probably a dozen other ways, and none too intuitive, unfortunately (at least none that I've found... I think I use ImageMagick or something).
I've not actually tried running any of the demos under Linux yet (I'm waiting for 0.7 to be finalized, and I'm still putting together my development environment. In other words, I'm too busy goofing off... ;) ).
Good luck!
Nah, I managed to do it (the problem was that I had to launch the Java application after starting the screenshot, you know, when your cursor turns into a crosshair, and then click the window...)
anyhoo, here are the screenies:
(http://openglforums.com/forums/download.php?id=517)
(http://openglforums.com/forums/download.php?id=518)
After some help from the guys at openglforums.com I learned that on Linux (as opposed to Win32) you don't get a depth buffer unless you explicitly request one.
Therefore this line in BaseWindow:
gl = new GL("LWJGL Example", 50, 50, width, height, bits, 0, 0, 0);
should be changed to:
gl = new GL("LWJGL Example", 50, 50, width, height, bits, 0, 16, 0);
(since I'm running in 16-bit depth mode)
Shouldn't/couldn't we set this dynamically?
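One dynamic approach could be to derive the depth-buffer request from the colour depth the window is already created with, instead of hard-coding 0 or 16. A minimal sketch (the helper name and the 16/24 thresholds are my own assumptions, not anything in LWJGL):

```java
// Hypothetical helper (not part of LWJGL): pick the depth-buffer size to
// request from the colour depth, instead of hard-coding 0 or 16.
public class DepthBits {
    static int chooseDepthBits(int colorBits) {
        // A 16-bit desktop rarely supports more than a 16-bit depth buffer;
        // otherwise ask for 24 bits and let the driver round down if needed.
        return (colorBits <= 16) ? 16 : 24;
    }

    public static void main(String[] args) {
        // bits would be the same colour depth BaseWindow already passes to GL
        int bits = 16;
        System.out.println("Requesting " + chooseDepthBits(bits) + "-bit depth buffer");
    }
}
```

The GL constructor call would then pass chooseDepthBits(bits) where it currently passes a literal for the depth-bits argument.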
Is the method getIntegerv(int, int) still usable?
:)
(again, thx to the guys at openglforums.com) :mrgreen:
BaseWindow should really be asking for a 16 bit depth buffer to make sure it runs all examples etc. properly.
glGetInteger() now simply returns a single int in 0.7.
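Given that, once a depth buffer is requested you could sanity-check it at startup with the single-return call. A sketch using a stand-in GL class so it runs without LWJGL (GL_DEPTH_BITS is the standard OpenGL enum; the 0.7 method shape is assumed from the description above):

```java
// FakeGL is a stand-in so this snippet runs without LWJGL; in a real
// program you would make the same call on your GL instance after creation.
class FakeGL {
    static final int GL_DEPTH_BITS = 0x0D56; // standard OpenGL enum value

    int glGetInteger(int pname) {
        // Pretend the driver granted a 16-bit depth buffer
        return pname == GL_DEPTH_BITS ? 16 : 0;
    }
}

public class DepthCheck {
    public static void main(String[] args) {
        FakeGL gl = new FakeGL();
        int depthBits = gl.glGetInteger(FakeGL.GL_DEPTH_BITS);
        if (depthBits == 0) {
            System.err.println("No depth buffer - faces will draw in submission order!");
        } else {
            System.out.println("Depth buffer bits: " + depthBits);
        }
    }
}
```

A zero here would explain exactly the symptom in the screenshots: without a depth buffer, back faces get drawn over front ones.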
Cas :)
Quote from: "princec"
glGetInteger() now simply returns a single int in 0.7.
Oh, good show! :D
ohhh, 0.7 looks like a promising release!
When is it coming out? 8)
Next week, if no show-stoppers are found, but we just released pre 2, which you can use in the meantime.
Yah, I just finished reading the 0.7 pre2 thread!
I'll be checking it out tonight/tomorrow 8)
(then maybe I could get some real coding going on, I need a game idea though :oops: )
Well hurry up and get something good done, coz we're looking for games to publish. Right about now we look like a one-trick pony...
Cas :)