
16 bit textures

Started by psiegel, July 07, 2003, 12:18:23


psiegel

Ultimately this question concerns OpenGL more than LWJGL, so I thought I'd post it here.  I noticed that many games (Alien Flux included) offer lower resolution (16 bit) graphics to enhance performance on slower machines.  I'm wondering what the general process for doing this is.

Are the source images loaded differently?  Is it simply a matter of using GL.RGBA4 instead of GL.RGBA in calls to GL.texImage2D?  If not, is there an easy way to transform a 32 bit image into 16 bit before loading it with texImage2D, to avoid shipping multiple versions of the source image?
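For reference, the call being asked about would look roughly like this. It's only a sketch using LWJGL's GL11-style names rather than the GL class mentioned above; pixels, width and height are placeholders for a 32-bit RGBA image already loaded into a direct ByteBuffer.

    import java.nio.ByteBuffer;
    import org.lwjgl.opengl.GL11;

    public class TextureUpload {
        // Upload the same 32-bit source data, but ask GL to store it in a
        // 16-bit internal format by passing GL_RGBA4 instead of GL_RGBA as
        // the internalformat argument. Assumes a texture is already bound
        // to GL_TEXTURE_2D.
        static void upload16Bit(ByteBuffer pixels, int width, int height) {
            GL11.glTexImage2D(GL11.GL_TEXTURE_2D, 0, GL11.GL_RGBA4,
                    width, height, 0, GL11.GL_RGBA, GL11.GL_UNSIGNED_BYTE, pixels);
        }
    }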

Paul

princec

No, we just store 24/32 bit images on disk and upload them to GL as 24/32 bit images. GL is then supposed to convert them to whatever internal format it thinks is best and render them in whatever format the colour buffer uses.
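As a sketch of the upload described above, assuming the GL11-style LWJGL bindings and a texture already bound to GL_TEXTURE_2D, you can also ask GL which internal format it actually picked:

    import java.nio.ByteBuffer;
    import org.lwjgl.opengl.GL11;

    public class DefaultUpload {
        // Hand GL the 32-bit image with a generic GL_RGBA internal format and
        // let the driver choose how to store it, then query what it chose.
        static int uploadAndReport(ByteBuffer pixels, int width, int height) {
            GL11.glTexImage2D(GL11.GL_TEXTURE_2D, 0, GL11.GL_RGBA,
                    width, height, 0, GL11.GL_RGBA, GL11.GL_UNSIGNED_BYTE, pixels);
            // Returns the internal format GL actually selected for mip level 0.
            return GL11.glGetTexLevelParameteri(GL11.GL_TEXTURE_2D, 0,
                    GL11.GL_TEXTURE_INTERNAL_FORMAT);
        }
    }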

Cas :)

psiegel

What does the "Use LowRes Graphics" option in Alien Flux do then?

princec

That switches the screen resolution down to 640x480 (or thereabouts). It's a help when you've got a fill-rate limited card like an original TNT card or Voodoo, which can't push nearly as many pixels as modern stuff.
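Dropping the display mode rather than the texture format might look something like this. It's a sketch against the later LWJGL 2 Display API (the API at the time of this thread differed), with the 640x480 target taken from the post above:

    import org.lwjgl.LWJGLException;
    import org.lwjgl.opengl.Display;
    import org.lwjgl.opengl.DisplayMode;

    public class LowResMode {
        // Look for a 640x480 fullscreen mode and switch to it, so a
        // fill-rate limited card has far fewer pixels to push each frame.
        static void enterLowRes() throws LWJGLException {
            for (DisplayMode mode : Display.getAvailableDisplayModes()) {
                if (mode.getWidth() == 640 && mode.getHeight() == 480) {
                    Display.setDisplayMode(mode);
                    Display.setFullscreen(true);
                    break;
                }
            }
            Display.create();
        }
    }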

Cas :)

psiegel

Aha.  Very interesting.

I guess then my real question would be: what are games like Neverwinter Nights doing when they offer "16 bit texture packs" or "32 bit texture packs"?  I always go with the biggest number myself, but I have no idea what the real difference is.  Perhaps in a game that size it really is just the source graphics they're talking about, since the lower bit depth would reduce hard disk usage.

Of course, I wouldn't expect you to actually know the answer to that.  I suppose you're as qualified to make a guess as I am though.  :)

But the important info for me is that I can just load up my 32-bit source graphics and set the Display to something lower, and OpenGL will just figure it out.  That's certainly a relief.

Paul