I'm having a weird error when drawing a background, and I can't figure it out. Instead of clearing the color buffer at the beginning of the render, I render a quad over everything behind it. The idea is that the quad can either simply have a color tied to it (black by default) or have a texture mapped onto it as well.
The setup works fantastically when I'm mapping a texture to the quad, but for some reason nothing renders when I'm not using an image. Here's the basic code:
public void render()
{
    GL11.glMatrixMode(GL11.GL_MODELVIEW);
    GL11.glLoadIdentity();
    drawBackground();
    // The rest of the render call works fine.
}

public void drawBackground()
{
    if(getTransparency() > 0) // The background can be transparent. This allows for a weak "motion blur" effect when the color buffer isn't cleared.
    {
        double screenWidth = Display.getDisplayMode().getWidth();
        double screenHeight = Display.getDisplayMode().getHeight();
        getColor().bind(); // This is the color the image is set to. If the background has an image, it's colorized appropriately.
        GL11.glTranslated(screenWidth / 2, screenHeight / 2, 0);
        if(rotateWithCamera)
        {
            GL11.glRotatef(-DataManager.getCurrentCamera().getRotation(), 0, 0, 1f);
        }
        if(!useImage) // useImage is a boolean that is true when the background has a valid image, and false otherwise.
        {
            GL11.glBegin(GL11.GL_QUADS);
            GL11.glVertex2f(-width / 2, -height / 2);
            GL11.glVertex2f(width / 2, -height / 2);
            GL11.glVertex2f(width / 2, height / 2);
            GL11.glVertex2f(-width / 2, height / 2);
            GL11.glEnd();
        }
        else
        {
            Image skin = getImage(); // Wraps a Texture from the Slick-Util library.
            skin.bind();
            GL11.glBegin(GL11.GL_QUADS);
            GL11.glTexCoord2f(0, 0);
            GL11.glVertex2f(-width / 2, -height / 2);
            GL11.glTexCoord2f(skin.getWidthRatio(), 0);
            GL11.glVertex2f(width / 2, -height / 2);
            GL11.glTexCoord2f(skin.getWidthRatio(), skin.getHeightRatio());
            GL11.glVertex2f(width / 2, height / 2);
            GL11.glTexCoord2f(0, skin.getHeightRatio());
            GL11.glVertex2f(-width / 2, height / 2);
            GL11.glEnd();
        }
        if(rotateWithCamera)
        {
            GL11.glRotatef(DataManager.getCurrentCamera().getRotation(), 0, 0, 1f);
        }
        GL11.glTranslated(-screenWidth / 2, -screenHeight / 2, 0);
        GL11.glLoadIdentity();
    }
}
Am I messing something up in the render process when useImage is false?
glEnable/glDisable GL_TEXTURE_2D around your draws when you use the fixed-function pipeline - or, if you're using shaders, bind the correct shader. :)
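To illustrate: with the fixed-function pipeline, texturing is global state, so if GL_TEXTURE_2D is still enabled from an earlier textured draw, an untextured quad submitted without texture coordinates gets colored from whatever texture is still bound rather than from the plain color. A minimal sketch of the fix applied to the two branches of drawBackground (assuming the rest of your renderer expects texturing to be left enabled; it can't run standalone since it needs an active GL context):

```java
if (!useImage)
{
    // Texturing may still be enabled from a previous textured draw.
    // Turn it off so the quad is filled with the bound color alone.
    GL11.glDisable(GL11.GL_TEXTURE_2D);
    GL11.glBegin(GL11.GL_QUADS);
    GL11.glVertex2f(-width / 2, -height / 2);
    GL11.glVertex2f(width / 2, -height / 2);
    GL11.glVertex2f(width / 2, height / 2);
    GL11.glVertex2f(-width / 2, height / 2);
    GL11.glEnd();
    // Restore the flag for subsequent textured draws.
    GL11.glEnable(GL11.GL_TEXTURE_2D);
}
else
{
    GL11.glEnable(GL11.GL_TEXTURE_2D); // harmless if it was already on
    Image skin = getImage();
    skin.bind();
    // ... textured quad exactly as in the question ...
}
```

An alternative design is to query-and-restore with GL11.glIsEnabled(GL11.GL_TEXTURE_2D) instead of unconditionally re-enabling, which avoids baking an assumption about surrounding state into this method.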
That fixed it.
Thanks!