
Problem with 2D Graphics / Textured Quad
« on: September 12, 2011, 19:08:13 »
Hi Guys,

I want to make a 2D game using LWJGL. For texture loading I use slick-util. I draw the texture on a simple quad to render images. My problem is that when I render an image, the first row and column of the image's pixels are drawn again at the borders of the quad, which looks like this:

[screenshot: the sprite rendered with its first pixel row/column repeated along the quad's edges]
Even if I use the sample code from http://ninjacave.com/slickutil1, I get the same problem.

Code: [Select]
package blank;

import java.io.IOException;

import org.lwjgl.LWJGLException;
import org.lwjgl.opengl.Display;
import org.lwjgl.opengl.DisplayMode;
import org.lwjgl.opengl.GL11;
import org.newdawn.slick.Color;
import org.newdawn.slick.opengl.Texture;
import org.newdawn.slick.opengl.TextureLoader;
import org.newdawn.slick.util.ResourceLoader;

public class TextureExample {

    /** The texture that will hold the image details */
    private Texture texture;

    /**
     * Start the example
     */
    public void start() {
        initGL(800, 600);
        init();

        while (true) {
            GL11.glClear(GL11.GL_COLOR_BUFFER_BIT);
            render();

            Display.update();
            Display.sync(100);

            if (Display.isCloseRequested()) {
                Display.destroy();
                System.exit(0);
            }
        }
    }

    /**
     * Initialise the GL display
     *
     * @param width The width of the display
     * @param height The height of the display
     */
    private void initGL(int width, int height) {
        try {
            Display.setDisplayMode(new DisplayMode(width, height));
            Display.create();
            Display.setVSyncEnabled(true);
        } catch (LWJGLException e) {
            e.printStackTrace();
            System.exit(0);
        }

        GL11.glEnable(GL11.GL_TEXTURE_2D);

        GL11.glClearColor(0.0f, 0.0f, 0.0f, 0.0f);

        // enable alpha blending
        GL11.glEnable(GL11.GL_BLEND);
        GL11.glBlendFunc(GL11.GL_SRC_ALPHA, GL11.GL_ONE_MINUS_SRC_ALPHA);

        GL11.glViewport(0, 0, width, height);
        GL11.glMatrixMode(GL11.GL_MODELVIEW);

        GL11.glMatrixMode(GL11.GL_PROJECTION);
        GL11.glLoadIdentity();
        GL11.glOrtho(0, width, height, 0, 1, -1);
        GL11.glMatrixMode(GL11.GL_MODELVIEW);
    }

    /**
     * Initialise resources
     */
    public void init() {
        try {
            // load texture from PNG file
            texture = TextureLoader.getTexture("PNG", ResourceLoader.getResourceAsStream("res/megaman.png"));

            System.out.println("Texture loaded: " + texture);
            System.out.println(">> Image width: " + texture.getImageWidth());
            System.out.println(">> Image height: " + texture.getImageHeight());
            System.out.println(">> Texture width: " + texture.getTextureWidth());
            System.out.println(">> Texture height: " + texture.getTextureHeight());
            System.out.println(">> Texture ID: " + texture.getTextureID());
            System.out.println(texture.getWidth());
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    /**
     * draw a quad with the image on it
     */
    public void render() {
        Color.white.bind();
        texture.bind(); // or GL11.glBindTexture(GL11.GL_TEXTURE_2D, texture.getTextureID());

        GL11.glBegin(GL11.GL_QUADS);
        GL11.glTexCoord2f(0, 0);
        GL11.glVertex2f(100, 100);
        GL11.glTexCoord2f(1, 0);
        GL11.glVertex2f(100 + texture.getTextureWidth(), 100);
        GL11.glTexCoord2f(1, 1);
        GL11.glVertex2f(100 + texture.getTextureWidth(), 100 + texture.getTextureHeight());
        GL11.glTexCoord2f(0, 1);
        GL11.glVertex2f(100, 100 + texture.getTextureHeight());
        GL11.glEnd();
    }

    /**
     * Main method
     */
    public static void main(String[] argv) {
        TextureExample textureExample = new TextureExample();
        textureExample.start();
    }
}

It also happens if I convert a BufferedImage to a Texture with slick-util instead of loading a file, roughly like this:
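
(For reference, this is roughly the BufferedImage path I mean; I go through slick-util's BufferedImageUtil, with the exact call sketched from its API, so treat it as illustrative:)

Code: [Select]
// roughly how I create the texture from a BufferedImage
// (needs javax.imageio.ImageIO, java.awt.image.BufferedImage, org.newdawn.slick.util.BufferedImageUtil)
BufferedImage image = ImageIO.read(ResourceLoader.getResourceAsStream("res/megaman.png"));
Texture fromImage = BufferedImageUtil.getTexture("megaman", image);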
Is this a known problem? Am I doing something wrong?

Thanks a lot for your help :)

greetings,
kernelpanix

Re: Problem with 2D Graphics / Textured Quad
« Reply #1 on: September 13, 2011, 02:26:00 »
I think this is pretty common.  I've noticed this for a long time in game development (back when I did DirectX 5+; I hate Windows now...), but have never really figured out the root cause.  It could be OpenGL (version 1.0 specifically); it could be the image loading code (explained below); it could be the drivers; it could be the drawing code; etc.

If your images' width and height are exactly a power of 2, then no problems arise.  I think they also may need to be the same, like 256x256, 64x64, etc.

For images whose width and height aren't a power of 2, the image loading code copies the image into a new image whose width and height are powers of 2 (since they have to be, in order to be used as a texture).  Somehow this ends up messing up the edges.  It usually has to flip the image as well.

There are 2 ways you can fix this:

(1)  Slide the image 1 pixel down and 1 pixel to the right in the file, and make the 1-pixel border around the image transparent.

(2)  Instead of having the image loading code change the width and height when it loads the image, set the width and height in the file to be the same powers of 2:  512x512, 1024x1024, etc.  Make sure to make the rest of the image transparent.  I'm not sure if this fixes it 100% of the time...  (A code sketch of doing this padding is below.)
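
If you'd rather not edit the files by hand, here's a rough sketch of padding an image up to power-of-2 dimensions in code before it ever reaches slick-util; the class and method names are just made up for illustration, and the extra area stays transparent:

Code: [Select]
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;

public final class PowerOfTwoPadding {

    /** Smallest power of two that is >= n (assumes n > 0). */
    static int nextPowerOfTwo(int n) {
        int p = 1;
        while (p < n) {
            p <<= 1;
        }
        return p;
    }

    /** Copies the source image into the top-left corner of a transparent power-of-two canvas. */
    static BufferedImage padToPowerOfTwo(BufferedImage src) {
        int w = nextPowerOfTwo(src.getWidth());
        int h = nextPowerOfTwo(src.getHeight());
        BufferedImage padded = new BufferedImage(w, h, BufferedImage.TYPE_INT_ARGB);
        Graphics2D g = padded.createGraphics();
        g.drawImage(src, 0, 0, null); // the rest of the canvas stays fully transparent
        g.dispose();
        return padded;
    }
}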
« Last Edit: September 13, 2011, 02:29:38 by jediTofu »
cool story, bro

Re: Problem with 2D Graphics / Textured Quad
« Reply #2 on: September 13, 2011, 11:29:36 »
Well, first off, you're drawing a lot of empty space there. Try drawing your quad without the empty space.

I believe it's a rounding problem/error on OpenGL's side. You tell it to draw the whole quad, but because it's forced to round to whole pixels it draws a bit extra. It simply keeps drawing from the opposite corner, since textures tile.
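
(For example, with the default GL_REPEAT wrap mode, sampling at a texture coordinate of 1.004 wraps around to 0.004, i.e. the first column of pixels again.)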

Re: Problem with 2D Graphics / Textured Quad
« Reply #3 on: September 13, 2011, 12:45:15 »
Try using the following in your init code (you may have to bind the texture first since I don't know if the texture is bound after getting created with Slick):
Code: [Select]
GL11.glTexParameteri(GL11.GL_TEXTURE_2D, GL11.GL_TEXTURE_WRAP_S, GL11.GL_CLAMP);
GL11.glTexParameteri(GL11.GL_TEXTURE_2D, GL11.GL_TEXTURE_WRAP_T, GL11.GL_CLAMP);
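
As a side note (just a suggestion, and it needs import org.lwjgl.opengl.GL12): GL_CLAMP_TO_EDGE is usually preferable to GL_CLAMP here, because GL_CLAMP can still blend in the texture's border color at the edges:
Code: [Select]
// the texture must be bound first, e.g. texture.bind()
GL11.glTexParameteri(GL11.GL_TEXTURE_2D, GL11.GL_TEXTURE_WRAP_S, GL12.GL_CLAMP_TO_EDGE);
GL11.glTexParameteri(GL11.GL_TEXTURE_2D, GL11.GL_TEXTURE_WRAP_T, GL12.GL_CLAMP_TO_EDGE);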

I'm terrible at explaining it so try http://www.flipcode.com/archives/Advanced_OpenGL_Texture_Mapping.shtml for images of clamping and repeating that might help.

EDIT: There might be a way to make Slick create the texture this way so it doesn't try repeating.
« Last Edit: September 13, 2011, 12:52:41 by Fool Running »
Programmers will, one day, rule the world... and the world won't notice until it's too late. Just testing the marquee option ;D

Re: Problem with 2D Graphics / Textured Quad
« Reply #4 on: September 13, 2011, 20:28:17 »
Quote
Well, first off, you're drawing a lot of empty space there. Try drawing your quad without the empty space.

I believe it's a rounding problem/error on OpenGL's side. You tell it to draw the whole quad, but because it's forced to round to whole pixels it draws a bit extra. It simply keeps drawing from the opposite corner, since textures tile.

I'm not sure how to draw the quad without the empty space. Is it possible to have textures with different width and height? Or with a width/height that's not a power of 2?

As I just saw, it also renders an incomplete line or two of pixels directly under the image.
I tried using Fool Running's init code, but it didn't change anything :(

Re: Problem with 2D Graphics / Textured Quad
« Reply #5 on: September 13, 2011, 21:22:15 »
Use:

Code: [Select]
GL11.glBegin(GL11.GL_QUADS);
GL11.glTexCoord2f(0, 0);
GL11.glVertex2f(100, 100);

GL11.glTexCoord2f(texture.getWidth(), 0);
GL11.glVertex2f(100+texture.getImageWidth(), 100);

GL11.glTexCoord2f(texture.getWidth(), texture.getHeight());
GL11.glVertex2f(100+texture.getImageWidth(),100+texture.getImageHeight());

GL11.glTexCoord2f(0, texture.getHeight());
GL11.glVertex2f(100,100+texture.getImageHeight());
GL11.glEnd();

I believe that will fix your problem. You're using getTexture[Dimension] and 1.0, instead of getImage[Dimension] and get[Dimension].

getTexture and 1.0 instruct OpenGL to use the edge of the Texture, not the image you loaded onto it. However, you can instruct OpenGL to only render a quad containing the Image you loaded by making calls referencing the Image's relative dimensions on the texture containing it.
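
(To put numbers on it: for a hypothetical 100x150 image, slick-util pads the texture up to 128x256, so getWidth() returns 100/128 = 0.78125 and getHeight() returns 150/256 ≈ 0.586; those are the texture coordinates of the image's right and bottom edges.)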

Re: Problem with 2D Graphics / Textured Quad
« Reply #6 on: September 14, 2011, 09:34:33 »
Quote
Use:

Code: [Select]
GL11.glBegin(GL11.GL_QUADS);
GL11.glTexCoord2f(0, 0);
GL11.glVertex2f(100, 100);

GL11.glTexCoord2f(texture.getWidth(), 0);
GL11.glVertex2f(100+texture.getImageWidth(), 100);

GL11.glTexCoord2f(texture.getWidth(), texture.getHeight());
GL11.glVertex2f(100+texture.getImageWidth(),100+texture.getImageHeight());

GL11.glTexCoord2f(0, texture.getHeight());
GL11.glVertex2f(100,100+texture.getImageHeight());
GL11.glEnd();

I believe that will fix your problem. You're using getTexture[Dimension] and 1.0, instead of getImage[Dimension] and get[Dimension].

getTexture and 1.0 instruct OpenGL to use the edge of the Texture, not the image you loaded onto it. However, you can instruct OpenGL to only render a quad containing the Image you loaded by making calls referencing the Image's relative dimensions on the texture containing it.


Thanks a lot, that fixed it :)