Problem with scaling of textures in 2D

Started by Odin, February 10, 2014, 20:53:34


Odin

Hello

I wanted to start programming a 2D game that is tile-based. The tiles are arranged in tile sets (png files). I separate them in code, which seems to work just fine. But to make it look more retro, I wanted to upscale the tiles from 16x16 to 32x32 at runtime.
I managed to scale them up, but there is a strange bright line between the tiles, though only horizontally. The line is not a single color; it looks to me like some kind of interpolation issue at the edges.
Can you explain this, and, if my guess is right, can I change the way the pictures are scaled?

Thank you very much in advance

Cornix

You can use either nearest filtering or linear filtering for magnification of the images.
If you use linear filtering, it's very likely that the colors at the edges will be a blend between two neighbouring tiles.
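
For example, you could set the filter right after creating and binding the texture (a minimal sketch using the plain GL11 calls; textureID stands for whatever handle your loader returns):

// Nearest filtering: a magnified pixel copies the closest texel instead of
// blending it with its neighbours, which keeps pixel-art edges sharp.
glBindTexture(GL_TEXTURE_2D, textureID); // textureID: placeholder for your own texture handle
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);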

Odin

Thank you very much for the fast answer, but it didn't solve my problem.

I should probably show an image of what's wrong.

[screenshot: the upscaled tiles, with thin bright lines running horizontally along the tile edges]

Cornix

How do you draw your tiles? Do you use immediate mode? Do you scale them up by using matrix multiplication, or do you do it on the client side?
It could very well be that those lines come from imprecise floating-point arithmetic. It's always safer to do the scaling on the GPU.
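
For example, instead of computing the scaled positions yourself, you could push a scale matrix and draw the quads at their native size (just a sketch; drawTile16 is a placeholder for your own per-tile drawing code):

glPushMatrix();
glScalef(2.0f, 2.0f, 1.0f); // the GPU upscales every 16x16 quad to 32x32
drawTile16(i, j, x, y);     // placeholder: draws one tile at its native 16x16 size
glPopMatrix();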

Odin

I scale it the following way:
// dx, dy: number of d1-sized tiles per row/column in the source image
float dx = texture.getImageWidth() / (float) d1;
float dy = texture.getImageHeight() / (float) d1;
// fx, fy: maximum texture coordinates of the image, i.e. the fraction of the
// (possibly padded) OpenGL texture that the png actually covers,
// so fx/dx is the width of a single tile in texture coordinates
float fx = texture.getWidth();
float fy = texture.getHeight();
glBegin(GL_QUADS);
	// (i, j): index of the tile in the set; (x, y): top-left corner on screen
	glTexCoord2f(i*fx/dx, j*fy/dy);
	glVertex2i(x, y);
	glTexCoord2f((i+1)*fx/dx, j*fy/dy);
	glVertex2i(x+d2, y);
	glTexCoord2f((i+1)*fx/dx, (j+1)*fy/dy);
	glVertex2i(x+d2, y+d2);
	glTexCoord2f(i*fx/dx, (j+1)*fy/dy);
	glVertex2i(x, y+d2);
glEnd();

A tile is d1 x d1 pixels in the image I load, and it should be d2 x d2 pixels on the screen.
Since d1 = 16 and d2 = 32, which are powers of 2, I don't think there should be any floating-point imprecision.
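For instance, assuming (just as an example) a 256x256 tileset png that fills its whole texture, fx would be 1.0 and dx = 256 / 16 = 16, so every tile border lands on a multiple of 1/16 = 0.0625, which binary floating point represents exactly.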