Newbie texture mapping problem

Started by wanucha, October 07, 2013, 12:50:13


wanucha

I am trying to map a square texture onto a square quad, but it does not work as I expected. If I use a 2048x2048 texture (left image), it's OK. If I use a 200x200 one (right image), the texture does not cover the whole quad and the rest is black. The image format has no effect; if I repeat the texture, each repetition has the same black parts. I also have problems with very small textures (13x28) on rectangular quads: the texture exceeds the quad in some directions and does not cover it in others. The textures seem to load correctly and have the correct dimensions.



GL init:
        glMatrixMode(GL_PROJECTION);
        glLoadIdentity();

        gluPerspective(UnitConverter.radToDeg(camera.getFov() * resY / resX), resX / (float) resY, 0.1f, 100);
        glMatrixMode(GL_MODELVIEW);

        glEnable(GL_TEXTURE_2D);
        glShadeModel(GL_SMOOTH);
        glClearColor(0.0f, 0.5f, 0.0f, 0.5f);
        glClearDepth(1.0f);
        glEnable(GL_DEPTH_TEST);
        glDepthFunc(GL_LEQUAL);

        glEnable(GL_BLEND);
        glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

        glHint(GL_PERSPECTIVE_CORRECTION_HINT, GL_NICEST);


Creating quad:
        floor.bind();
        GL11.glBegin(GL11.GL_QUADS);
        GL11.glTexCoord2f(0, 1);
        GL11.glVertex3f(f1.x, f1.y, f1.z);
        GL11.glTexCoord2f(1, 1);
        GL11.glVertex3f(f2.x, f2.y, f2.z);
        GL11.glTexCoord2f(1, 0);
        GL11.glVertex3f(f3.x, f3.y, f3.z);
        GL11.glTexCoord2f(0, 0);
        GL11.glVertex3f(f4.x, f4.y, f4.z);
        GL11.glEnd();


I downloaded a very simple game in LWJGL and I am reusing its code. I can understand most of it, but I don't have deeper knowledge and I don't know the other options.

How can I fix it, so the texture covers the quad 1:1 in every case?

quew8

Your problem is that many OpenGL implementations don't support non-POT (non-power-of-two) textures. 2048x2048 is fine since 2048 is 2^11, but 200 is not a power of two. The solutions are:

1) Scale your textures up to 256x256 in an external program.
2) Load the texture as 256x256 (with empty padding from 200 to 256). Then use 200 / 256 as the max texture coordinate instead of 1.
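If you go with option 2, the padded size and the adjusted texture coordinates can be computed like this. A minimal sketch in plain Java; the class name `PotPadding` and the helper `nextPowerOfTwo` are illustrative, not part of LWJGL:

```java
public class PotPadding {

    // Smallest power of two that is >= n (n must be positive).
    static int nextPowerOfTwo(int n) {
        int highest = Integer.highestOneBit(n);
        return (highest == n) ? n : highest << 1;
    }

    public static void main(String[] args) {
        int width = 200, height = 200;           // original NPOT texture size
        int potWidth = nextPowerOfTwo(width);    // 256
        int potHeight = nextPowerOfTwo(height);  // 256

        // Max texture coordinates to use instead of 1.0f, so sampling
        // stops at the real image instead of running into the padding.
        float maxU = width / (float) potWidth;
        float maxV = height / (float) potHeight;

        System.out.println(potWidth + "x" + potHeight + " " + maxU + " " + maxV);
    }
}
```

In the quad code above you would then pass `maxU`/`maxV` wherever a texture coordinate of 1 is used, e.g. `GL11.glTexCoord2f(maxU, maxV)` instead of `GL11.glTexCoord2f(1, 1)`.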

wanucha

Thanks. I had googled that this no longer matters, but apparently it does.

quew8

Non-POT texture capabilities are supposed to be core since OpenGL 2.0, however:
Quote
The R300 and R400-based cards (Radeon 9500+ and X500+) are incapable of generic NPOT usage. You can use NPOTs, but only if the texture has no mipmaps.
NV30-based cards (GeForce FX of any kind) are incapable of NPOTs at all, despite implementing OpenGL 2.0 (which requires NPOT). It will do software rendering if you try to use it.
All newer hardware can handle NPOTs of any kind perfectly.

Despite this, I often hear of cases where non-POT textures do not render properly even on new(ish) hardware. So I thoroughly agree with this:
Quote
While modern hardware no longer has the power-of-two limitation on texture dimensions, it is generally a good idea to keep using power-of-two textures unless you specifically need NPOTs. Mip-mapping with such textures can have slightly unintended consequences, compared to the power-of-two case.