Problem with integer texture

Started by lwjgl2015, November 02, 2015, 12:55:14


lwjgl2015

I am trying to use integer textures. I have attached a complete test program for reference (minus a couple of straightforward utility classes) but the important bits are as follows.

Texture creation (the value is the same throughout the texture):
    private static int createTexture( int texSize, short texValue, boolean ui )
    {
        ByteBuffer texData = ByteBuffer.allocateDirect( texSize * texSize * 2 );
        ShortBuffer sb = texData.asShortBuffer();
        for ( int i = 0, n = texSize * texSize; i < n; i++) sb.put( texValue );
        texData.rewind();

        int texId = GL11.glGenTextures();
        GL11.glBindTexture( GL11.GL_TEXTURE_2D, texId );
        GL11.glTexParameteri( GL11.GL_TEXTURE_2D, GL12.GL_TEXTURE_BASE_LEVEL, 0 );
        GL11.glTexParameteri( GL11.GL_TEXTURE_2D, GL12.GL_TEXTURE_MAX_LEVEL,  0 );
        GL11.glTexParameteri( GL11.GL_TEXTURE_2D, GL11.GL_TEXTURE_MAG_FILTER, GL11.GL_NEAREST );
        GL11.glTexParameteri( GL11.GL_TEXTURE_2D, GL11.GL_TEXTURE_MIN_FILTER, GL11.GL_NEAREST );
            
        GL11.glTexImage2D( 
                GL11.GL_TEXTURE_2D,
                0,
                ui ? GL30.GL_R16UI : GL30.GL_R16,
                texSize,
                texSize,
                0,
                ui ? GL30.GL_RED_INTEGER : GL11.GL_RED, 
                GL11.GL_UNSIGNED_SHORT,
                texData );

        return texId;
    }


Texture usage (fragment shader; un_ui denotes whether I am sampling the unsigned integer texture or the normalized one):
#version 150

uniform sampler2D un_tex;
uniform usampler2D un_texu;
uniform bool un_ui; 
in vec2 pass_coord;
out vec4 out_color;

void main() {
    float r = un_ui ? float(texture(un_texu, pass_coord).r) / 65535 : texture(un_tex, pass_coord).r;
    out_color = vec4(r,r,r,1);
}


I bind the texture to un_tex or un_texu, depending on the un_ui flag, then draw a quad that fills the viewport and read the grayscale value off the screen.
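
In outline, that binding step looks like this (a simplified sketch; program, texId and ui stand in for the omitted utility code):
    // Sketch only: un_tex (sampler2D) reads from texture unit 0 and
    // un_texu (usampler2D) from unit 1; un_ui tells the shader which one to use.
    GL20.glUseProgram( program );
    GL20.glUniform1i( GL20.glGetUniformLocation( program, "un_tex"  ), 0 );
    GL20.glUniform1i( GL20.glGetUniformLocation( program, "un_texu" ), 1 );
    GL20.glUniform1i( GL20.glGetUniformLocation( program, "un_ui"   ), ui ? 1 : 0 );

    GL13.glActiveTexture( ui ? GL13.GL_TEXTURE1 : GL13.GL_TEXTURE0 );
    GL11.glBindTexture( GL11.GL_TEXTURE_2D, texId );
    // ... draw the quad, then read the pixel back with glReadPixels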

Thus, for instance, if I use 20600 as the texture value in the unsigned case, I would expect to see a grayscale value of 255 * (20600 / 65535) ≈ 80.2, i.e., probably 80 on the screen, but I get 120 instead. The only cases where I get the expected value are those where 255 * v / 65535 has no fractional part, e.g., v = 12593 or v = 20560; otherwise the values seem random.

At this point I am completely mystified. I can't see the point at which my code goes wrong.

Kai

Hi,
this is most likely because you do not set the ByteOrder (i.e. endianness) of your ByteBuffer.
You create your ByteBuffer with allocateDirect, which (by default) makes it BIG_ENDIAN ("network byte order").
This affects how multi-byte values (i.e. your shorts) are written to it.
Your computer is most likely little-endian (for example x86), so to make sure your ByteBuffer's order matches your processor's byte order, invoke ByteBuffer.order(ByteOrder.nativeOrder()), or simply use BufferUtils.createByteBuffer, which handles this for you.
(Incidentally, that is also why values such as 12593 = 0x3131 and 20560 = 0x5050 looked correct: both bytes of those shorts are equal, so swapping them changes nothing.)
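
In code, that is a one-line change to the buffer setup (untested sketch; uses java.nio.ByteOrder and org.lwjgl.BufferUtils, with the texSize and texValue parameters from your createTexture):
    ByteBuffer texData = ByteBuffer.allocateDirect( texSize * texSize * 2 )
                                   .order( ByteOrder.nativeOrder() ); // match the CPU's byte order
    ShortBuffer sb = texData.asShortBuffer();
    for ( int i = 0, n = texSize * texSize; i < n; i++ ) sb.put( texValue );
    texData.rewind();

    // or simply let LWJGL allocate it; the native byte order is already applied:
    // ByteBuffer texData = BufferUtils.createByteBuffer( texSize * texSize * 2 );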

lwjgl2015

Yes, that was the problem. Thank you so much. I think I overlooked it because at first I was using the utilities, so everything worked; then I started calling ByteBuffer.allocateDirect() directly and forgot to account for the byte order.

Now, does OpenGL have a way to retrieve the endianness of the GPU? AFAIK it is not necessarily the same as that of the CPU. I googled a bit but did not find anything solid.

Cornix

Quote from: lwjgl2015 on November 02, 2015, 14:04:15
Now, does OpenGL have a way to retrieve the endianness of the GPU? AFAIK it is not necessarily the same as that of the CPU. I googled a bit but did not find anything solid.
It's highly likely that this is something the driver does for you; after all, the driver knows both the GPU and the OS.
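
The data you hand to glTexImage2D is interpreted in the client's byte order; if you ever need to override that, there is also a pixel-store switch for byte swapping during uploads (sketch; GL11 constants as exposed by LWJGL):
    // Tell GL to swap the bytes of multi-byte components (e.g. unsigned shorts)
    // while unpacking the client-side data passed to glTexImage2D:
    GL11.glPixelStorei( GL11.GL_UNPACK_SWAP_BYTES, GL11.GL_TRUE );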

lwjgl2015

Thanks. That sounds reasonable. I will assume that.