[Solved] Problem with creating textures using IntBuffer

Please help me. I cannot understand what I am doing wrong (the result is a white square instead of a chessboard):

Code: [Select]
package examples;

import java.nio.ByteBuffer;
import java.nio.IntBuffer;
import java.util.logging.Level;
import java.util.logging.Logger;
import org.lwjgl.LWJGLException;
import org.lwjgl.opengl.Display;
import org.lwjgl.opengl.DisplayMode;
import static org.lwjgl.opengl.GL11.*;

public class Test {
    public static void main(String[] argv) {
        try {
            Display.setDisplayMode( new DisplayMode( 800, 600 ) );
            Display.create();
        } catch ( LWJGLException ex ) {
            Logger.getLogger( Test.class.getName() ).log( Level.SEVERE, null, ex );
        }

        glShadeModel( GL_SMOOTH );
        glDisable( GL_DEPTH_TEST );
        glDisable( GL_LIGHTING );

        glEnable( GL_BLEND );
        glBlendFunc( GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA );
        glEnable( GL_TEXTURE_2D );

        glMatrixMode( GL_PROJECTION );
        glLoadIdentity();
        glOrtho( 0d, 800, 600, 0d, -1d, 1d );
        glMatrixMode( GL_MODELVIEW );

        glClear( GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT );
        glClearDepth( 1 );

        IntBuffer buffer = ByteBuffer.allocateDirect( 4 * 64 * 64 ).asIntBuffer();
        for( int y = 0; y < 64; y++ ) {
            for( int x = 0; x < 64; x++ ) {
                int color = 0xFFFFFF;
                if( ( y + x ) % 2 == 1 ) color = 0;
                buffer.put( x + y * 64, color | 0xFF000000 );
            }
        }

        int textureID = glGenTextures();
        glBindTexture( GL_TEXTURE_2D, textureID );
        glTexSubImage2D( GL_TEXTURE_2D, 0, 0, 0, 64, 64, GL_RGBA, GL_BYTE, buffer );

        glBegin( GL_QUADS );

        glTexCoord2d( 0d, 0d );
        glVertex2d( 200, 100 );
        glTexCoord2d( 1d, 0d );
        glVertex2d( 600, 100 );
        glTexCoord2d( 1d, 1d );
        glVertex2d( 600, 500 );
        glTexCoord2d( 0d, 1d );
        glVertex2d( 200, 500 );

        glEnd();

        Display.update();

        while( !Display.isCloseRequested() ) Display.processMessages();
    }
}
« Last Edit: June 07, 2013, 03:36:26 by Matt Merkulov »

Re: Problem with creating textures using IntBuffer
« Reply #1 on: June 06, 2013, 08:15:38 »
Integer conversion is always a problem: Java only has signed int's, while C code can use both signed and unsigned integers, so the most significant bit is the sign bit in two's-complement form. For the same reason I also avoid GL_BYTE and use GL_UNSIGNED_BYTE for all textures.
« Last Edit: June 06, 2013, 08:30:22 by broumbroum »
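To illustrate the signed-byte pitfall in a minimal, self-contained sketch (this example is not from the original thread): Java's byte type is signed, so the bit pattern for unsigned 255 reads back as -1, and masking with & 0xFF recovers the unsigned value that GL_UNSIGNED_BYTE expects.

```java
public class SignedByteDemo {
    public static void main(String[] args) {
        byte b = (byte) 0xFF;          // the bit pattern for unsigned 255
        System.out.println(b);         // prints -1: Java bytes are signed
        System.out.println(b & 0xFF);  // prints 255: mask to recover the unsigned value
    }
}
```

The same bit pattern gives 255 or -1 depending on interpretation, which is why telling GL the data is GL_BYTE (signed) instead of GL_UNSIGNED_BYTE remaps the channel values.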

Offline quew8
Re: Problem with creating textures using IntBuffer
« Reply #2 on: June 06, 2013, 16:36:42 »
I.e. use "0x000000" instead of "0". But don't do it anyway.

If you're interested, I have a Colour class which stores a colour as 4 floats, but it has a nested (is it inner or nested? whichever one isn't static) class called ByteColour (guess what it does) which you can then put directly into a ByteBuffer and use with GL_UNSIGNED_BYTE. It is pretty simple, but as an example:

Code: [Select]
public class Colour {
    private float red;
    private float green;
    private float blue;
    private float alpha;
    
    @Override
    public String toString() {
        return "Colour:" + red + ", " + green + ", " + blue + ", " + alpha;
    }

    public class ByteColour {
        private byte red, green, blue, alpha;
        public ByteColour() {
            this.red = (byte)((int)Colour.this.red * 255);
            this.green = (byte)((int)Colour.this.green * 255);
            this.blue = (byte)((int)Colour.this.blue * 255);
            this.alpha = (byte)((int)Colour.this.alpha * 255);
        }

        @Override
        public String toString() {
            return "Byte Colour:" + (red&0xFF) + ", " + (green&0xFF) + ", " + (blue&0xFF) + ", " + (alpha&0xFF);
        }
    }
(Selective Quoting)

Edit: It's "you're" not "your"

Re: Problem with creating textures using IntBuffer
« Reply #3 on: June 06, 2013, 16:54:09 »
Quote from: quew8
I.e. use "0x000000" instead of "0". But don't do it anyway.
Code: [Select]
public class Colour {
  ...
    public class ByteColour {
        private byte red, green, blue, alpha;
        public ByteColour() {
            this.red = (byte)((int)Colour.this.red * 255);
            this.green = (byte)((int)Colour.this.green * 255);
            this.blue = (byte)((int)Colour.this.blue * 255);
            this.alpha = (byte)((int)Colour.this.alpha * 255);
        }
    }
...

Hi, I think you should add some debugging output here; that's where the clue is.
1. Because you cast the float to an (int) before multiplying, and colour floats are always in 0...1, the cast truncates every value below 1.0 to zero — a channel ends up as either 0 or 1*255 (which comes out as full brightness :) ).
2. Okay, so let's pick one format to clarify this. The easier way is to forget about the bytes and use only floats or only ints, but not a mix — it's confusing!
3. Getting the conversion from a nested class is an immutable pattern. If the Colour changes over time, the ByteColour should change as well. A plain method like Colour.this.toBytes() would be lighter.
4. Uhh, no four.. ^^
cu

edit: I'm sorry, I thought I was answering Matt, but it was quew8 !!! Surely forget about the bytes..
« Last Edit: June 06, 2013, 17:03:38 by broumbroum »
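A minimal sketch of the conversion-method approach broumbroum suggests (toBytes() and the surrounding Colour fields are assumptions for illustration, not quew8's actual code). Note the explicit brackets so the multiplication happens before the cast — the precedence question debated below — and the clamp to the 0...1 range:

```java
public class Colour {
    private final float red, green, blue, alpha;

    public Colour(float red, float green, float blue, float alpha) {
        this.red = red;
        this.green = green;
        this.blue = blue;
        this.alpha = alpha;
    }

    /** Converts each 0..1 float channel to an unsigned byte 0..255, in RGBA order. */
    public byte[] toBytes() {
        return new byte[] { toByte(red), toByte(green), toByte(blue), toByte(alpha) };
    }

    private static byte toByte(float channel) {
        // Clamp first, then scale, then cast. Without brackets, a unary cast
        // like (int)channel * 255 binds to the float alone, truncating 0..1
        // to 0 or 1 before the multiply.
        float clamped = Math.max(0f, Math.min(1f, channel));
        return (byte) Math.round(clamped * 255f);
    }
}
```

For example, new Colour(0f, 1f, 0f, 1f).toBytes() yields the signed bytes {0, -1, 0, -1}, i.e. 0, 255, 0, 255 when read as unsigned — exactly what GL_UNSIGNED_BYTE expects.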

Offline quew8
Re: Problem with creating textures using IntBuffer
« Reply #4 on: June 06, 2013, 20:20:55 »
Last I checked, this worked. I'm pretty sure that (red * 255) is what gets cast to int rather than red itself (I know there aren't brackets, but I think that's how it works). I suppose it could be that I've never tried it with decimal values for the RGBAs (I haven't used this in a very long time), but I'd say that's unlikely, and I'm fairly sure I remember using it with a lovely tone of green.

You're quite right about the whole nested class vs. METHOD thing (since we're being technical :D — it could be a function, but that's probably bad OOP). However, I did say it was simple (simple-minded rather than simple in implementation), and it suited my purposes well at the time. Basically I wanted to be able to manipulate the colour easily, which for me means floats; the only time I needed the byte values was to stick them into a texture, so I passed the Colour into the texture loading function and created a ByteColour instance there.

To conclude, you're right, but I'm not going to listen to you. :P Although you have got me worried about that whole casting issue so I'm going to check up on that when I have time. I'm actually horrified at myself to find I didn't put clarifying brackets in in the first place. It was a VERY long time ago.

Re: Problem with creating textures using IntBuffer
« Reply #5 on: June 06, 2013, 23:39:17 »
Found the solution in the Slick-util code. I should have set the texture filters:

Code: [Select]
glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR );
glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR );
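For reference, a sketch of the upload step with the thread's fixes combined (this fragment assumes a current GL context and the checkerboard buffer from the first post; it is not the poster's exact final code). The filters matter because the default GL_TEXTURE_MIN_FILTER is GL_NEAREST_MIPMAP_LINEAR, which expects mipmaps; with none uploaded the texture is incomplete and samples as white. Using glTexImage2D rather than glTexSubImage2D actually allocates the texture storage, and GL_UNSIGNED_BYTE is the pixel type per reply #1:

```
int textureID = glGenTextures();
glBindTexture( GL_TEXTURE_2D, textureID );

// Without these, the incomplete mipmap chain makes the texture sample as white.
glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR );
glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR );

// glTexImage2D (not glTexSubImage2D) allocates level 0 and uploads the pixels.
glTexImage2D( GL_TEXTURE_2D, 0, GL_RGBA, 64, 64, 0, GL_RGBA, GL_UNSIGNED_BYTE, buffer );
```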