Attempting to create a texture from a ByteBuffer
« on: December 30, 2011, 01:05:08 »
I am trying to do something which is probably very simple. I would like to create a texture simply by making an array of RGB values and then displaying them. However, I am unable to get even a simple setup to work (just putting in constant byte values).

I have a Rendering class which is basically as follows:

Code:
private int TextureID;
private ByteBuffer TextureBuffer;

public void setMapTexture()
{
    TextureBuffer = BufferUtils.createByteBuffer(worldwidth*worldheight*3);
    TextureBuffer.order(ByteOrder.nativeOrder());
    TextureID = glGenTextures();
    for(int i=0; i<worldwidth*worldheight; i++)
    {
        TextureBuffer.put((byte) 255);
        TextureBuffer.put((byte) 255);
        TextureBuffer.put((byte) 0);
    }
    TextureBuffer.flip();
    glBindTexture(GL_TEXTURE_2D, TextureID);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, worldwidth, worldheight, 0, GL_RGB, GL_UNSIGNED_BYTE, TextureBuffer);
}

public void renderMap()
{
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glClearColor(1.0f, 0.0f, 1.0f, 1.0f);
    glLoadIdentity();

    glBindTexture(GL_TEXTURE_2D, TextureID);

    glBegin(GL_QUADS);
    glVertex3f(0.0f, 0.0f, -10.0f);
    glTexCoord2f(0.0f, 0.0f);

    glVertex3f(512.0f, 0.0f, -10.0f);
    glTexCoord2f(0.0f, 1.0f);

    glVertex3f(512.0f, 512.0f, -10.0f);
    glTexCoord2f(1.0f, 1.0f);

    glVertex3f(0.0f, 512.0f, -10.0f);
    glTexCoord2f(1.0f, 0.0f);
    glEnd();
}

(I have glEnable(GL_TEXTURE_2D) in the main class)

What happens here is that my quad is displayed, but it is white rather than yellow. To check whether the data actually makes it into the texture, I used glGetTexImage to read it back into a separate ByteBuffer, and that seemed to work (i.e. 255, 255, 0 was returned for all values).
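For reference, a minimal sketch of that read-back check, assuming the same static GL11 imports and the fields from the code above:

Code:
// Read the level-0 texture data back into a separate buffer and inspect a few bytes.
ByteBuffer readback = BufferUtils.createByteBuffer(worldwidth*worldheight*3);
glBindTexture(GL_TEXTURE_2D, TextureID);
glGetTexImage(GL_TEXTURE_2D, 0, GL_RGB, GL_UNSIGNED_BYTE, readback);
// Expect 255, 255, 0 for the first texel (get() returns signed bytes, so mask with 0xFF).
System.out.println((readback.get(0) & 0xFF) + ", " + (readback.get(1) & 0xFF) + ", " + (readback.get(2) & 0xFF));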

Sorry for the wall of code. Any help would be appreciated, as most stuff on the internet seems to be directed at loading .pngs or .jpegs to textures.

Re: Attempting to create a texture from a ByteBuffer
« Reply #1 on: December 31, 2011, 00:02:26 »
I guess it's just an OpenGL issue. Does glEnable(GL_TEXTURE_2D) before rendering help?

Re: Attempting to create a texture from a ByteBuffer
« Reply #2 on: December 31, 2011, 10:30:29 »
Your glTexCoord2f calls should come before the glVertex3f calls. The vertex call is what emits the vertex, so all vertex attributes need to be set before it. However, this does not explain the issue on its own.
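In other words, each corner should be a texture coordinate followed by a vertex. A sketch using the coordinates from the code above:

Code:
glBegin(GL_QUADS);
// glTexCoord2f sets the texture coordinate that the next glVertex3f call emits,
// so it has to come first for each corner.
glTexCoord2f(0.0f, 0.0f); glVertex3f(0.0f, 0.0f, -10.0f);
glTexCoord2f(0.0f, 1.0f); glVertex3f(512.0f, 0.0f, -10.0f);
glTexCoord2f(1.0f, 1.0f); glVertex3f(512.0f, 512.0f, -10.0f);
glTexCoord2f(1.0f, 0.0f); glVertex3f(0.0f, 512.0f, -10.0f);
glEnd();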

You should use a lowercase first letter for variable names; it makes the code much easier for others to read. TextureID should be textureID. Again, not a solution.

You should always have a Util.checkGLError() in there, at least when debugging. It may find the problem.
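Something like this (a sketch; Util here is org.lwjgl.opengl.Util, which throws an OpenGLException if glGetError() reports anything):

Code:
glBindTexture(GL_TEXTURE_2D, TextureID);
Util.checkGLError(); // throws if the bind produced a GL error
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, worldwidth, worldheight, 0, GL_RGB, GL_UNSIGNED_BYTE, TextureBuffer);
Util.checkGLError(); // throws if the upload produced a GL error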

Since I have very similar code and it works, I don't think this is a GL problem. I mean, how many textures would be broken in games? I am putting money on the texture not being properly enabled. I would try calling glEnable(GL_TEXTURE_2D) in the render loop, just to check.

Also, the byte order of a buffer created by BufferUtils is already set correctly, so the order() call is redundant.

Re: Attempting to create a texture from a ByteBuffer
« Reply #3 on: January 02, 2012, 00:38:40 »
I have swapped the texture coordinate and vertex call order, thanks for pointing that out. I have also put the glEnable(GL_TEXTURE_2D) call in the loop, just in case, but it hasn't helped. I have also put in calls to Util.checkGLError(), but that hasn't thrown any errors.

I have also tried setting the buffer order to BIG_ENDIAN and to LITTLE_ENDIAN; neither threw any errors, and both gave the same result as before.

I shall make a new project with just this functionality and nothing else to try and narrow down the possibilities.

EDIT:

Here is a self-contained project which shows everything I am doing, in a hopefully readable fashion. Help would be appreciated, as I still cannot see what is wrong. The only thing I know I'm not sure about is the position of glBindTexture relative to glTexImage2D.

Code:
import static org.lwjgl.opengl.GL11.*;

import java.nio.ByteBuffer;
import java.nio.ByteOrder;

import org.lwjgl.BufferUtils;
import org.lwjgl.LWJGLException;
import org.lwjgl.opengl.Display;
import org.lwjgl.opengl.DisplayMode;

public class Main
{
    public static int screenwidth = 1024;
    public static int screenheight = 1024;
    private int textureid;
    private ByteBuffer texturebuffer;

    public void start()
    {
        try
        {
            Display.setDisplayModeAndFullscreen(new DisplayMode(screenwidth, screenheight));
            Display.create();
        }
        catch(LWJGLException e)
        {
            e.printStackTrace();
            System.exit(0);
        }

        // initiate OpenGL
        glMatrixMode(GL_PROJECTION);
        glLoadIdentity();
        glOrtho(-2, 2, 2, -2, -5, 500);
        glMatrixMode(GL_MODELVIEW);
        glEnable(GL_TEXTURE_2D);

        setTexture();
        glLoadIdentity();

        while(!Display.isCloseRequested())
        {
            if(Display.isActive())
            {
                render();
            }
            Display.update();
        }
    }

    public void setTexture() // Should make an entirely yellow texture?
    {
        textureid = glGenTextures();
        texturebuffer = BufferUtils.createByteBuffer(screenwidth*screenheight*3/4);
        texturebuffer.order(ByteOrder.nativeOrder());

        for(int i=0; i<screenwidth*screenheight/4; i++)
        {
            texturebuffer.put((byte) 255);
            texturebuffer.put((byte) 255);
            texturebuffer.put((byte) 0);
        }
        texturebuffer.flip();
        glBindTexture(GL_TEXTURE_2D, textureid);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, screenwidth/2, screenheight/2, 0, GL_RGB, GL_UNSIGNED_BYTE, texturebuffer);
    }

    public void render()
    {
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
        glClearColor(1.0f, 0.0f, 1.0f, 1.0f);

        glBindTexture(GL_TEXTURE_2D, textureid);

        glBegin(GL_QUADS);

        glTexCoord2f(0.0f, 0.0f);
        glVertex3f(-1f, -1f, -10.0f);

        glTexCoord2f(0.0f, 1.0f);
        glVertex3f(1f, -1f, -10.0f);

        glTexCoord2f(1.0f, 1.0f);
        glVertex3f(1f, 1f, -10.0f);

        glTexCoord2f(1.0f, 0.0f);
        glVertex3f(-1f, 1f, -10.0f);

        glEnd();
    }

    public static void main(String[] args)
    {
        Main screen = new Main();
        screen.start();
    }
}
« Last Edit: January 02, 2012, 02:59:53 by Swiffdy »

Re: Attempting to create a texture from a ByteBuffer
« Reply #4 on: January 02, 2012, 09:03:12 »
So I loaded it into my IDE and got it to work by adding the following two lines (the lines marked with + are the additions, the others are context):
Code:
          texturebuffer.flip();
          glBindTexture(GL_TEXTURE_2D, textureid);
+       glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_MIN_FILTER,GL_LINEAR);
+       glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_MAG_FILTER,GL_LINEAR);
          glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, screenwidth, screenheight, 0, GL_RGB, GL_UNSIGNED_BYTE, texturebuffer);
These go just after you bind the texture for the first time (i.e. when creating it). This is a bit odd; I would think it should work without them, but it does not here, so perhaps it is something you must set before textures work properly. FWIW, I always have this code. Oh, and I removed the odd divides by 4 and 2 everywhere.
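For completeness, a sketch of the whole texture setup with those two lines in place (filters set before the upload):

Code:
textureid = glGenTextures();
glBindTexture(GL_TEXTURE_2D, textureid);
// Filter parameters applied to the currently bound texture, before the data upload.
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, screenwidth, screenheight, 0, GL_RGB, GL_UNSIGNED_BYTE, texturebuffer);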

Another point to consider: you define the screen size and then just try to create a fullscreen context with it. This will fail if that display mode does not exist, which it often won't. It's better to use the current desktop mode for fullscreen.
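For example (a sketch; in LWJGL 2 the desktop mode can be queried directly):

Code:
// Use the current desktop display mode for fullscreen instead of inventing one;
// this avoids failing when e.g. 1024x1024 is not a mode the display supports.
DisplayMode desktop = Display.getDesktopDisplayMode();
Display.setDisplayModeAndFullscreen(desktop);
Display.create();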


Re: Attempting to create a texture from a ByteBuffer
« Reply #6 on: January 02, 2012, 10:43:15 »
Ahh. I assumed there was a sensible default. I have almost always used mipmaps, so the default worked for me. But like I said, I now always include this code; I don't like depending on defaults.
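(For reference: the default GL_TEXTURE_MIN_FILTER is GL_NEAREST_MIPMAP_LINEAR, which expects a full mipmap chain, so a texture with only level 0 uploaded is incomplete and the quad renders as if untextured. If you would rather keep the mipmapping default, one option in LWJGL 2 is to build the chain with org.lwjgl.util.glu.GLU instead of the plain glTexImage2D upload, roughly like this, if I remember the GLU signature right:)

Code:
glBindTexture(GL_TEXTURE_2D, textureid);
// Builds and uploads all mipmap levels from the level-0 data in the buffer.
GLU.gluBuild2DMipmaps(GL_TEXTURE_2D, GL_RGB, screenwidth, screenheight,
        GL_RGB, GL_UNSIGNED_BYTE, texturebuffer);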

Re: Attempting to create a texture from a ByteBuffer
« Reply #7 on: January 02, 2012, 12:39:45 »
Aha, thanks for that! I shall definitely remember that in the future.