Need a little help with text rendering

Started by atom, August 10, 2017, 17:47:26


atom

Hello!

I found that gitbook a few days ago and decided to try to understand more about LWJGL. I've read the book, which is quite amazing, and followed the examples, and when I finished, I wanted more, so I told myself: "Mmmmm, why not extend the example a bit and try something fun like a game?"

So my starting point was the last example from the book: https://github.com/lwjglgamedev/lwjglbook/tree/master/chapter27/c27-p2
I added a few things, deleted some others, and then I felt like printing some text on the screen. And all I could get was a black rectangle...

My whole scene renders well, with all its elements, through its own shader program. My HUD has its own shader program too, which is much simpler, but it doesn't render my text correctly.

What I do to print some text:
- use AWT to load a font and generate a texture from it (works correctly; I tried writing the texture to a PNG file and it's fine; a rough sketch of this step is below)
- compute a quad for each character and choose which part of my texture it should show, putting that into the texture coordinates (works fine; I checked the values just before uploading to the GPU)
- draw the quads on the screen (doesn't work correctly)
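
Roughly, the first step looks something like this (a simplified sketch, not my exact code; the font name and atlas size are just placeholders):

    import java.awt.Color;
    import java.awt.Font;
    import java.awt.Graphics2D;
    import java.awt.RenderingHints;
    import java.awt.image.BufferedImage;

    // Sketch: render the printable ASCII range into one AWT image.
    // Font name and image size are placeholders, not my real values.
    public static BufferedImage bakeFontAtlas() {
        Font font = new Font("Monospaced", Font.PLAIN, 24);
        BufferedImage img = new BufferedImage(512, 512, BufferedImage.TYPE_INT_ARGB);
        Graphics2D g = img.createGraphics();
        g.setRenderingHint(RenderingHints.KEY_TEXT_ANTIALIASING,
                RenderingHints.VALUE_TEXT_ANTIALIAS_ON);
        g.setFont(font);
        g.setColor(Color.WHITE);
        int x = 0, y = g.getFontMetrics().getAscent();
        for (char c = 32; c < 127; c++) {
            int w = g.getFontMetrics().charWidth(c);
            if (x + w > img.getWidth()) { x = 0; y += g.getFontMetrics().getHeight(); }
            g.drawString(String.valueOf(c), x, y);
            x += w; // remember x/y per character to build the texture coordinates later
        }
        g.dispose();
        return img;
    }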

I see a black rectangle where I should see text, so I used glPolygonMode(GL_FRONT_AND_BACK, GL_LINE) to see a bit more. In fact, all my quads are drawn where they should be, so my problem comes from the color or the texture. In my fragment shader, this computes the color:
fragColor = colour * texture(texture_sampler, outTexCoord);

If I change it to
fragColor = colour;

my rectangle isn't black anymore; it's the color I wanted my text to be (colour is a vec4 uniform set from my text material's ambient color, so this makes sense ^^). So my problem is probably in my texture.

So I disabled all my rendering except the HUD, and I still see a blue rectangle (I wanted to print blue text =) ), which leads me to this part of my fragment shader:

texture(texture_sampler, outTexCoord);


If I understand correctly, texture() computes a color from a texture sampler and texture coordinates, so either my texture sampler is wrong or my coordinates are.

texture_sampler is never set, but I read that an unset sampler uniform defaults to 0, which is texture unit GL_TEXTURE0, and my text texture is bound to that unit.
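
For completeness, here's how I understand the explicit version would look (just a sketch; programId and textureId stand in for my real handles):

    import static org.lwjgl.opengl.GL11.*;
    import static org.lwjgl.opengl.GL13.*;
    import static org.lwjgl.opengl.GL20.*;

    // Sketch: point the sampler uniform at texture unit 0 explicitly
    // instead of relying on its default value of 0.
    int samplerLocation = glGetUniformLocation(programId, "texture_sampler");
    glUseProgram(programId);
    glUniform1i(samplerLocation, 0);         // sampler reads from unit 0
    glActiveTexture(GL_TEXTURE0);            // select unit 0...
    glBindTexture(GL_TEXTURE_2D, textureId); // ...and bind the font texture there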

outTexCoord is passed on by my vertex shader:
    outTexCoord = texCoord;

and earlier in the shader:
layout (location=1) in vec2 texCoord;

which comes from my mesh's VAO; the texture coordinates I put there are correct.
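
The attribute itself is filled the usual way (sketch; vaoId, texCoordVboId and texCoords are placeholders for my real objects):

    import static org.lwjgl.opengl.GL11.*;
    import static org.lwjgl.opengl.GL15.*;
    import static org.lwjgl.opengl.GL20.*;
    import static org.lwjgl.opengl.GL30.*;

    // Sketch: upload the texture coordinates into attribute location 1 of the VAO.
    glBindVertexArray(vaoId);
    glBindBuffer(GL_ARRAY_BUFFER, texCoordVboId);
    glBufferData(GL_ARRAY_BUFFER, texCoords, GL_STATIC_DRAW); // texCoords is a float[]
    glEnableVertexAttribArray(1);                             // matches "location=1"
    glVertexAttribPointer(1, 2, GL_FLOAT, false, 0, 0);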

So I'm stuck and don't know where to go from here; any help would be great.

Thank you for your time, and sorry for the bad English ^^

(EDIT: I just realized that maybe I posted in the wrong section; this would perhaps be better in the OpenGL section... apologies)

Kai

Did you set the minification filter of the texture?

The number one error people usually make when they see a black texture is not setting the texture minification filter. The minification filter specifies how the sampler should filter the texture when its texel footprint is smaller than a single pixel on the framebuffer. By default it is GL_NEAREST_MIPMAP_LINEAR, a mipmapping filter, which leaves the texture incomplete (and makes sampling return black) when you do not create a mipmap chain for it.

Try this when you initialize your texture and have it bound to GL_TEXTURE_2D:
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
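
Or, if you actually want mipmapping, keep the default filter but build the mipmap chain after uploading the texture (glGenerateMipmap requires GL 3.0+):

    // After the glTexImage2D upload of level 0:
    glGenerateMipmap(GL_TEXTURE_2D); // builds the mipmap chain the default filter needs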

Cornix

I would recommend getting basic text rendering working first and then gradually advancing it. Start without a custom shader, using immediate mode (glBegin, glEnd) like in the examples. Then try to use VBOs, then add shaders. Step by step. Test your code at every step to see whether it still works correctly.
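
Something like this is enough for a first test (sketch; needs a compatibility profile, and textureId is a placeholder):

    // Immediate-mode sketch: draw one textured quad.
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, textureId);
    glBegin(GL_QUADS);
    glTexCoord2f(0, 0); glVertex2f(-0.5f,  0.5f);
    glTexCoord2f(1, 0); glVertex2f( 0.5f,  0.5f);
    glTexCoord2f(1, 1); glVertex2f( 0.5f, -0.5f);
    glTexCoord2f(0, 1); glVertex2f(-0.5f, -0.5f);
    glEnd();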

atom

Thank you for your answers.

@Kai: I already set the minification filter in my texture class.

@Cornix: You're right, I should have started without shaders. But my shaders for scene rendering work so well that I got tempted :)

I tried something last night. To be sure my font texture was correctly generated, in addition to creating the ByteBuffer my texture is built from, I also write a PNG image, and that PNG is fine. But last night I tried creating my texture from this PNG file instead of the ByteBuffer, and my text renders correctly that way. So I probably have an error in the code I use to convert the font image to the ByteBuffer, but I can't see it  ???
ByteArrayOutputStream out = new ByteArrayOutputStream();
//FileOutputStream out2 = new FileOutputStream("test.png");
ImageIO.write(img, "png", out);
//ImageIO.write(img, "png", out2);
//out2.flush();
out.flush();
ByteBuffer b = ByteBuffer.wrap(out.toByteArray());
b.flip();
texture = new Texture(b);
//texture = new Texture("test.png");
out.close();
//out2.close();


If I use the ByteBuffer b to generate the texture, my text doesn't render; if I use "test.png" to generate my texture, the text is fine.

Cornix

Are you sure you aren't getting the formats mixed up? You need the image data in the right format for OpenGL, or else you get garbage output.

By the way: using AWT (ImageIO, BufferedImage, etc.) together with LWJGL3 is not a good idea. Your code is going to crash on Mac OS, and even on Windows and Linux it will be less efficient because of all the overhead. LWJGL3 comes with bindings to STB to load images into textures and to generate textures for TrueType fonts.
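
For the font case, the STB route looks roughly like this (sketch; the path, pixel height and atlas size are placeholders):

    import java.nio.ByteBuffer;
    import java.nio.file.Files;
    import java.nio.file.Paths;
    import org.lwjgl.BufferUtils;
    import org.lwjgl.stb.STBTTBakedChar;
    import static org.lwjgl.stb.STBTruetype.*;

    // Sketch: bake an ASCII font atlas into a single-channel bitmap with STB.
    byte[] ttfBytes = Files.readAllBytes(Paths.get("font.ttf")); // placeholder path
    ByteBuffer ttf = BufferUtils.createByteBuffer(ttfBytes.length);
    ttf.put(ttfBytes).flip();

    int bitmapW = 512, bitmapH = 512; // placeholder atlas size
    ByteBuffer bitmap = BufferUtils.createByteBuffer(bitmapW * bitmapH);
    STBTTBakedChar.Buffer charData = STBTTBakedChar.malloc(96);
    stbtt_BakeFontBitmap(ttf, 24f, bitmap, bitmapW, bitmapH, 32, charData);
    // bitmap now holds a 1-channel 8-bit image ready for glTexImage2D;
    // charData holds the quad and texcoord data for characters 32..127.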

atom

You're probably right, and I'll look in that direction.

The function I use to create a texture from a file uses STB and works well, so I'll try to move away from AWT everywhere else, as you advise.

Thanks a lot :)

atom

Hello again =)

I now use STB for creating text, and it renders well... on Linux, but not on Windows (I haven't tried OSX).
The STB part works well on both OSes: if I write the ByteBuffer STB packs as a bitmap image, I get a proper bitmap file with all my characters.
But if I create a texture from that same ByteBuffer, it only renders properly on Linux, so there is surely something I do wrong when creating my texture that the Linux driver allows but the Windows one doesn't.
I don't know if it's relevant, but I tried with both an NVIDIA and an AMD GPU.

Here's my texture creation code:
    public Texture(ByteBuffer imageData, int width, int height) {
        this.width = width;
        this.height = height;
        this.id = glGenTextures();
        glBindTexture(GL_TEXTURE_2D, this.id);
        glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, this.width, this.height, 0, GL_ALPHA, GL_UNSIGNED_BYTE, imageData);
    }


and here's the relevant part of my fragment shader:
uniform sampler2D texture_sampler;
uniform vec4 colour;
[...]
    vec4 texcol = texture(texture_sampler, outTexCoord);
    fragColor = vec4(colour.r, colour.g, colour.b, texcol.a);


As advised before, I would have tested with fixed-pipeline functions, but (and that's another problem) I could only manage to get a core profile.

So, as I understand things, STB creates a 1-channel bitmap in my ByteBuffer; that's why I use GL_ALPHA as the format of my glTexImage2D() call. I read that format and internalFormat should match, but if I also set internalFormat to GL_ALPHA, I get the same problem on Linux as on Windows (and in the STBTruetype examples both are GL_ALPHA). Maybe it comes from my use of the texture() function in the fragment shader...

On Linux, the text renders correctly with that code, but on Windows I only get colored squares instead of characters, as if texcol.a were always 1.

atom

@Cornix: you were probably right, I think I had the formats mixed up.

I found a working solution. I started by using the same format and internalFormat in my glTexImage2D() call, then tried different formats, modifying my fragment shader accordingly. With GL_RED, my text renders correctly on both Linux and Windows. So here's what I have now:
glTexImage2D(GL_TEXTURE_2D, 0, GL_RED, this.width, this.height, 0, GL_RED, GL_UNSIGNED_BYTE, imageData);

and :
    vec4 texcol = texture(texture_sampler, outTexCoord).rrrr;
    fragColor = texcol * colour;

It works as expected. As I understand the glTexImage2D() function, the GL_RED format says "I only provide a red channel to create my texture", and the GL_RED internalFormat says "my texture only stores a red channel" (when sampled, the missing green and blue channels read as 0 and alpha reads as 1). So, in the fragment shader, I broadcast the red channel over the others with texture().rrrr, and my red channel becomes complete RGBA information.
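
(Side note: from what I read, the same broadcast can apparently be done once at texture creation with a swizzle mask instead of .rrrr in the shader; sketch, needs GL 3.3+:

    // Make sampling the GL_RED texture return (r, r, r, r) directly.
    int[] swizzle = { GL_RED, GL_RED, GL_RED, GL_RED };
    glTexParameteriv(GL_TEXTURE_2D, GL_TEXTURE_SWIZZLE_RGBA, swizzle);

)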
But, following the same logic, I don't understand why this code doesn't work:
glTexImage2D(GL_TEXTURE_2D, 0, GL_ALPHA, this.width, this.height, 0, GL_ALPHA, GL_UNSIGNED_BYTE, imageData);

and :
    vec4 texcol = texture(texture_sampler, outTexCoord).aaaa;
    fragColor = texcol * colour;