[Solved] OpenGL shows nothing

Started by FlushFish, January 17, 2014, 19:39:43

Previous topic - Next topic

FlushFish

I have a problem while trying to render a triangle with OpenGL using LWJGL. Nothing is rendered, and no error is thrown. glClear() is working (if I change the clear color, the color changes).

You can find a GLIntercept Log here: http://pastebin.com/RJ7aySgg

This is my OpenGL Initialization-Code:
glEnable(GL_CULL_FACE);
glCullFace(GL_BACK);
glFrontFace(GL_CW);
glEnable(GL_DEPTH_TEST);
glDepthMask(true);
glDepthFunc(GL_LEQUAL);
glDepthRange(0.0f, 1.0f);

glViewport(0, 0, 800, 600); // my display size


After that I bind my shaders:
int vertexShader = glCreateShader(GL_VERTEX_SHADER);
glShaderSource(vertexShader, vertexShaderCode);
glCompileShader(vertexShader);

int fragmentShader = glCreateShader(GL_FRAGMENT_SHADER);
glShaderSource(fragmentShader, fragmentShaderCode);
glCompileShader(fragmentShader);

int program = glCreateProgram();
glAttachShader(program, vertexShader);
glAttachShader(program, fragmentShader);
glBindAttribLocation(program, 0, "vert");
glLinkProgram(program);
glDetachShader(program, vertexShader);
glDetachShader(program, fragmentShader);


These are the shaders:

colored.vert
#version 110

uniform mat4 camera;
uniform mat4 model;

uniform vec4 color;

attribute vec3 vert;


varying vec4 fragColorl;

void main() {
    gl_Position = camera * model * vec4(vert, 1);
    fragColorl = color;
}


colored.frag
#version 110

varying vec4 fragColor;

void main() {
    gl_FragColor = fragColor;
}


Then I create vbo, ibo and vao:
// VBO
FloatBuffer vboBuffer = BufferUtils.createFloatBuffer(9);
vboBuffer.put(new float[] {
     0, 1, 0,
     1, 0, 0,
    -1, 0, 0});
vboBuffer.flip();
vbo = glGenBuffers();
glBindBuffer(GL_ARRAY_BUFFER, vbo);
glBufferData(GL_ARRAY_BUFFER, vboBuffer, GL_STATIC_DRAW);
glBindBuffer(GL_ARRAY_BUFFER, 0);

// IBO
ShortBuffer iboBuffer = BufferUtils.createShortBuffer(3);
iboBuffer.put(new short[] {0, 1, 2});
iboBuffer.flip();
ibo = glGenBuffers();
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, ibo);
glBufferData(GL_ELEMENT_ARRAY_BUFFER, iboBuffer,  GL_STATIC_DRAW);
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, 0);

// VAO
vao = glGenVertexArrays();
glBindVertexArray(vao);
glBindBuffer(GL_ARRAY_BUFFER, vbo);

glEnableVertexAttribArray(glGetAttribLocation(program, "vert"));
glVertexAttribPointer(glGetAttribLocation(program, "vert"), 3, GL_FLOAT, false, 3 * Float.SIZE, 0);

glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, ibo);
glBindVertexArray(0);


Before rendering:
glUseProgram(program);
glUniform4f(glGetUniformLocation(program, "color"), colorR, colorG, colorB, colorA);
glUseProgram(0);


Rendering:
glClearColor(0, 0, 0, 1);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

glUseProgram(program);
FloatBuffer cameraBuffer = BufferUtils.createFloatBuffer(16);
// a simple orthographic camera at the position (0|0|1)
// left: -1; right: 1; bottom: -1; top: 1; zNear: -1; zFar: 1
cameraBuffer.put(new float[] {
    1.0f, 0.0f, 0.0f, 0.0f,
    0.0f, 1.0f, 0.0f, 0.0f,
    0.0f, 0.0f, -1.0f, 0.0f,
    0.0f, 0.0f, 1.0f, 1.0f});
cameraBuffer.flip();
glUniformMatrix4(glGetUniformLocation(program, "camera"), false, cameraBuffer);
FloatBuffer modelBuffer = BufferUtils.createFloatBuffer(16);
// no translation applied, so it's an identity matrix
modelBuffer.put(new float[] {
    1.0f, 0.0f, 0.0f, 0.0f,
    0.0f, 1.0f, 0.0f, 0.0f,
    0.0f, 0.0f, 1.0f, 0.0f,
    0.0f, 0.0f, 0.0f, 1.0f});
modelBuffer.flip();
glUniformMatrix4(glGetUniformLocation(program, "model"), false, modelBuffer);

glBindVertexArray(vao);
glDrawElements(GL_TRIANGLES, 3, GL_UNSIGNED_SHORT, 0);
glBindVertexArray(0);

glUseProgram(0);

// check for OpenGL errors
int error_code = glGetError();
if(error_code != GL_NO_ERROR)
    System.err.print("OpenGL Error: " + gluErrorString(error_code));


Those are all the OpenGL calls I make, in that order.

It would be nice if anybody could help me!  :'(


quew8

I think you are defining your camera buffer in the transpose of what OpenGL expects. Try transposing it; maybe that will get it to work. These errors are so frustrating, I know. I hope for your sake I'm right.

FlushFish

Thanks for your answer, but that doesn't seem to be the issue. OpenGL expects the matrix in column-major order, so
cameraBuffer.put(new float[] {
    1.0f, 0.0f, 0.0f, 0.0f,
    0.0f, 1.0f, 0.0f, 0.0f,
    0.0f, 0.0f, -1.0f, 0.0f,
    0.0f, 0.0f, 1.0f, 1.0f});

would represent this Matrix:
[[1.0 0.0 0.0 0.0]
 [0.0 1.0 0.0 0.0]
 [0.0 0.0 -1.0 1.0]
 [0.0 0.0 0.0 1.0]]

Or am I wrong?
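A quick way to check this without OpenGL at all (a plain-Java sketch; the class and helper names are mine, not from the actual project): fill a float array in the same order as the buffer above and read entry (row, col) via the column-major index `col * 4 + row`.

```java
public class ColumnMajorCheck {
    // Column-major: element (row, col) lives at index col * 4 + row.
    static float get(float[] m, int row, int col) {
        return m[col * 4 + row];
    }

    public static void main(String[] args) {
        // Same values, same order, as the cameraBuffer.put(...) call above.
        float[] camera = {
            1f, 0f, 0f, 0f,
            0f, 1f, 0f, 0f,
            0f, 0f, -1f, 0f,
            0f, 0f, 1f, 1f};
        // Row 2 is (0, 0, -1, 1): the -1 sits on the diagonal,
        // and the 1 is the z translation in the last column.
        System.out.println(get(camera, 2, 2)); // -1.0
        System.out.println(get(camera, 2, 3)); // 1.0
    }
}
```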

This is how this matrix is computed in my code:
public Matrix4f getProjectionMatrix() {
	Matrix4f matrix = new Matrix4f();

	matrix.m00 = 2 / (right - left);
	matrix.m30 = -(right + left) / (right - left);

	matrix.m11 = 2 / (top - bottom);
	matrix.m31 = -(top + bottom) / (top - bottom);

	matrix.m22 = -2 / (zFar - zNear);
	matrix.m32 = -(zFar + zNear) / (zFar - zNear);

	return matrix;
}
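For reference, here is the same orthographic projection written against a plain float[16] instead of LWJGL's Matrix4f (a sketch, assuming mXY in Matrix4f means column X, row Y, which is the order Matrix4f.store() writes). With the bounds from the comment (left/bottom/zNear = -1, right/top/zFar = 1) the diagonal comes out as (1, 1, -1, 1) and all translation terms are zero; the extra 1.0 in the pasted camera buffer would then come from the camera's (0|0|1) view translation on top of this projection.

```java
public class Ortho {
    /** Column-major orthographic projection; element (row, col) at index col * 4 + row. */
    static float[] ortho(float left, float right, float bottom, float top,
                         float zNear, float zFar) {
        float[] m = new float[16];
        m[0]  = 2f / (right - left);              // m00
        m[12] = -(right + left) / (right - left); // m30 (x translation)
        m[5]  = 2f / (top - bottom);              // m11
        m[13] = -(top + bottom) / (top - bottom); // m31 (y translation)
        m[10] = -2f / (zFar - zNear);             // m22
        m[14] = -(zFar + zNear) / (zFar - zNear); // m32 (z translation)
        m[15] = 1f;                               // m33
        return m;
    }

    public static void main(String[] args) {
        float[] m = ortho(-1f, 1f, -1f, 1f, -1f, 1f);
        System.out.println(m[0]);  // 1.0
        System.out.println(m[10]); // -1.0
        System.out.println(m[14]); // 0.0 (no z translation from the projection itself)
    }
}
```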

That is how I am sending it to OpenGL:
FloatBuffer buffer = BufferUtils.createFloatBuffer(16);
matrix.store(buffer);
buffer.flip();
glUniformMatrix4(glGetUniformLocation(program, "camera"), false, buffer);


I hope that helps, but I don't think the issue is in this part...

BionicWave

Just draw a simple shape at (0, 0, 0).
Can you see that?
Disable cull face?
Or maybe you've done all that.
Just my few cents.

FlushFish

Tried what you mentioned... Didn't help :(

But thank you though ;)

quew8

So some techniques I have used in the past to diagnose these kind of problems.

-In the fragment, try just setting the output frag colour to just white and ignore the input variable.
-In the vertex try disabling the model transform/viewing transform/both.

Obviously make sure that the geometry would actually be visible without the viewing and model transforms. If either of these shows up anything, then you know where to look. If not, then it's probably 70% likely there is a problem in the OpenGL state, 30% something wrong in the contents of the VBO. I'm getting these figures from past experience.

I've had another look through the code and I still can't see anything wrong.

FlushFish

Thank you very much for your help, but it still doesn't work. :'(

Here is my complete code:

https://www.dropbox.com/s/ebehqj1bq86ubx8/GLTest.zip

It is an Eclipse project with only one class ;)
I hope that helps finding my error.

Fool Running

Quote from: quew8 on January 21, 2014, 22:11:17
So some techniques I have used in the past to diagnose these kind of problems.

-In the fragment, try just setting the output frag colour to just white and ignore the input variable.
-In the vertex try disabling the model transform/viewing transform/both.

Quote from: FlushFish on January 24, 2014, 21:50:25
Thank you very much for your help, but it still doesn't work. :'(

Changing the fragment shader to output just white made the triangle appear for me. ::)
There is definitely something wrong with getting the color to the fragment shader. I'm trying to figure out what that is.

EDIT: Sorry, I couldn't figure out exactly what the problem is because I ran out of time (I'm at work). I might try again later.
Programmers will, one day, rule the world... and the world won't notice until it's too late. Just testing the marquee option ;D

FlushFish

Thank you very much for your help!

How exactly have you changed the output color of the fragment shader?

This
#version 110

varying vec4 fragColor;

void main() {
    gl_FragColor = vec4(1.0, 1.0, 1.0, 1.0);
}

doesn't work. It does not give an error, but it's still all black.
What am I doing wrong?

Fool Running

Well, I'm not sure what I did yesterday, but today I can't get a triangle to show. :-\ I did however get something to show by commenting out the glDetachShader() calls and making the frag color all white.

The only thing I can think of is to check to make sure that the shaders are linking and compiling properly (call glGetProgrami with GL_LINK_STATUS and glGetProgramInfoLog).

FlushFish

I don't get any errors when checking for compile and linking errors.
This is how I check for compiler errors:
int compileStatus = glGetShaderi(vertexShader, GL_COMPILE_STATUS);
if (compileStatus == GL_FALSE) {
	int infoLogLength = glGetShaderi(vertexShader, GL_INFO_LOG_LENGTH);
	String infoLog = glGetShaderInfoLog(vertexShader, infoLogLength);
	throw new RuntimeException(infoLog);
}

Linking errors:
int status = glGetProgrami(program, GL_LINK_STATUS);
if (status == GL_FALSE) {
	int infoLogLength = glGetProgrami(program, GL_INFO_LOG_LENGTH);
	String infoLog = glGetProgramInfoLog(program, infoLogLength);
	throw new RuntimeException("Program linking failed: " + infoLog);
}


But I have noticed something odd... Is it normal that I don't get a linking error, although I changed the name of a varying in one shader so that it no longer matches the other shader?

quew8

That is normal. There is no requirement for a fragment shader to use/declare a varying value from the vertex shader, and if the vertex shader doesn't set the value of a fragment's varying then its value in the fragment is just undefined. So the compiler just thinks you're talking about different variables.

Otherwise I have nothing useful to add I'm afraid.

FlushFish

Yeah!

My code works!  ;D

I'm so stupid... Float.SIZE is in bits, not in bytes...  ::)
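For anyone finding this thread later: Float.SIZE is 32 (bits), so the stride passed to glVertexAttribPointer was 96 instead of the intended 12 bytes, which means the attribute pointer was reading far past the three vertices. A minimal sketch of the difference (plain Java, no GL needed):

```java
public class StrideCheck {
    public static void main(String[] args) {
        int wrongStride = 3 * Float.SIZE;             // 3 * 32 = 96 -- bits, not bytes!
        int rightStride = 3 * Float.SIZE / Byte.SIZE; // 3 * 32 / 8 = 12 bytes
        System.out.println(wrongStride); // 96
        System.out.println(rightStride); // 12
        // On Java 8+ there is also Float.BYTES, which is 4 directly.
    }
}
```

So the fixed call would pass `3 * Float.SIZE / Byte.SIZE` (or simply 12) as the stride to glVertexAttribPointer.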