LWJGL3 - Ortho Camera Setup

Started by Baystep, January 09, 2015, 09:56:18


Baystep

So I'm almost to the point of pulling my hair out on this. I've been trying to start a simple top-down 2D engine using the new LWJGL3 (so basically just OpenGL). I'm trying to modularize it by separating it into classes. After tracking down some matrix math code (I'm now using LWJGLX for the old LWJGL2 util classes), I'm trying to set up a simple camera and a quad. Before I introduced the matrix code, and even now when I remove the multiplication from the vertex shader, it draws the triangles. So why is this not working? I'm new to OpenGL and matrices, but I get the concept, and even when I calculate it in my head it should still display.

Does anyone have any code or tutorials sitting around on getting this set up to work? I'd like to know how it works rather than just copy magic code.

P.S. I know that the quad isn't correct and that one of the tris is wound backwards; I left it that way so I could tell if the camera was facing the right way.

Main Class:
public void load() {
    vertices = BufferUtils.createFloatBuffer(18);
    vertices.put(new float[] {
            -0.5f,  0.5f,  0.0f,
             0.5f, -0.5f,  0.0f,
            -0.5f, -0.5f,  0.0f,
             0.5f,  0.5f,  0.0f,
             0.5f, -0.5f,  0.0f,
            -0.5f, -0.5f,  0.0f
    });
    vertices.rewind();

    vbo = GL15.glGenBuffers();
    vao = GL30.glGenVertexArrays();

    GL15.glBindBuffer (GL15.GL_ARRAY_BUFFER, vbo);
    GL15.glBufferData (GL15.GL_ARRAY_BUFFER, vertices, GL15.GL_STATIC_DRAW);
    GL30.glBindVertexArray(vao);

    GL20.glEnableVertexAttribArray (0);
    GL15.glBindBuffer (GL15.GL_ARRAY_BUFFER, vbo);
    GL20.glVertexAttribPointer (0, 3, GL_FLOAT, false, 0, 0);

    // Used just to get an identity matrix for the model.
    modelTransform = new Transform();

    // Compiles the default shaders.
    shader = new ShaderProgram();
    shader.CompileShaders();

    glClearColor(0.1f, 0.2f, 0.3f, 0.0f);
    glViewport(0, 0, window.getWidth(), window.getHeight());

    // Sets up a camera class that holds the matrices.
    cam = new Camera(window.getWidth(), window.getHeight());
    cam.setOrthographic(); // Set projection matrix
    cam.setPosition(0, 0, -1f); // Set view matrix

    // Create a buffer to pass the info to the shader uniforms
    matrixBuffer = BufferUtils.createFloatBuffer(16);

    System.out.println("Projection Matrix: \n" + cam.projection.toString());
    System.out.println("View Matrix: \n" + cam.matrix.toString());
    System.out.println("Model Matrix: \n" + modelTransform.matrix.toString());
}

public void render() {
    glClear(GL11.GL_COLOR_BUFFER_BIT);

    shader.Use();
    // Uploads the projection (cam.projection) and view (cam.matrix)
    shader.setCameraMatrices(matrixBuffer, cam);

    // Uploads the model (model.matrix)
    shader.setModelMatrix(matrixBuffer, modelTransform.matrix);

    GL30.glBindVertexArray (vao);
    glDrawArrays (GL_TRIANGLES, 0, 3);

    ShaderProgram.Release(); // shortcut for glUseProgram(0)
}


Camera Class:
public void setOrthographic() {
		this.isPerspective = false;
		
		float left = -(width/2);
		float right = (width/2);
		float top = -(height/2);
		float bottom = (height/2);
		
		this.projection.setIdentity();
		this.projection = MatrixMath.ortho(left, right, bottom, top, near, far);
	}


Ortho function:
public static Matrix4f ortho(float left, float right, float bottom, float top, float zNear, float zFar) {
    Matrix4f m = new Matrix4f();

    m.m00 = 2 / (right - left);
    m.m11 = 2 / (top - bottom);
    m.m22 = -2 / (zFar - zNear);
    m.m30 = -(right + left) / (right - left);
    m.m31 = -(top + bottom) / (top - bottom);
    m.m32 = -(zFar + zNear) / (zFar - zNear);

    return m;
}


Shader Code:
public void setCameraMatrices(FloatBuffer buf, Matrix4f projection, Matrix4f view) {
    	if(ProgramInUse != program) {
    		System.err.println("Attempting to upload matrices to the wrong ShaderProgram!");
    		return;
    	}
    		
    	projection.store(buf); buf.flip();
    	GL20.glUniformMatrix4(projMatrix, false, buf);
    	
    	view.store(buf); buf.flip();
    	GL20.glUniformMatrix4(viewMatrix, false, buf);
}
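For reference, projMatrix and viewMatrix above would be the cached uniform locations. The lookup isn't shown in the post, but it would be done once after linking, roughly like the sketch below (hypothetical; the uniform names are taken from the vertex shader further down):

    // Hypothetical lookup of the uniform locations after the program is linked;
    // the field names mirror the ones used in setCameraMatrices above.
    projMatrix  = GL20.glGetUniformLocation(program, "projectionMatrix");
    viewMatrix  = GL20.glGetUniformLocation(program, "viewMatrix");
    modelMatrix = GL20.glGetUniformLocation(program, "modelMatrix");
    // If any of these come back as -1, the corresponding glUniformMatrix4 call
    // is silently ignored, which is worth ruling out when nothing draws.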


Vertex Shader:
"#version 150\n" +
"uniform mat4 projectionMatrix;\n" +
"uniform mat4 viewMatrix;\n" +
"uniform mat4 modelMatrix;\n" +
"in vec4 in_Position;\n" +
"void main () {\n" +
"  gl_Position = in_Position;\n" +
"  gl_Position = projectionMatrix * viewMatrix * modelMatrix * in_Position;" +
"}";

Debug Output:
Initiating game object...
OpenGL Version: 3.2.9752 Core Profile Forward-Compatible Context
Projection Matrix:
0.003125 0.0 0.0 -0.0
0.0 -0.004166667 0.0 0.0
0.0 0.0 -0.0020001999 -1.0001999
0.0 0.0 0.0 1.0

View Matrix:
1.0 0.0 0.0 0.0
0.0 1.0 0.0 0.0
0.0 0.0 1.0 -1.0
0.0 0.0 0.0 1.0

Model Matrix:
1.0 0.0 0.0 0.0
0.0 1.0 0.0 0.0
0.0 0.0 1.0 0.0
0.0 0.0 0.0 1.0

Closing down...

abcdef

Try changing

GL20.glUniformMatrix4(projMatrix, false, buf);


to

GL20.glUniformMatrix4(projMatrix, true, buf);


OpenGL uses column-major ordering and I think you have row-major.

The alternative is to change
        m.m30 = -(right + left) / (right - left);
        m.m31 = -(top + bottom) / (top - bottom);
        m.m32 = -(zFar + zNear) / (zFar - zNear);

to
        m.m03 = -(right + left) / (right - left);
        m.m13 = -(top + bottom) / (top - bottom);
        m.m23 = -(zFar + zNear) / (zFar - zNear);
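
If you're not sure which convention your Matrix4f/store() combination actually uses, a quick standalone check like this settles it (a rough sketch, assuming LWJGLX keeps LWJGL2's org.lwjgl.util.vector API):

import java.nio.FloatBuffer;
import org.lwjgl.BufferUtils;
import org.lwjgl.util.vector.Matrix4f;
import org.lwjgl.util.vector.Vector3f;

public class MatrixOrderCheck {
    public static void main(String[] args) {
        Matrix4f m = new Matrix4f();            // constructor initialises to identity
        m.translate(new Vector3f(5f, 0f, 0f));  // put a known value in the x-translation slot

        FloatBuffer buf = BufferUtils.createFloatBuffer(16);
        m.store(buf);
        buf.flip();

        // Column-major storage puts the x-translation at index 12 (the start of
        // the fourth column); row-major storage would put it at index 3.
        System.out.println("buf[3]  = " + buf.get(3));
        System.out.println("buf[12] = " + buf.get(12));
    }
}

If the 5.0 shows up at index 12, store() is already handing OpenGL column-major data and the transpose flag should stay false.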

Baystep

Thanks for the reply. That did nothing. I'm using LWJGLX for the Matrix4f and Vector3f classes, and I believe they are column-major, since they are ports of the LWJGL2 util package. Does this projection matrix actually look right at all for ortho, though?


0.003125 0.0 0.0 -0.0
0.0 0.004166667 0.0 -0.0
0.0 0.0 -0.20202021 -1.020202
0.0 0.0 0.0 1.0

Just to refresh, the ortho function is being fed these bounds (I'm only running 640x480 windowed right now):
float left = -(width/2f);
float right = (width/2f);
float top = (height/2f);
float bottom = -(height/2f);
float near = 0.1f;
float far = 10f;
		
this.projection.setIdentity();
this.projection = MatrixMath.ortho(left, right, bottom, top, near, far);
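
For what it's worth, plugging those bounds into the ortho function above gives:

m00 = 2 / (right - left)            =  2 / 640    =  0.003125
m11 = 2 / (top - bottom)            =  2 / 480    =  0.0041667
m22 = -2 / (far - near)             = -2 / 9.9    = -0.2020202
m32 = -(far + near) / (far - near)  = -10.1 / 9.9 = -1.0202020

which is what got printed, so the individual entries at least match the formula.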

abcdef

Another issue I have seen: your shader expects a vec4 but you pass in vec3s, so you might be getting strange w values that aren't intended.

If you want 2D, pass in a vec2 (just the x and y values) and then use vec4(myvec2input, 0.0, 1.0) to multiply with your matrix in the shader, something like the sketch below.
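
A sketch of that 2D variant, keeping the uniform names from the original shader and assuming the attribute stays bound to location 0:

String vertexShader2D =
    "#version 150\n" +
    "uniform mat4 projectionMatrix;\n" +
    "uniform mat4 viewMatrix;\n" +
    "uniform mat4 modelMatrix;\n" +
    "in vec2 in_Position;\n" +
    "void main () {\n" +
    "  gl_Position = projectionMatrix * viewMatrix * modelMatrix * vec4(in_Position, 0.0, 1.0);\n" +
    "}";

// The matching attribute setup then passes two floats per vertex:
GL20.glVertexAttribPointer(0, 2, GL11.GL_FLOAT, false, 0, 0);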


Baystep

Where do you see that? In my vertex shader the matrices are mat4, and the same goes for the uniforms handing them over.

abcdef

I wasn't talking about the matrix.

Vertex Shader : "in vec4 in_Position;\n" +
Java Code : GL20.glVertexAttribPointer (0, 3, GL_FLOAT, false, 0, 0);

The vertex shader expects size 4, but you give it size 3. A bit of a mismatch.
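
If you would rather keep the existing three-float vertex data, the other way to make the two sides line up is to declare the input as vec3 and widen it in the shader (a sketch based on the shader from the first post):

String vertexShaderVec3 =
    "#version 150\n" +
    "uniform mat4 projectionMatrix;\n" +
    "uniform mat4 viewMatrix;\n" +
    "uniform mat4 modelMatrix;\n" +
    "in vec3 in_Position;\n" +
    "void main () {\n" +
    "  gl_Position = projectionMatrix * viewMatrix * modelMatrix * vec4(in_Position, 1.0);\n" +
    "}";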