I've been stuck on one problem for weeks: building the projection matrix for the two triangles I draw. I am making a 2D game.

I have fiddled with the vertex coordinates, and nothing changes as long as my projection matrix is part of the multiplication chain.

Affected Files:

- Main
- Shader
- ShaderModel
- DefaultShader
- ModelLoader
- Model
- Element
- Renderer
- Projection
- res/shader/default.v
- res/shader/default.f
- res/texture/example.png

Attempting to use 'Windows' from native folder.

Using OpenGL Version: 3.1.0 - Build 9.17.10.3040

Using Shader Version: 1.40 - Intel Build 9.17.10.3040

This is my project, written in Eclipse; you can import the project folder and run it.

http://www.mediafire.com/download/1ptluo03vw3g6f4/2D_Game.zip

I am trying to define pixel space. I have traced the problem specifically to the orthographic matrix.

This is what I use to create the projection matrix and set the viewport:

```java
final Matrix4f projection = Projection.createOrtho(0f, (float) window.getWidth(), 0f, (float) window.getHeight(), 0.1f, 100f);

// glViewport's first two arguments are the lower-left corner of the viewport in window coordinates
GL11.glViewport(-400, -300, 800, 600);
```

Vertex shader:

```glsl
#version 400 core

in vec3 position;
in vec2 texcoord;

out vec2 pass_texcoord;

uniform mat4 matrix_transform;
uniform mat4 matrix_projection;
uniform mat4 matrix_camera;

void main(void) {
    pass_texcoord = texcoord;
    gl_Position = matrix_projection * matrix_camera * matrix_transform * vec4(position.x, position.y, position.z, 1.0);
}
```

Fragment shader:

```glsl
#version 400 core

in vec2 pass_texcoord;

out vec4 out_Color;

uniform sampler2D u_texture;

void main(void) {
    out_Color = texture(u_texture, pass_texcoord);
}
```

My shaders compile and run without errors. Everything is fine and dandy when I leave the projection matrix out: the two triangles draw correctly as a quad.

This is my model. The first array holds the vertex positions, the second holds the texture coordinates (which work fine), and the last holds the indices.

```java
final Model model = ModelLoader.loadModel(
    new float[] {
        0, 10f,   // should we convert this to something different?
        0, 0,
        10f, 0,
        10f, 10f
    }, new float[] {
        0, 0,
        0, 1,
        1, 1,
        1, 0
    }, new int[] {0, 3, 1, 1, 3, 2});
```
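For illustration, the index list above stitches the quad together from two triangles; here is a quick standalone walk-through of how each index selects an (x, y) pair from the flat position array (plain Java, no GL calls; the class and method names are just for this sketch):

```java
public class IndexWalk {
    static final float[] POSITIONS = {0, 10f, 0, 0, 10f, 0, 10f, 10f};
    static final int[] INDICES = {0, 3, 1, 1, 3, 2};

    // Returns the three (x, y) pairs of triangle t as a flat array.
    static float[] triangle(int t) {
        float[] out = new float[6];
        for (int v = 0; v < 3; v++) {
            int i = INDICES[t * 3 + v];            // index into the position list
            out[v * 2] = POSITIONS[i * 2];         // x
            out[v * 2 + 1] = POSITIONS[i * 2 + 1]; // y
        }
        return out;
    }

    public static void main(String[] args) {
        // Every three indices form one triangle.
        for (int t = 0; t < INDICES.length / 3; t++) {
            System.out.println("triangle " + t + ": "
                    + java.util.Arrays.toString(triangle(t)));
        }
    }
}
```

Triangle 0 resolves to (0, 10), (10, 10), (0, 0) and triangle 1 to (0, 0), (10, 10), (10, 0), which together cover the 10x10 quad.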

```java
public static Model loadModel(float[] positions, float[] texture_coords, int[] indices) {
    final int vao = GL30.glGenVertexArrays();
    GL30.glBindVertexArray(vao);

    final int[] vbos = new int[4];
    vbos[0] = store(0, 2, positions);       // attribute 0: positions
    vbos[1] = store(1, 2, texture_coords);  // attribute 1: texture coordinates

    // Index buffer
    vbos[2] = GL15.glGenBuffers();
    GL15.glBindBuffer(GL15.GL_ELEMENT_ARRAY_BUFFER, vbos[2]);
    final IntBuffer ib = BufferUtils.createIntBuffer(indices.length);
    ib.put(indices);
    ib.flip();
    GL15.glBufferData(GL15.GL_ELEMENT_ARRAY_BUFFER, ib, GL15.GL_STATIC_DRAW);

    GL30.glBindVertexArray(0);

    final Model m = new Model(vao, vbos, indices.length);
    Models.add(m);
    return m;
}
```

```java
public static int store(int id, int len, float[] fltarr) {
    final int vbo = GL15.glGenBuffers();
    GL15.glBindBuffer(GL15.GL_ARRAY_BUFFER, vbo);
    final FloatBuffer fb = BufferUtils.createFloatBuffer(fltarr.length);
    fb.put(fltarr);
    fb.flip();
    GL15.glBufferData(GL15.GL_ARRAY_BUFFER, fb, GL15.GL_STATIC_DRAW);
    GL20.glVertexAttribPointer(id, len, GL11.GL_FLOAT, false, 0, 0);
    return vbo;
}
```

This is the ortho setup I have (LWJGL's Matrix4f is column-major, with mXY meaning column X, row Y, so the translation terms go in m30-m32):

```java
public static Matrix4f createOrtho(float left, float right, float bottom, float top, float near, float far) {
    final Matrix4f matrix = new Matrix4f();
    matrix.setIdentity();

    matrix.m00 = 2.0f / (right - left);
    matrix.m01 = 0;
    matrix.m02 = 0;
    matrix.m03 = 0;

    matrix.m10 = 0;
    matrix.m11 = 2.0f / (top - bottom);
    matrix.m12 = 0;
    matrix.m13 = 0;

    matrix.m20 = 0;
    matrix.m21 = 0;
    matrix.m22 = -2.0f / (far - near);
    matrix.m23 = 0;

    matrix.m30 = -(right + left) / (right - left);
    matrix.m31 = -(top + bottom) / (top - bottom);
    matrix.m32 = -(far + near) / (far - near);
    matrix.m33 = 1;

    return matrix;
}
```
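As a sanity check on the math itself, the same ortho formula can be exercised with a plain column-major float[16], independent of LWJGL (the class and helper names here are just for this sketch). It maps the window corners into clip space and also shows where a vertex at z = 0 lands:

```java
public class OrthoCheck {
    // Same formula as the createOrtho above, on a plain
    // column-major float[16] (index = col * 4 + row).
    static float[] ortho(float l, float r, float b, float t, float n, float f) {
        float[] m = new float[16];
        m[0]  = 2f / (r - l);            // m00
        m[5]  = 2f / (t - b);            // m11
        m[10] = -2f / (f - n);           // m22
        m[12] = -(r + l) / (r - l);      // m30 (translation x)
        m[13] = -(t + b) / (t - b);      // m31 (translation y)
        m[14] = -(f + n) / (f - n);      // m32 (translation z)
        m[15] = 1f;                      // m33
        return m;
    }

    // m * (x, y, z, 1)
    static float[] transform(float[] m, float x, float y, float z) {
        return new float[] {
            m[0] * x + m[4] * y + m[8]  * z + m[12],
            m[1] * x + m[5] * y + m[9]  * z + m[13],
            m[2] * x + m[6] * y + m[10] * z + m[14],
            m[3] * x + m[7] * y + m[11] * z + m[15],
        };
    }

    public static void main(String[] args) {
        float[] m = ortho(0f, 800f, 0f, 600f, 0.1f, 100f);
        System.out.println(java.util.Arrays.toString(transform(m, 0f, 0f, 0f)));
        System.out.println(java.util.Arrays.toString(transform(m, 800f, 600f, 0f)));
        // The x/y corners map to -1 and +1 as expected. Note that with
        // near = 0.1, a vertex at z = 0 maps to about -1.002 in z,
        // just outside the [-1, 1] clip range.
    }
}
```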

I see nothing on screen whatsoever - not even a single pixel.

This is how I actually draw.

```java
final Model model = e.getModel();
final ShaderModel shader = e.getShader();

GL30.glBindVertexArray(model.getVao());
GL20.glEnableVertexAttribArray(0);
GL20.glEnableVertexAttribArray(1);

GL20.glUseProgram(shader.getShader().getProgram());
GL13.glActiveTexture(GL13.GL_TEXTURE0);
GL11.glBindTexture(GL11.GL_TEXTURE_2D, e.getTexture().getId());
shader.getShader().bindUniformi(shader.getShader().getUniform("u_texture"), 0);
shader.updateView(projection, e.getMatrix(), camera.getMatrix());

GL11.glDrawElements(GL11.GL_TRIANGLES, model.getVertexCount(), GL11.GL_UNSIGNED_INT, 0);

GL20.glUseProgram(0);
GL20.glDisableVertexAttribArray(0);
GL20.glDisableVertexAttribArray(1);
GL30.glBindVertexArray(0);
```

And my shader model is set up like so:

```java
private int matrix_projection, matrix_transform, matrix_camera;

@Override
public void init() { // yes, it is initialized
    shader.bindAttribute(0, "position");
    shader.bindAttribute(1, "texcoord");
    matrix_projection = shader.getUniform("matrix_projection");
    matrix_transform = shader.getUniform("matrix_transform");
    matrix_camera = shader.getUniform("matrix_camera");
}

@Override // called from the main update loop
public void updateView(Matrix4f projection, Matrix4f transform, Matrix4f camera) {
    shader.bindUniformm(matrix_projection, projection);
    shader.bindUniformm(matrix_transform, transform);
    shader.bindUniformm(matrix_camera, camera);
}
```

If there's anything else you need to know, please ask. I am completely worn out by this bug.

Solution:

Everything was fine, except that I had accidentally broken the super() calls to my parent classes so that they ignored scale.

I switched to a system where the scale determines the width and height of the geometry (which required updating the shaders and the matrix code).

- Previously I believed that a 0-1 scale would be mapped into the range I pass to the orthographic projection; it is not.
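For illustration, here is a minimal standalone sketch of the scale-as-size idea the fix describes: a unit quad (0..1) is scaled by a per-entity width/height and then translated into pixel space, using a plain column-major float[16] (the class, method names, and layout are assumptions for this sketch, not the project's actual classes):

```java
public class ScaleAsSize {
    // Build a 2D transform: scale a unit quad by (w, h), then move it to (x, y).
    // Column-major float[16], index = col * 4 + row.
    static float[] transform(float x, float y, float w, float h) {
        float[] m = new float[16];
        m[0] = w;    // scale x
        m[5] = h;    // scale y
        m[10] = 1f;
        m[12] = x;   // translate x
        m[13] = y;   // translate y
        m[15] = 1f;
        return m;
    }

    // Apply m to a 2D point (px, py), treating z = 0 and w = 1; returns {x, y}.
    static float[] apply(float[] m, float px, float py) {
        return new float[] {
            m[0] * px + m[4] * py + m[12],
            m[1] * px + m[5] * py + m[13],
        };
    }

    public static void main(String[] args) {
        // A 32x32 sprite at pixel position (100, 50): the unit quad's
        // far corner (1, 1) should land at (132, 82) in pixel space.
        float[] m = transform(100f, 50f, 32f, 32f);
        System.out.println(java.util.Arrays.toString(apply(m, 1f, 1f)));
    }
}
```

The orthographic matrix then maps those pixel coordinates into clip space, so the vertex data itself can stay a fixed unit quad.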