GLSL - matrix4 w value acts like z value

Started by TheBoneJarmer, September 17, 2015, 10:37:05

TheBoneJarmer

Hey guys,

At last, I finally decided to start with modern OpenGL. So far, things work out nicely and my experience with the legacy OpenGL comes in quite handy. However, I am a little stuck with something. Here is what I have so far:

Vertex Shader:
#version 130

uniform vec3 pos = vec3(0,0,0);

void main()
{	
	mat4 position = mat4(1.0);
	position[3].x = pos.x;
	position[3].y = pos.y;
	position[3].z = pos.z;
	
	gl_FrontColor = vec4(1,0,0,1);
	gl_Position = gl_ModelViewProjectionMatrix * position * gl_Vertex;
}


Fragment Shader:
#version 130

void main()
{	
    gl_FragColor = gl_Color;
}


What I am trying to do here is to create a translation matrix. According to several tutorials on the web, some wikis, and my own experience, I should change the x, y, and z values of the matrix's fourth column (index 3) to translate stuff. My problem right now is that my object won't move back and forth (z axis) when I change the position[3].z value. But when I change the position[3].w value it does! It is like the w value serves as the z value, while changing the z value has no effect at all.
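The matrix construction itself can be sanity-checked without OpenGL. The plain-Java sketch below (class and method names are made up for this example) multiplies a column-major 4x4 translation matrix by a vertex, the same layout GLSL's mat4 uses, where position[3] is the fourth column:

```java
// Plain-Java sanity check (no OpenGL): GLSL's mat4 is column-major,
// so position[3] is the fourth COLUMN, which holds the translation.
public class TranslationCheck {
    // Multiply a column-major 4x4 matrix by a 4-component column vector.
    static float[] mul(float[] m, float[] v) {
        float[] r = new float[4];
        for (int row = 0; row < 4; row++) {
            for (int col = 0; col < 4; col++) {
                r[row] += m[col * 4 + row] * v[col];
            }
        }
        return r;
    }

    public static void main(String[] args) {
        // Identity matrix in column-major order, then write the
        // translation into the fourth column, like position[3].xyz in GLSL.
        float[] m = new float[16];
        m[0] = m[5] = m[10] = m[15] = 1.0f;
        m[12] = 2.0f; // pos.x
        m[13] = 3.0f; // pos.y
        m[14] = 4.0f; // pos.z

        float[] v = mul(m, new float[]{1, 1, 1, 1});
        // v is (3, 4, 5, 1): the vertex moved by (2, 3, 4), so the
        // matrix math is fine; the issue lies elsewhere (the projection).
        System.out.println(v[0] + " " + v[1] + " " + v[2] + " " + v[3]);
    }
}
```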

I really do not understand that. Did I overlook something entirely?

Thanks in advance!
-TBJ

Kai

Your code is just fine, even though you do not need a full-blown 4x4 matrix just to offset a position. :)
Just do:
vec4 newPos = gl_Vertex + vec4(pos, 0.0);


However, it should work with your approach. Are you perhaps using an orthographic projection via glOrtho or something similar, where there is no perspective and thus no visible "farther" or "nearer" in Z?
You should not alter the 'w' component yourself. In the modelviewprojection matrix this is used for perspective projection vs. orthographic projection. In your "position" matrix this should remain 1.0 (as initialized with your mat4(1.0) constructor).
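To make that concrete: after the vertex shader runs, OpenGL divides gl_Position.xyz by gl_Position.w (the perspective divide). A tiny plain-Java sketch of that fixed step (class and method names are just for illustration) shows why touching w looked like moving in depth:

```java
// Sketch of the perspective divide that OpenGL applies after the vertex
// shader: clip coordinates (x, y, z, w) become NDC (x/w, y/w, z/w).
public class PerspectiveDivide {
    static float[] toNdc(float x, float y, float z, float w) {
        return new float[]{x / w, y / w, z / w};
    }

    public static void main(String[] args) {
        // Without a perspective projection, changing z leaves the
        // on-screen x and y untouched...
        float[] near = toNdc(0.5f, 0.5f, 0.0f, 1.0f);
        float[] far  = toNdc(0.5f, 0.5f, 0.9f, 1.0f); // same x/y on screen

        // ...but changing w rescales x and y. That is why tweaking w
        // "moved" the triangle: it mimics what a perspective projection
        // matrix would normally write into w for you.
        float[] scaled = toNdc(0.5f, 0.5f, 0.0f, 2.0f); // x and y shrink
        System.out.printf("%f %f%n", far[0], scaled[0]);
    }
}
```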

TheBoneJarmer

Thanks for your reply, Kai!

No, I am actually not using glOrtho or anything. I was following a tutorial about projection matrices and how shaders can be used to mimic a camera. That is why I use matrices instead of plain vectors. My rendering code contains only what I need for a triangle to pop up, nothing more. I have not used glMatrixMode, glLoadMatrix or any of that sort of stuff. Only a VBO for my triangle and glUseProgram for my shader. Since those work fine, I do not believe the problem is related to them.

I did try your approach just in case, but no results either. Could this be a bug in LWJGL itself? I downloaded the library somewhere in the last week of August. I do not know if it was a known bug back then that got solved by now, or if this is somehow related to something else. Thanks for the side note about the w value though! I was not planning to touch it anyway, but for some odd reason it does make a difference.

Kai

If you don't use anything else, such as glOrtho or any other matrix methods, then you are in fact using orthographic projection. This is the default. With -1..+1 in Z.
So, adding something to 'z' will not make it look smaller.

TheBoneJarmer

But I thought glOrtho was marked as obsolete and projections had to be done with shaders? Otherwise I would just be using glLoadMatrix.

Kai

You are using gl_ModelViewProjectionMatrix in your shader, so you are using the (deprecated) OpenGL matrix stack, and with that, I was assuming that you must use some sort of (deprecated) matrix manipulation methods. Be that glLoadMatrix, glLoadIdentity, glMultMatrix, glOrtho, gluOrtho2D, gluPerspective, ..., all of which are deprecated if you want the strict definition of "deprecated", meaning: not in OpenGL 3.2 core anymore.

If you want to do pure OpenGL 3.2 Core without any deprecated/obsolete stuff in it, then you have to use your own uniforms in your shader and upload them via glUniformMatrix4fv.
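Sketching that path in plain Java (the GL calls are left as comments so the snippet compiles and runs without a context; names like UniformUpload are just for illustration): you build the matrix yourself in column-major order, which is the layout glUniformMatrix4fv expects when its transpose argument is false, and upload it to a mat4 uniform.

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;

// Build a column-major translation matrix in a direct FloatBuffer,
// ready to hand to glUniformMatrix4fv(location, false, buffer).
public class UniformUpload {
    static FloatBuffer translationMatrix(float tx, float ty, float tz) {
        FloatBuffer fb = ByteBuffer.allocateDirect(16 * 4)
                .order(ByteOrder.nativeOrder()).asFloatBuffer();
        fb.put(new float[]{
            1,  0,  0,  0,  // column 0
            0,  1,  0,  0,  // column 1
            0,  0,  1,  0,  // column 2
            tx, ty, tz, 1   // column 3: the translation
        });
        fb.flip();
        return fb;
    }

    public static void main(String[] args) {
        FloatBuffer mat = translationMatrix(1, 2, 3);
        // With a live context and "uniform mat4 position;" in the shader:
        // int loc = glGetUniformLocation(program, "position");
        // glUniformMatrix4fv(loc, false, mat);
        System.out.println(mat.get(12)); // tx sits at index 12
    }
}
```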

TheBoneJarmer

The reason I did it is because the tutorial does it. I didn't really know it was related to that. Thanks for pointing it out though! But still: how do I make the Z axis work? How should I handle projections with shaders? Should I pass matrices to my shaders like you suggested?

quew8

You should probably be using a different tutorial if you want to be using 3.2 context and this tutorial is not. If you are looking for another one, I recommend Lighthouse3D http://www.lighthouse3d.com/tutorials/glsl-tutorial/.

But here is a tip, if you use a context with the core profile, then anything that is not core will not work and you can be sure that you're doing the right thing. Stick this line into your window creation hints.
glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);

TheBoneJarmer

I can't. When I tried to use OpenGL 3.2 / GLSL 1.50 I got this message: "Fragment shader: 0:1(10): error: GLSL 1.50 is not supported. Supported versions are: 1.10, 1.20, 1.30, 1.00 ES, and 3.00 ES". I do not think my graphics card supports this version, and I bought my laptop only about 5 months ago. I was not aiming at a specific version anyway. I am very new to shaders; I just googled for shader tutorials and picked what seemed right. I do not know what major differences there are between GLSL versions or why I should use one version over another.

The tutorials you gave me are for once straightforward compared to what I've seen on the web. Thanks for the GLFW tip, by the way! But I am starting to get a bit confused right now. Parts are making sense, but I think I am missing the basics here. My laptop runs (if that is the right word to describe it) OpenGL version 3.0. So, according to Wikipedia, my supported GLSL version would be 1.30, right?

If that's the case, wouldn't I be unable to use the GLSL from those Lighthouse3D tutorials? The tutorials I use at the moment are from http://www.opengl-tutorial.org/, a site I hear a lot about on Stack Overflow. They are written in C++, their results are nowhere near mine, and they also use GLM. I do not know if there is a Java port of it, but damn, that would be useful. My own math functions are ported from OpenTK, and I had to tweak those methods a lot to get the desired results.

quew8

That's odd. What graphics card does your laptop have? The important thing is really the OpenGL version - you should just use whichever GLSL version fits your OpenGL version. But yes, GLSL 1.50 should be supported with OpenGL 3.2.

You can check what GL and GLSL version you have with
System.out.println("Version: " + glGetString(GL_VERSION));
System.out.println("GLSL Version: " + glGetString(GL_SHADING_LANGUAGE_VERSION));


Well the tutorials you're using at the moment (and I've also heard that they're quite good) claim to be for OpenGL 3.3 as well. I would recommend using 3.3 if your laptop can manage it btw.

As for maths libraries, try JOML. It's quite a recent library written essentially to be used with LWJGL. It's fast, efficient, should easily support anything you need and I think it's pretty much recommended by the LWJGL developers. (Although I've never used it myself).

Kai

As for OpenGL version, try setting up GLFW to use an explicit OpenGL 3.2 core context:
glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
glfwWindowHint(GLFW_OPENGL_FORWARD_COMPAT, GL_TRUE);
glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 2);

If you do not do this, it is system-dependent what OpenGL version you'll get.
On Windows with Nvidia at least, you will get the very latest version with compatibility profile.
On Mac OS X you will get some version before 3.2.

Quote from: quew8 on September 18, 2015, 10:27:02
As for maths libraries, try JOML. It's quite a recent library written essentially to be used with LWJGL. It's fast, efficient, should easily support anything you need and I think it's pretty much recommended by the LWJGL developers. (Although I've never used it myself).
Yes, JOML is awesome! :-D
Have a look at it here: https://github.com/JOML-CI/JOML
You can also use it with Maven and Gradle, see: https://github.com/JOML-CI/JOML/wiki

TheBoneJarmer

Blast. No wonder it wasn't working the way I expected. My graphics card is "Intel Corporation Haswell-ULT Integrated Graphics Controller". According to the command glxinfo I have OpenGL 3.0. That being said, I'll take a look at JOML. Thanks for the tip! At least I know a bit more.

However, this has become a bit off-topic, for it does not really answer my original question. I now know that the default projection is orthographic, so the Z value is not going to work. However, according to this tutorial from http://www.opengl-tutorial.org I should be able to do so, because that tutorial does not use glOrtho, glLoadMatrix or anything like that either. Or am I overlooking something entirely that I should have been aware of the moment I started those tutorials? I downloaded the source but it made no sense to me at all.

I appreciate your help guys!

Kai

Please note that the tutorial you are using builds the matrices using the GLM library and uploads them via glUniformMatrix4fv.
Therefore, it does not need to use the OpenGL matrix stack functions.
You can do the same with LWJGL and JOML, too.
Have a look at the demos in the joml-lwjgl3-demos repository: https://github.com/JOML-CI/joml-lwjgl3-demos/tree/master/src/org/joml/lwjgl
A very simple demo is this: https://github.com/JOML-CI/joml-lwjgl3-demos/blob/master/src/org/joml/lwjgl/ShaderExample.java

TheBoneJarmer

It works!! Thanks guys! That example helped me out a lot! :D