LWJGL Forum

Programming => OpenGL => Topic started by: CodeBunny on September 19, 2011, 03:03:59

Title: GLSL Shaders
Post by: CodeBunny on September 19, 2011, 03:03:59
So, I'm trying to learn how to use shaders, but the tutorial on the wiki (http://lwjgl.org/wiki/index.php?title=GLSL_Shaders_with_LWJGL) is not working for me.

First off, the code, when copy-pasted, was wrong. :-\ Pretty badly so. Aside from a typo that actually meant the code had compile errors, the main class didn't even access the class that ran the shaders. Did whoever wrote this tutorial even test-run the code? It really should get fixed.

Secondly, when I fixed those problems (had the main class create an instance of the Box class and use it, and fixed the typo), the shaders I created by copy and paste still didn't work. Neither the vertex shader nor the fragment shader got past printLogInfo(); the returned log length was 0 in both cases. I also checked that the files were being read correctly; they were.

Since I really don't have a very good idea of how to work with shaders, the fact that this tutorial is broken is very detrimental to my ability to figure out what's going on.

Here's the code I'm using (it has been very slightly modified from the code on the wiki):

Main.java:
Code: [Select]
import org.lwjgl.opengl.GL11;
import org.lwjgl.opengl.Display;
import org.lwjgl.opengl.DisplayMode;
import org.lwjgl.util.glu.GLU;

/**
 * Sets up the Display, the GL context, and runs the main game loop.
 *
 * @author Stephen Jones
 */
public class Main {

    private boolean done = false; // game runs until done is set to true

    private Box box;

    public Main() {
        init();
        while (!done) {
            if (Display.isCloseRequested())
                done = true;
            render();
            Display.update();
        }

        Display.destroy();
    }

    private void render() {
        GL11.glClear(GL11.GL_COLOR_BUFFER_BIT | GL11.GL_DEPTH_BUFFER_BIT);
        GL11.glLoadIdentity();
        box.draw();
    }

    private void init() {
        int w = 1024;
        int h = 768;

        try {
            Display.setDisplayMode(new DisplayMode(w, h));
            Display.setVSyncEnabled(true);
            Display.setTitle("Shader Setup");
            Display.create();
        } catch (Exception e) {
            System.out.println("Error setting up display");
            System.exit(0);
        }

        GL11.glViewport(0, 0, w, h);
        GL11.glMatrixMode(GL11.GL_PROJECTION);
        GL11.glLoadIdentity();
        GLU.gluPerspective(45.0f, ((float) w / (float) h), 0.1f, 100.0f);
        GL11.glMatrixMode(GL11.GL_MODELVIEW);
        GL11.glLoadIdentity();
        GL11.glShadeModel(GL11.GL_SMOOTH);
        GL11.glClearColor(0.0f, 0.0f, 0.0f, 0.0f);
        GL11.glClearDepth(1.0f);
        GL11.glEnable(GL11.GL_DEPTH_TEST);
        GL11.glDepthFunc(GL11.GL_LEQUAL);
        GL11.glHint(GL11.GL_PERSPECTIVE_CORRECTION_HINT, GL11.GL_NICEST);
        box = new Box();
    }

    public static void main(String[] args) {
        EngineDirectory.init();
        LWJGLNatives.load();
        new Main();
    }
}

Box.java
Code: [Select]
import org.lwjgl.opengl.GL11;
import java.io.BufferedReader;
import java.io.File;
import java.io.FileReader;
import java.nio.ByteBuffer;
import java.nio.IntBuffer;
import org.lwjgl.BufferUtils;
import org.lwjgl.opengl.ARBShaderObjects;
import org.lwjgl.opengl.ARBVertexShader;
import org.lwjgl.opengl.ARBFragmentShader;

/**
 * The vertex and fragment shaders are set up when the box object is constructed.
 * They are applied to the GL state prior to the box being drawn, and released
 * from that state after drawing.
 *
 * @author Stephen Jones
 */
public class Box {

    /*
     * if the shaders are set up OK we can use shaders, otherwise we just use
     * default settings
     */
    private boolean useShader = true;

    /*
     * program shader, to which the vertex and fragment shaders are attached.
     * They are set to 0 as a check because GL will assign unique int values to each
     */
    private int shader = 0;
    private int vertShader = 0;
    private int fragShader = 0;

    public Box() {

        /*
         * create the shader program. If OK, create the vertex and fragment shaders
         */
        shader = ARBShaderObjects.glCreateProgramObjectARB();

        if (shader != 0) {
            vertShader = createVertShader("shaders/screen.vert");
            fragShader = createFragShader("shaders/screen.frag");
        } else
            useShader = false;

        /*
         * if the vertex and fragment shaders were set up successfully, attach them to
         * the shader program, link the shader program (into the GL context I
         * suppose), and validate
         */
        if (vertShader != 0 && fragShader != 0) {
            ARBShaderObjects.glAttachObjectARB(shader, vertShader);
            ARBShaderObjects.glAttachObjectARB(shader, fragShader);
            ARBShaderObjects.glLinkProgramARB(shader);
            ARBShaderObjects.glValidateProgramARB(shader);
            useShader = printLogInfo(shader);
        } else
            useShader = false;
    }

    /*
     * If the shader was set up successfully, we use the shader. Otherwise we run
     * normal drawing code.
     */
    public void draw() {
        if (useShader) {
            ARBShaderObjects.glUseProgramObjectARB(shader);
        }
        GL11.glLoadIdentity();
        GL11.glTranslatef(0.0f, 0.0f, -10.0f);
        GL11.glColor3f(1.0f, 1.0f, 1.0f); // white

        GL11.glBegin(GL11.GL_QUADS);
        GL11.glVertex3f(-1.0f, 1.0f, 0.0f);
        GL11.glVertex3f(1.0f, 1.0f, 0.0f);
        GL11.glVertex3f(1.0f, -1.0f, 0.0f);
        GL11.glVertex3f(-1.0f, -1.0f, 0.0f);
        GL11.glEnd();

        // release the shader
        ARBShaderObjects.glUseProgramObjectARB(0);
    }

    /*
     * With the exception of syntax, setting up vertex and fragment shaders is
     * the same.
     *
     * @param filename the name and path of the vertex shader file
     */
    private int createVertShader(String filename) {
        // vertShader will be non-zero if successfully created

        vertShader = ARBShaderObjects
                .glCreateShaderObjectARB(ARBVertexShader.GL_VERTEX_SHADER_ARB);
        // if created, read the vertex shader code into a String
        if (vertShader == 0) {
            return 0;
        }
        String vertexCode = "";
        String line;
        try {
            BufferedReader reader = new BufferedReader(new FileReader(new File(
                    Box.class.getClassLoader().getResource(filename).toURI())));
            while ((line = reader.readLine()) != null) {
                vertexCode += line + "\n";
            }
        } catch (Exception e) {
            System.out.println("Failed reading vertex shading code");
            return 0;
        }
        /*
         * associate the vertex code String with the created vertex shader and
         * compile
         */
        ARBShaderObjects.glShaderSourceARB(vertShader, vertexCode);
        ARBShaderObjects.glCompileShaderARB(vertShader);
        // if there was a problem compiling, reset vertShader to zero
        if (!printLogInfo(vertShader)) {
            vertShader = 0;
        }
        // if zero we won't be using the shader
        return vertShader;
    }

    // same as the vertex shader except for method syntax
    private int createFragShader(String filename) {

        fragShader = ARBShaderObjects
                .glCreateShaderObjectARB(ARBFragmentShader.GL_FRAGMENT_SHADER_ARB);
        if (fragShader == 0) {
            return 0;
        }
        String fragCode = "";
        String line;
        try {
            BufferedReader reader = new BufferedReader(new FileReader(new File(
                    Box.class.getClassLoader().getResource(filename).toURI())));
            while ((line = reader.readLine()) != null) {
                fragCode += line + "\n";
            }
        } catch (Exception e) {
            System.out.println("Failed reading fragment shading code");
            return 0;
        }
        ARBShaderObjects.glShaderSourceARB(fragShader, fragCode);
        ARBShaderObjects.glCompileShaderARB(fragShader);
        if (!printLogInfo(fragShader)) {
            fragShader = 0;
        }

        return fragShader;
    }

    /*
     * oddly enough, checking the success when setting up the shaders is verbose
     * upon success. If the queried log length is greater than 1, the setup
     * being examined (obj) is treated as successful, the information gets printed to
     * System.out, and true is returned.
     */
    private static boolean printLogInfo(int obj) {
        IntBuffer iVal = BufferUtils.createIntBuffer(1);
        ARBShaderObjects.glGetObjectParameterARB(obj,
                ARBShaderObjects.GL_OBJECT_INFO_LOG_LENGTH_ARB, iVal);

        int length = iVal.get();
        if (length > 1) {
            // We have some info we need to output.
            ByteBuffer infoLog = BufferUtils.createByteBuffer(length);
            iVal.flip();
            ARBShaderObjects.glGetInfoLogARB(obj, iVal, infoLog);
            byte[] infoBytes = new byte[length];
            infoLog.get(infoBytes);
            String out = new String(infoBytes);
            System.out.println("Info log:\n" + out);
            return true;
        }
        System.err.println("Error accessing shader log!");
        return false;
    }

}

screen.vert
Code: [Select]
varying vec4 vertColor;

void main(){
    gl_Position = gl_ModelViewProjectionMatrix*gl_Vertex;
    vertColor = vec4(0.6, 0.3, 0.4, 1.0);
}

screen.frag
Code: [Select]
varying vec4 vertColor;

void main(){
    gl_FragColor = vertColor;
}

So, why aren't the shaders working? The syntax seems correct when I look at other examples, and I seem to be sending it to the graphics card appropriately.

And yes, my graphics card definitely supports shaders.

PS: Sorry if my tone is a little irritated; I've been banging my head against the wall on this one for a while.
Title: Re: GLSL Shaders
Post by: spasi on September 19, 2011, 08:46:54
It works for me. You may be doing something in here:

Code: [Select]
EngineDirectory.init();
LWJGLNatives.load();

that changes the GL state in a way that breaks the tutorial. Try to comment out those two lines.

A couple recommendations:

- You're not going to learn shaders by reading tutorials. Start with the GLSL spec and the corresponding GL API, understand the basics first, then start experimenting and checking out tutorials.
- Try the org.lwjgl.test.opengl.shaders.ShadersTest sample in the LWJGL test package. It's old code (based on ARB_shader_objects etc), but it has proper error checking and should be easier for you to experiment with.
Title: Re: GLSL Shaders
Post by: CodeBunny on September 19, 2011, 11:05:01
All those lines of code do is ensure that the LWJGL native files are placed in a known directory, and then set the "org.lwjgl.librarypath" property to point to that directory. That way, I can manage the files dynamically at runtime and have significantly more flexibility.

I commented them out and set the property manually, and no change occurred.
Title: Re: GLSL Shaders
Post by: Chuck on September 30, 2011, 19:36:32
While I agree the existing tutorials aren't adequate, it helps to start from a minimal working example.  I fixed the compile problem on the wiki, plus a logic issue that was causing it to fail.  I'm still not a fan of the way the code is organized, but it at least works now.

I have a single-file version of that tutorial here, as well as a conversion that uses GL20 instead of the ARB extensions.

https://bitbucket.org/chuck/lwjgl-sandbox/src/tip/src/main/java/tutorials/wiki

Title: Re: GLSL Shaders
Post by: CodeBunny on October 05, 2011, 11:36:15
AWESOME. Shaders work for me now. Thank you so very much; that was really bothering me.

Now, I still have to learn how to program with them. But it's surmountable now! :D
Title: Re: GLSL Shaders
Post by: CodeBunny on October 05, 2011, 12:47:19
So, I'm working on a shader that will function very much like the built-in blending modes, but needs some more complex functionality.

Right now I have the following vertex shader:
Code: [Select]
void main()
{
    gl_FrontColor = gl_Color;
    gl_TexCoord[0] = gl_MultiTexCoord0;
    gl_Position = ftransform();
}

and fragment shader:
Code: [Select]
uniform sampler2D tex;

void main()
{
    vec4 srcColor = texture2D(tex, gl_TexCoord[0].st) * gl_Color;
    vec4 dstColor = gl_FragColor;
    gl_FragColor = dstColor + srcColor;
}

However, in the fragment shader I want to be able to know what the destination color (i.e., the color I'm overwriting) is. How do I do that?
Title: Re: GLSL Shaders
Post by: spasi on October 05, 2011, 16:10:38
The short answer is: you can't. Read this (https://fgiesen.wordpress.com/2011/07/12/a-trip-through-the-graphics-pipeline-2011-part-9/) for the long answer. The closest you can have to programmable blending on current hardware is NV_texture_barrier (http://www.opengl.org/registry/specs/NV/texture_barrier.txt).
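
If you do go down that road, usage is roughly the following (a sketch, assuming your LWJGL build exposes the NVTextureBarrier class; the idea is that when a texture is bound both as the render target and for sampling, you issue a barrier between writing texels and reading them back):

Code: [Select]
import org.lwjgl.opengl.GLContext;
import org.lwjgl.opengl.NVTextureBarrier;

// ...
if (GLContext.getCapabilities().GL_NV_texture_barrier) {
    // draw a pass that writes to the currently bound texture, then...
    NVTextureBarrier.glTextureBarrierNV();
    // ...subsequent draws can safely sample the texels just written
}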
Title: Re: GLSL Shaders
Post by: CodeBunny on October 05, 2011, 18:08:13
There's no way to know what color I'll be overwriting? I just want to know the color of the pixel I'll be rendering over.
Title: Re: GLSL Shaders
Post by: Chuck on October 05, 2011, 18:46:55
Blending is still a big blind spot in the programmable pipeline, and something you have to rely on the built-in blend modes to handle for the moment.  A uniform buffer might be a way to go about it now, though I can't speak from any actual experience there.
Title: Re: GLSL Shaders
Post by: CodeBunny on October 05, 2011, 19:35:05
Uniform buffer?
Title: Re: GLSL Shaders
Post by: Chuck on October 05, 2011, 22:51:16
Uniform buffer?

Big arrays of structs you can use as uniforms.  http://www.opengl.org/wiki/Uniform_Buffer_Object
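
On the GLSL side it looks roughly like this (just a sketch; needs GL 3.1 or ARB_uniform_buffer_object, and the block and array names here are made up):

Code: [Select]
#version 140

// a hypothetical std140 block of per-light data
layout(std140) uniform LightBlock {
    vec4 positions[64];
    vec4 colors[64];
};

On the Java side you'd create and fill a buffer object, then wire it up with something like GL31.glGetUniformBlockIndex(program, "LightBlock"), GL31.glUniformBlockBinding(program, blockIndex, 0) and GL30.glBindBufferBase(GL31.GL_UNIFORM_BUFFER, 0, ubo).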
Title: Re: GLSL Shaders
Post by: spasi on October 06, 2011, 08:46:23
There's no way to know what color I'll be overwriting? I just want to know the color of the pixel I'll be rendering over.

As I said, you can't know that color. If you had access to that information, the synchronization implications would totally kill GPU performance. It's all explained in the blog post I linked above.
Title: Re: GLSL Shaders
Post by: CodeBunny on October 06, 2011, 23:34:52
*sigh* Alrighty.

Hmm... the shader effect I need is suddenly going to take a lot more thought...
Title: Re: GLSL Shaders
Post by: Kova on October 10, 2011, 16:07:17
I created an FBO and rendered the whole scene to a texture in the FBO... then used that texture in the shader. Works.
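
In case it's useful, the setup is roughly the following with EXT_framebuffer_object under LWJGL 2 (a sketch with error checking omitted; the width/height values are placeholders):

Code: [Select]
int width = 1024, height = 768; // placeholder dimensions

// color texture the FBO will render into
int tex = GL11.glGenTextures();
GL11.glBindTexture(GL11.GL_TEXTURE_2D, tex);
GL11.glTexParameteri(GL11.GL_TEXTURE_2D, GL11.GL_TEXTURE_MIN_FILTER, GL11.GL_LINEAR);
GL11.glTexParameteri(GL11.GL_TEXTURE_2D, GL11.GL_TEXTURE_MAG_FILTER, GL11.GL_LINEAR);
GL11.glTexImage2D(GL11.GL_TEXTURE_2D, 0, GL11.GL_RGBA8, width, height, 0,
        GL11.GL_RGBA, GL11.GL_UNSIGNED_BYTE, (java.nio.ByteBuffer) null);

// framebuffer object with that texture as its color attachment
int fbo = EXTFramebufferObject.glGenFramebuffersEXT();
EXTFramebufferObject.glBindFramebufferEXT(EXTFramebufferObject.GL_FRAMEBUFFER_EXT, fbo);
EXTFramebufferObject.glFramebufferTexture2DEXT(EXTFramebufferObject.GL_FRAMEBUFFER_EXT,
        EXTFramebufferObject.GL_COLOR_ATTACHMENT0_EXT, GL11.GL_TEXTURE_2D, tex, 0);

// ... render the scene here, then unbind and sample 'tex' from the shader
EXTFramebufferObject.glBindFramebufferEXT(EXTFramebufferObject.GL_FRAMEBUFFER_EXT, 0);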
Title: Re: GLSL Shaders
Post by: CodeBunny on October 10, 2011, 20:06:57
I created an FBO and rendered the whole scene to a texture in the FBO... then used that texture in the shader. Works.

I am already doing this to achieve a different effect. The problem is that I need this shader to be active while I am rendering to the FBO.

My overall goal here is to create a 2D lighting/motion blur system. Basically, I have four framebuffers which handle various aspects of the scene being rendered. These are:

All of this is working, and it's wonderfully fast (I've had an admittedly static scene with 1000 lights render in under a millisecond - since I manage lighting this way, rendering a light is exactly as fast as rendering a 2D sprite). There are also a lot of other things I can do (rendering shadows is just as easy, I just don't use additive rendering, etc.).

However, the standard glBlendFunc options are not working optimally when I am rendering the foreground.

My basic blending function for the foreground is glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA). For the color channels, this is exactly what I want. However, it causes issues when I render pixels with an alpha that is neither zero nor one.

Consider a simple case where I render a ghost in front of a brick wall. The ghost has an alpha of 0.5f. The wall has an alpha of 1.0. So, if I plug these alpha values into my blending function, the output is:
Code: [Select]
SRC_ALPHA * SRC_ALPHA + DST_ALPHA * ONE_MINUS_SRC_ALPHA = (0.5f * 0.5f) + (1.0 * 0.5f) = 0.75f

See? It makes perfect sense, but it's an unsatisfactory result. The user would expect the image to still have full alpha where the two overlap - instead, he'll be able to see some of the background when I render the final texture over it. He won't just see through the ghost, he'll also see through the wall.

I wanted to use a shader to basically program an appropriate blending function. The algorithm I think will work is something along the lines of:
Code: [Select]
FINAL_ALPHA = DST_ALPHA + ((1.0f - DST_ALPHA) * SRC_ALPHA);

So, for the ghost/wall example, I would have:
Code: [Select]
1.0 + ((1.0f - 1.0) * 0.5f) = 1.0f

That way, the alpha level can't actually be lowered, and it should follow common-sense patterns when transparency is drawn over transparency (two 0.5f alpha pixels rendered on top of each other would result in a pixel of 0.75f alpha, etc.).

SO, IN CONCLUSION: Anybody have any ideas how I could make this work? Currently I'm stumped for options on this final issue.
Title: Re: GLSL Shaders
Post by: spasi on October 11, 2011, 09:44:05
It sounds like premultiplied alpha will solve your problem. Read this post (http://tinyurl.com/586nb8) for details. Keep in mind you don't need to modify your textures, you can do the alpha pre-multiplication in the shader.
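
For example, something along these lines in the fragment shader (a sketch based on the shader you posted), paired with glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA):

Code: [Select]
uniform sampler2D tex;

void main()
{
    vec4 src = texture2D(tex, gl_TexCoord[0].st) * gl_Color;
    // premultiply the color channels by alpha before output
    gl_FragColor = vec4(src.rgb * src.a, src.a);
}

With that blend func the destination alpha works out to srcA + dstA * (1 - srcA), which is exactly the formula you were after.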
Title: Re: GLSL Shaders
Post by: CodeBunny on October 11, 2011, 11:14:56
...That might actually work. Let me test it.
Title: Re: GLSL Shaders
Post by: CodeBunny on October 11, 2011, 11:35:17
It does! Great. :D

EDIT: Actually, the problem persists - premultiplied alpha simply reduces the result significantly. Let me think about this some more.

EDIT#2: Actually, no, this is weird - my blending function won't take effect. Let me check my blending code.
Title: Re: GLSL Shaders
Post by: CodeBunny on October 11, 2011, 12:03:55
Nevermind, it works completely. I was making a dumb mistake.

Thank you very much, spasi!
Title: Re: GLSL Shaders
Post by: Suds on October 17, 2011, 06:25:33
While I agree the existing tutorials aren't adequate, it helps to start from a minimal working example.  I fixed the compile problem on the wiki, plus a logic issue that was causing it to fail.  I'm still not a fan of the way the code is organized, but it at least works now.

I have a single-file versions of that tutorial here, as well as a conversion that uses GL20 instead of ARB extensions.

https://bitbucket.org/chuck/lwjgl-sandbox/src/tip/src/main/java/tutorials/wiki



I was just working through this tutorial, and I was unable to get it to work. So I went over it, line by line. And I found that the printLogInfo method returns true if there was an error, and false if there was not.

Then, peppered throughout the tutorial, there's "if (!printLogInfo(obj)) useShader = false;".

In draw, there's "if (useShader) // use the shader". This translates to English roughly as:

"If there was not not an error, dont use shaders." or, "Only use shaders if there as an error." (I think. double/triple negatives get confusing after a while)

The code worked when I reversed the return values of printLogInfo().
Title: Re: GLSL Shaders
Post by: Chuck on October 17, 2011, 17:11:18
Code: [Select]
       if (length > 1) {
            // We have some info we need to output.
            ByteBuffer infoLog = BufferUtils.createByteBuffer(length);
            iVal.flip();
            ARBShaderObjects.glGetInfoLogARB(obj, iVal, infoLog);
            byte[] infoBytes = new byte[length];
            infoLog.get(infoBytes);
            String out = new String(infoBytes);
            System.out.println("Info log:\n"+out);
        }
        else return true;
        return false;

It in fact does return false if there was an error and true if it succeeded.  The code style of this method in the wiki tutorial would result in heavy objects being thrown at me if my peers went over it in a code review.  I really need to fix it, and I suppose update the wiki page too.  I didn't write the tutorial, so the fixes I actually want to apply probably wouldn't respect the author's intent.  Cleaning up the logic here would probably be okay, though.
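
Something along these lines would be clearer, I think - query the compile/link status directly instead of inferring success from the log length (a rough, untested sketch):

Code: [Select]
private static boolean checkStatusAndLog(int obj, int statusPname) {
    // statusPname: GL_OBJECT_COMPILE_STATUS_ARB for shader objects,
    //              GL_OBJECT_LINK_STATUS_ARB for the program object
    int status = ARBShaderObjects.glGetObjectParameteriARB(obj, statusPname);

    int logLength = ARBShaderObjects.glGetObjectParameteriARB(obj,
            ARBShaderObjects.GL_OBJECT_INFO_LOG_LENGTH_ARB);
    if (logLength > 1) {
        // the log can contain warnings even on success, so print it whenever it's non-empty
        System.out.println("Info log:\n"
                + ARBShaderObjects.glGetInfoLogARB(obj, logLength));
    }

    return status == GL11.GL_TRUE;
}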

This may be a more readable demo for shaders, but it does abstract a lot out: https://bitbucket.org/chuck/lwjgl-sandbox/src/tip/src/main/java/sandbox/misc/HelloShader.java