
Weird behavior with shaders.

« on: October 18, 2012, 09:41:53 »
Hi,

Sorry for the bad English.
I'm learning how to use LWJGL; I've been through many tutorials from the wiki and have been applying what I learn in personal projects.
Recently I went through the "OpenGL 3.2 and newer" section of the wiki, thinking I was ready for the non-deprecated way of doing things.

Everything went fine until shaders came in, to render a colored quad.

This is what I should get:

[expected result image]

This is what I get instead:

[actual result image]

I tried playing around with the data; it's as if position and color are all mixed up...

When I comment out the setupShaders() call, I get a correctly positioned quad:

[image of the quad rendered without shaders]

I think my problem is related to the shaders, but I don't know how to track it down. Any help would be appreciated.

Here are my vertex.glsl and fragment.glsl (copied from the tutorial):
Code: [Select]
vertex.glsl
#version 150 core

in vec4 in_Position;
in vec4 in_Color;

out vec4 pass_Color;

void main(void) {
    gl_Position = in_Position;
    pass_Color = in_Color;
}

fragment.glsl
#version 150 core

in vec4 pass_Color;

out vec4 out_Color;

void main(void) {
    out_Color = pass_Color;
}

Here is the code I've been running for my tests (copied from the tutorial; I just added something to query info from OpenGL):

Code: [Select]
package GLSL;

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;
import java.nio.IntBuffer;

import org.lwjgl.BufferUtils;
import org.lwjgl.LWJGLException;
import org.lwjgl.opengl.ContextAttribs;
import org.lwjgl.opengl.ContextCapabilities;
import org.lwjgl.opengl.Display;
import org.lwjgl.opengl.DisplayMode;
import org.lwjgl.opengl.GL11;
import org.lwjgl.opengl.GL15;
import org.lwjgl.opengl.GL20;
import org.lwjgl.opengl.GL30;
import org.lwjgl.opengl.GL32;
import org.lwjgl.opengl.GLContext;
import org.lwjgl.opengl.PixelFormat;
import org.lwjgl.util.glu.GLU;

public class test {
    // Entry point for the application
    public static void main(String[] args) {
        new test();
    }

    // Setup variables
    private final String WINDOW_TITLE = "The Quad: colored";
    private final int WIDTH = 320;
    private final int HEIGHT = 240;
    // Quad variables
    private int vaoId = 0;
    private int vboId = 0;
    private int vbocId = 0;
    private int vboiId = 0;
    private int indicesCount = 0;
    // Shader variables
    private int vsId = 0;
    private int fsId = 0;
    private int pId = 0;

    public test() {
        // Initialize OpenGL (Display)
        this.setupOpenGL();

        this.setupQuad();
        this.setupShaders();

        while (!Display.isCloseRequested()) {
            // Do a single loop (logic/render)
            this.loopCycle();

            // Force a maximum FPS of about 60
            Display.sync(60);
            // Let the CPU synchronize with the GPU if the GPU is lagging behind
            Display.update();
        }

        // Destroy OpenGL (Display)
        this.destroyOpenGL();
    }

    public void setupOpenGL() {
        // Setup an OpenGL context with API version 3.2
        try {
            PixelFormat pixelFormat = new PixelFormat();
            ContextAttribs contextAttributes = new ContextAttribs(3, 2);
            contextAttributes.withForwardCompatible(true);
            contextAttributes.withProfileCore(true);

            Display.setDisplayMode(new DisplayMode(WIDTH, HEIGHT));
            Display.setTitle(WINDOW_TITLE);
            Display.create(pixelFormat, contextAttributes);

            GL11.glViewport(0, 0, WIDTH, HEIGHT);
        } catch (LWJGLException e) {
            e.printStackTrace();
            System.exit(-1);
        }
        glInfo();

        // Setup an XNA-like background color
        GL11.glClearColor(0.4f, 0.6f, 0.9f, 0f);

        // Map the internal OpenGL coordinate system to the entire screen
        GL11.glViewport(0, 0, WIDTH, HEIGHT);
    }

    public void setupQuad() {
        // Vertices, the order is not important. XYZW instead of XYZ
        float[] vertices = {
            -0.5f,  0.5f, 0f, 1f,
            -0.5f, -0.5f, 0f, 1f,
             0.5f, -0.5f, 0f, 1f,
             0.5f,  0.5f, 0f, 1f
        };
        FloatBuffer verticesBuffer = BufferUtils.createFloatBuffer(vertices.length);
        verticesBuffer.put(vertices);
        verticesBuffer.flip();

        float[] colors = {
            1f, 0f, 0f, 1f,
            0f, 1f, 0f, 1f,
            0f, 0f, 1f, 1f,
            1f, 1f, 1f, 1f,
        };
        FloatBuffer colorsBuffer = BufferUtils.createFloatBuffer(colors.length);
        colorsBuffer.put(colors);
        colorsBuffer.flip();

        // OpenGL expects to draw vertices in counter-clockwise order by default
        byte[] indices = {
            0, 1, 2,
            2, 3, 0
        };
        indicesCount = indices.length;
        ByteBuffer indicesBuffer = BufferUtils.createByteBuffer(indicesCount);
        indicesBuffer.put(indices);
        indicesBuffer.flip();

        // Create a new Vertex Array Object in memory and select it (bind)
        vaoId = GL30.glGenVertexArrays();
        GL30.glBindVertexArray(vaoId);

        // Create a new Vertex Buffer Object in memory and select it (bind) - VERTICES
        vboId = GL15.glGenBuffers();
        GL15.glBindBuffer(GL15.GL_ARRAY_BUFFER, vboId);
        GL15.glBufferData(GL15.GL_ARRAY_BUFFER, verticesBuffer, GL15.GL_STATIC_DRAW);
        GL20.glVertexAttribPointer(0, 4, GL11.GL_FLOAT, false, 0, 0);
        GL15.glBindBuffer(GL15.GL_ARRAY_BUFFER, 0);

        // Create a new VBO for the colors and select it (bind) - COLORS
        vbocId = GL15.glGenBuffers();
        GL15.glBindBuffer(GL15.GL_ARRAY_BUFFER, vbocId);
        GL15.glBufferData(GL15.GL_ARRAY_BUFFER, colorsBuffer, GL15.GL_STATIC_DRAW);
        GL20.glVertexAttribPointer(1, 4, GL11.GL_FLOAT, false, 0, 0);
        GL15.glBindBuffer(GL15.GL_ARRAY_BUFFER, 0);

        // Deselect (bind to 0) the VAO
        GL30.glBindVertexArray(0);

        // Create a new VBO for the indices and select it (bind) - INDICES
        vboiId = GL15.glGenBuffers();
        GL15.glBindBuffer(GL15.GL_ELEMENT_ARRAY_BUFFER, vboiId);
        GL15.glBufferData(GL15.GL_ELEMENT_ARRAY_BUFFER, indicesBuffer, GL15.GL_STATIC_DRAW);
        GL15.glBindBuffer(GL15.GL_ELEMENT_ARRAY_BUFFER, 0);
    }

    private void setupShaders() {
        int errorCheckValue = GL11.glGetError();

        // Load the vertex shader
        vsId = this.loadShader("src/GLSL/vertex.glsl", GL20.GL_VERTEX_SHADER);
        // Load the fragment shader
        fsId = this.loadShader("src/GLSL/fragment.glsl", GL20.GL_FRAGMENT_SHADER);

        // Create a new shader program that links both shaders
        pId = GL20.glCreateProgram();
        GL20.glAttachShader(pId, vsId);
        GL20.glAttachShader(pId, fsId);
        GL20.glLinkProgram(pId);

        // Position information will be attribute 0
        GL20.glBindAttribLocation(pId, 0, "in_Position");
        // Color information will be attribute 1
        GL20.glBindAttribLocation(pId, 1, "in_Color");

        GL20.glValidateProgram(pId);

        errorCheckValue = GL11.glGetError();
        if (errorCheckValue != GL11.GL_NO_ERROR) {
            System.out.println("ERROR - Could not create the shaders: " + GLU.gluErrorString(errorCheckValue));
            System.exit(-1);
        }
    }

    public void loopCycle() {
        GL11.glClear(GL11.GL_COLOR_BUFFER_BIT);

        GL20.glUseProgram(pId);

        // Bind to the VAO that has all the information about the vertices
        GL30.glBindVertexArray(vaoId);
        GL20.glEnableVertexAttribArray(0);
        GL20.glEnableVertexAttribArray(1);

        // Bind to the index VBO that has all the information about the order of the vertices
        GL15.glBindBuffer(GL15.GL_ELEMENT_ARRAY_BUFFER, vboiId);

        // Draw the vertices
        GL11.glDrawElements(GL11.GL_TRIANGLES, indicesCount, GL11.GL_UNSIGNED_BYTE, 0);

        // Put everything back to default (deselect)
        GL15.glBindBuffer(GL15.GL_ELEMENT_ARRAY_BUFFER, 0);
        GL20.glDisableVertexAttribArray(0);
        GL20.glDisableVertexAttribArray(1);
        GL30.glBindVertexArray(0);
        GL20.glUseProgram(0);
    }

    public void destroyOpenGL() {
        // Delete the shaders
        GL20.glUseProgram(0);
        GL20.glDetachShader(pId, vsId);
        GL20.glDetachShader(pId, fsId);

        GL20.glDeleteShader(vsId);
        GL20.glDeleteShader(fsId);
        GL20.glDeleteProgram(pId);

        // Select the VAO
        GL30.glBindVertexArray(vaoId);

        // Disable the VBO index from the VAO attributes list
        GL20.glDisableVertexAttribArray(0);
        GL20.glDisableVertexAttribArray(1);

        // Delete the vertex VBO
        GL15.glBindBuffer(GL15.GL_ARRAY_BUFFER, 0);
        GL15.glDeleteBuffers(vboId);

        // Delete the color VBO
        GL15.glBindBuffer(GL15.GL_ARRAY_BUFFER, 0);
        GL15.glDeleteBuffers(vbocId);

        // Delete the index VBO
        GL15.glBindBuffer(GL15.GL_ELEMENT_ARRAY_BUFFER, 0);
        GL15.glDeleteBuffers(vboiId);

        // Delete the VAO
        GL30.glBindVertexArray(0);
        GL30.glDeleteVertexArrays(vaoId);

        Display.destroy();
    }

    public int loadShader(String filename, int type) {
        StringBuilder shaderSource = new StringBuilder();
        int shaderID = 0;

        try {
            BufferedReader reader = new BufferedReader(new FileReader(filename));
            String line;
            while ((line = reader.readLine()) != null) {
                shaderSource.append(line).append("\n");
            }
            reader.close();
        } catch (IOException e) {
            System.err.println("Could not read file.");
            e.printStackTrace();
            System.exit(-1);
        }

        shaderID = GL20.glCreateShader(type);
        GL20.glShaderSource(shaderID, shaderSource);
        GL20.glCompileShader(shaderID);

        return shaderID;
    }

    private void glInfo() {
        System.out.println("\nGL RENDERER: " + GL11.glGetString(GL11.GL_RENDERER));
        System.out.println("GL VENDOR: " + GL11.glGetString(GL11.GL_VENDOR));
        System.out.println("GL VERSION: " + GL11.glGetString(GL11.GL_VERSION));

        ContextCapabilities caps = GLContext.getCapabilities();

        if (caps.OpenGL32) {
            IntBuffer buffer = ByteBuffer.allocateDirect(16 * 4).order(ByteOrder.nativeOrder()).asIntBuffer();

            GL11.glGetInteger(GL32.GL_CONTEXT_PROFILE_MASK, buffer);
            int profileMask = buffer.get(0);

            System.out.println("\nPROFILE MASK: " + Integer.toBinaryString(profileMask));

            System.out.println("CORE PROFILE: " + ((profileMask & GL32.GL_CONTEXT_CORE_PROFILE_BIT) != 0));
            System.out.println("COMPATIBILITY PROFILE: " + ((profileMask & GL32.GL_CONTEXT_COMPATIBILITY_PROFILE_BIT) != 0));
        }

        System.out.println("\nOpenGL 3.0: " + caps.OpenGL30);
        System.out.println("OpenGL 3.1: " + caps.OpenGL31);
        System.out.println("OpenGL 3.2: " + caps.OpenGL32);
        System.out.println("OpenGL 3.3: " + caps.OpenGL33);
        System.out.println("OpenGL 4.0: " + caps.OpenGL40);
        System.out.println("ARB_compatibility: " + caps.GL_ARB_compatibility);
    }
}

Here is the info OpenGL gave me:

Code: [Select]
GL RENDERER: ATI Radeon HD 4600 Series
GL VENDOR: ATI Technologies Inc.
GL VERSION: 3.2.11653 Core Profile Context

PROFILE MASK: 1
CORE PROFILE: true
COMPATIBILITY PROFILE: false

OpenGL 3.0: true
OpenGL 3.1: true
OpenGL 3.2: true
OpenGL 3.3: false
OpenGL 4.0: false
ARB_compatibility: true

My graphics drivers are up to date, and I'm running Debian Wheezy 64-bit with JDK 1.7.0_07.

Thanks.

Re: Weird behavior with shaders.
« Reply #1 on: October 18, 2012, 10:35:19 »
I managed to render this correctly, but I don't understand why it works...

When I buffer the data into the VBOs, I give the color FloatBuffer to the vertex VBO and the vertex FloatBuffer to the color VBO...

Weird... :/

Re: Weird behavior with shaders.
« Reply #2 on: October 18, 2012, 11:17:47 »
The order of your vertices is important!
Code: [Select]
float[] vertices = {
    -0.5f,  0.5f, 0f, 1f,
    -0.5f, -0.5f, 0f, 1f,
     0.5f, -0.5f, 0f, 1f,
     0.5f,  0.5f, 0f, 1f
};
That would be interpreted as left-top, left-bottom, right-bottom, right-top. That is a square, not two triangles, and will result in what your result image displays ;)

Re: Weird behavior with shaders.
« Reply #3 on: October 18, 2012, 11:54:28 »
Hi, thank you for your answer.

Aren't indices made for exactly that purpose?

Code: [Select]
byte[] indices = {
    0, 1, 2,
    2, 3, 0
};

So the first triangle = left-top, left-bottom, right-bottom and the second triangle = right-bottom, right-top, left-top, which together appear as a rectangle?

Also, when I don't render colors, the two triangles (with two vertices used twice) render correctly: white, but at the correct positions. So I'm pretty sure my positions and indices are correct.

The weirdest thing of all is that when I buffer the color data into the vertex VBO and the vertex data into the color VBO, everything renders correctly, both positions and colors (without changing anything in my float[] arrays).
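To convince myself the indices really describe two triangles, I expanded the index list by hand in plain Java (no OpenGL involved; the class and method names are just for this little test):

```java
import java.util.Locale;

public class IndexDemo {
    // Expand (corners, indices) into human-readable triangles,
    // mimicking how glDrawElements(GL_TRIANGLES, ...) walks the index buffer.
    static String[] triangles(float[][] corners, byte[] indices) {
        String[] out = new String[indices.length / 3];
        for (int tri = 0; tri < out.length; tri++) {
            StringBuilder sb = new StringBuilder();
            for (int v = 0; v < 3; v++) {
                float[] c = corners[indices[tri * 3 + v]];
                sb.append(String.format(Locale.ROOT, "(%.1f,%.1f) ", c[0], c[1]));
            }
            out[tri] = sb.toString().trim();
        }
        return out;
    }

    public static void main(String[] args) {
        // Same corner positions as in my post (x, y only here) and the same indices.
        float[][] corners = {
            {-0.5f,  0.5f},   // 0: left-top
            {-0.5f, -0.5f},   // 1: left-bottom
            { 0.5f, -0.5f},   // 2: right-bottom
            { 0.5f,  0.5f}    // 3: right-top
        };
        byte[] indices = {0, 1, 2,  2, 3, 0};
        for (String t : triangles(corners, indices)) {
            System.out.println(t);
        }
    }
}
```

The two printed triangles share the left-top and right-bottom corners, which is exactly the rectangle decomposition I described.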

Re: Weird behavior with shaders.
« Reply #4 on: October 18, 2012, 16:08:41 »
I just found a related post on the forum; it gives the solution to my problem.

http://lwjgl.org/forum/index.php?topic=4733.0
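In short: glBindAttribLocation only takes effect when glLinkProgram is called, so the binds have to come before the link. A reordered sketch of the relevant part of setupShaders() (this is only a fragment; it needs a live GL context and the compiled shader IDs to run):

```java
// Create the program and attach the compiled shaders.
pId = GL20.glCreateProgram();
GL20.glAttachShader(pId, vsId);
GL20.glAttachShader(pId, fsId);

// Bind the attribute locations BEFORE linking; bindings made after
// glLinkProgram are ignored until the next link, so the linker was
// free to assign in_Position and in_Color to whatever slots it liked.
GL20.glBindAttribLocation(pId, 0, "in_Position"); // position -> attribute 0
GL20.glBindAttribLocation(pId, 1, "in_Color");    // color -> attribute 1

// Linking is the point where the bindings above take effect.
GL20.glLinkProgram(pId);
GL20.glValidateProgram(pId);
```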

Re: Weird behavior with shaders.
« Reply #5 on: January 29, 2013, 04:50:21 »
I hate to revive a dead thread, but I am running into the same issue. When I move GL20.glLinkProgram(pId) to after:

Code: [Select]
// Position information will be attribute 0
GL20.glBindAttribLocation(pId, 0, "in_Position");
// Color information will be attribute 1
GL20.glBindAttribLocation(pId, 1, "in_Color");

I get a blank screen. Even swapping the shader vars works with linkProgram after the bindAttribs, but the quad still looks like the image shown above.

Does anyone have an idea what's going on? Can I do this without putting position/color in the wrong attribute?

Offline quew8
Re: Weird behavior with shaders.
« Reply #6 on: January 29, 2013, 19:17:55 »
Have you called glGetError()? From the docs:
Quote
GL_INVALID_OPERATION is generated if program is not a program object.
That could be causing the black screen if the shaders are not properly created and linked. Have you enabled all the vertex attributes you're using (glEnableVertexAttribArray)?
Quote
Active attributes that are not explicitly bound will be bound by the linker when glLinkProgram is called.
If your attributes are not bound, then you'll be using default values, which I believe tend to be 0. Also, you can query the location of an attribute with glGetAttribLocation. Why not try it after initialization and make sure the locations are the values you assigned?

Re: Weird behavior with shaders.
« Reply #7 on: January 30, 2013, 04:37:52 »
I've figured out my issue: switching the color and position attribute indices worked, but nothing was showing because the "w" value of my position vectors was set to "0" instead of 1. That's why the screen stayed blank.
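That makes sense given the perspective divide: after the vertex shader, the GPU divides x, y, z by w to get normalized device coordinates, so w = 0 sends the vertex off to infinity. A plain-Java illustration (no OpenGL; the class and method names are mine):

```java
public class PerspectiveDivide {
    // The perspective divide the GPU applies after the vertex shader:
    // clip-space (x, y, z, w) -> normalized device coords (x/w, y/w, z/w).
    static float[] toNdc(float x, float y, float z, float w) {
        return new float[]{x / w, y / w, z / w};
    }

    public static void main(String[] args) {
        // w = 1: the quad corner lands exactly where the shader put it.
        float[] ok = toNdc(-0.5f, 0.5f, 0f, 1f);
        System.out.println(ok[0] + ", " + ok[1]);   // -0.5, 0.5

        // w = 0: division by zero gives infinities; the vertex is unusable.
        float[] bad = toNdc(-0.5f, 0.5f, 0f, 0f);
        System.out.println(bad[0]);                 // -Infinity
    }
}
```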

Offline quew8
Re: Weird behavior with shaders.
« Reply #8 on: January 30, 2013, 23:00:59 »
Those pesky homogeneous coordinates, right?