LWJGL Forum


Title: Shader crashes NVidia but not ATI
Post by: Kiyote on December 04, 2010, 14:26:43
Sorry for the wall of text; I wanted to be as detailed as I could.


So, I'm the first to admit I'm a total newbie when it comes to shaders.  I'm trying very hard to teach myself how to make them work, and I thought I had finally put together what I wanted: it works on one machine exactly the way I want.  I then brought the app to another machine, and there it crashes Java.

I tried glslDevil to see if I could figure out what was going on (I don't know if that's the right tool; googling seemed to indicate it might help), but it seems unable to get past the stage where LWJGL enumerates the extensions.  (I'm not sure, but matching the glslDevil trace against operations in the LWJGL code, it looks like it stops in GLContext.)

Anyway, I come here in hopes of finding someone who can say "duh...you idiot, don't do X, do Y instead!"

The vertex shader:

#version 130

in vec4 vPosition;
in vec4 vInColour;
in vec2 vInTex;
in vec3 vInData;
out vec4 vColour;
out vec2 vFragmentTex;

void main() {
    vColour = vInColour;
    vFragmentTex = vInTex;
    // note: GLSL mat4 constructors are column-major, so each group of
    // four values here fills one column of the matrix, not one row
    mat4 vRot = mat4( cos( vInData.x ), -sin( vInData.x ), 0.0, 0.0,
                      sin( vInData.x ), cos( vInData.x ), 0.0, 0.0,
                      0.0, 0.0, 1.0, 0.0,
                      0.0, 0.0, 0.0, 1.0 );
    mat4 vTrans = mat4( 1.0, 0.0, 0.0, vInData.y,
                        0.0, 1.0, 0.0, vInData.z,
                        0.0, 0.0, 1.0, 0.0,
                        0.0, 0.0, 0.0, 1.0 );
    gl_Position = gl_ModelViewProjectionMatrix * (vRot * vPosition * vTrans);
}


The fragment shader:

#version 130

uniform sampler2D tex;
in vec4 vColour;
in vec2 vFragmentTex;

void main() {
    gl_FragColor = vColour * texture2D( tex, vFragmentTex.st);
}


At every stage I check the status of the compilation and linking, and everything seems to go fine.  The error only happens when I actually call glDrawArrays.
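For reference, the kind of status check I mean looks something like this in LWJGL (a minimal sketch; shader is a placeholder handle, and BufferUtils is org.lwjgl.BufferUtils):

IntBuffer status = BufferUtils.createIntBuffer( 1 );
GL20.glCompileShader( shader );
GL20.glGetShader( shader, GL20.GL_COMPILE_STATUS, status );
if ( status.get( 0 ) == GL11.GL_FALSE ) {
    System.err.println( GL20.glGetShaderInfoLog( shader, 4096 ) );
}
// ...and the same pattern again after glLinkProgram, with GL20.GL_LINK_STATUS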

I set up my data like this:

// Interleaved layout: 3 position + 4 colour + 2 tex + 3 extra floats
// per vertex = 12 floats, so the stride is 48 bytes.  Note the stride
// argument is in bytes, while FloatBuffer.position() counts floats.
// Position
_vertexData.position(0);
GL20.glVertexAttribPointer( _shader.getAttribLocation("vPosition"), 3, false, 48, _vertexData );
GL20.glEnableVertexAttribArray( _shader.getAttribLocation("vPosition") );
// Colour
_vertexData.position(3);
GL20.glVertexAttribPointer( _shader.getAttribLocation("vInColour"), 4, false, 48, _vertexData );
GL20.glEnableVertexAttribArray( _shader.getAttribLocation("vInColour") );
// Texturing
_vertexData.position(7);
GL20.glVertexAttribPointer( _shader.getAttribLocation("vInTex"), 2, false, 48, _vertexData );
GL20.glEnableVertexAttribArray( _shader.getAttribLocation("vInTex") );
// Added data
_vertexData.position(9);
GL20.glVertexAttribPointer( _shader.getAttribLocation("vInData"), 3, false, 48, _vertexData );
GL20.glEnableVertexAttribArray( _shader.getAttribLocation("vInData") );
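One caveat with a setup like this (an assumption on my part, not something I've verified is the cause here): glGetAttribLocation returns -1 for any attribute the linker optimized away, and enabling index -1 is invalid; on some drivers that can crash rather than just set GL_INVALID_VALUE.  A defensive version of one binding might look like this (program is a placeholder for the program handle):

int loc = GL20.glGetAttribLocation( program, "vInData" );
if ( loc != -1 ) {
    _vertexData.position(9);
    GL20.glVertexAttribPointer( loc, 3, false, 48, _vertexData );
    GL20.glEnableVertexAttribArray( loc );
}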


I don't suppose anyone could help me out?
Title: Re: Shader crashes NVidia but not ATI
Post by: Kiyote on December 05, 2010, 14:47:15
So, I've pared the shaders down to:


#version 130

in vec4 vPosition;

void main() {
    gl_Position = gl_ModelViewProjectionMatrix * vPosition;
}


and

#version 130

void main() {
    gl_FragColor = vec4( 1.0, 1.0, 1.0, 1.0 );
}


and it still crashes.  If I hadn't just upgraded to the very latest drivers, I would suspect a driver bug, but I guess it's something I'm doing wrong.

I've even pared my data down to:

private float[] vData = { -50.0f, -50.0f, 0.0f,   50.0f, -50.0f, 0.0f,   50.0f, 50.0f, 0.0f };
private FloatBuffer _vData;
...
// (3 floats * 4 bytes) per vertex, 3 vertices
_vData = ByteBuffer.allocateDirect( (3 * 4) * 3 ).order( ByteOrder.nativeOrder() ).asFloatBuffer();
_vData.put( vData );
_vData.rewind();
...
// stride 0 = tightly packed
GL20.glVertexAttribPointer( _shader.getAttribLocation("vPosition"), 3, false, 0, _vData );
GL20.glEnableVertexAttribArray( _shader.getAttribLocation("vPosition") );
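For what it's worth, one way to narrow down which call actually fails is LWJGL's built-in error check between calls (a sketch; org.lwjgl.opengl.Util is part of LWJGL, and the draw parameters assume the triangle data above):

GL20.glEnableVertexAttribArray( _shader.getAttribLocation("vPosition") );
Util.checkGLError(); // throws OpenGLException if glGetError() reports an error
GL11.glDrawArrays( GL11.GL_TRIANGLES, 0, 3 );
Util.checkGLError();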


At this point I'm really confused about what I could be doing wrong.

Has anyone encountered anything like this before?
Title: Re: Shader crashes NVidia but not ATI
Post by: Kiyote on December 05, 2010, 20:47:04
I was able to make something very similar work by using VBOs instead.  Perhaps I'll just stick to that.
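For anyone who finds this thread later, the VBO route looks roughly like this (a minimal sketch; vbo, loc and program are placeholders, and the draw call assumes the triangle data above):

// create and fill the buffer once, at init time
int vbo = GL15.glGenBuffers();
GL15.glBindBuffer( GL15.GL_ARRAY_BUFFER, vbo );
GL15.glBufferData( GL15.GL_ARRAY_BUFFER, _vData, GL15.GL_STATIC_DRAW );

// with a VBO bound, glVertexAttribPointer takes a type and a byte offset
// into the buffer instead of a client-side Buffer
int loc = GL20.glGetAttribLocation( program, "vPosition" );
GL20.glVertexAttribPointer( loc, 3, GL11.GL_FLOAT, false, 0, 0 );
GL20.glEnableVertexAttribArray( loc );

GL11.glDrawArrays( GL11.GL_TRIANGLES, 0, 3 );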