[SOLVED] TheQuadColored tutorial code not displaying properly

Started by bloodsquirrel, September 09, 2012, 16:43:41

Previous topic - Next topic

bloodsquirrel

During an effort to figure out why my program wasn't displaying correctly on my desktop (it worked fine on my laptop), I resorted to copying one of the tutorial programs from the wiki and running it.

http://www.lwjgl.org/wiki/index.php?title=The_Quad_colored

I copy/pasted the code, changing only one thing: I embedded the shaders as strings in the program rather than loading them from separate files. The code compiled and ran, but showed the two triangles that were supposed to make up the quad in the wrong place and with the wrong colors. After screwing around with it, I unearthed some odd behavior.

Apparently, the color buffer being sent to OpenGL is being used for the position, while the position buffer is being used for the colors. Swapping the two buffers makes it display correctly, even though doing that should break the code. It appears to be shader-related, since anything that prevents the shader from compiling results in a blank white quad showing in the correct position on the screen. The exact same .java file displays correctly on my laptop, and applying the same swap there makes it display incorrectly, as expected.

I've attached the exact .java file that I've been running. If anyone can at least reproduce this behavior, let me know. I'm using the same version of LWJGL on my desktop as on my laptop, and I find it hard to pin the problem on Eclipse or my graphics card drivers (this is too basic a thing to be hitting a driver bug with).

I'm using:
LWJGL 2.8.4
eclipse Version: Indigo Service Release 2 Build id: 20120216-1857
ATI Radeon HD 5770
8.982.0.0 - Catalyst 12.8 (7-27-2012)
OpenGL 4.2 (ATI Radeon HD 5700 Series with 236 ext.)

princec

Pasted. Worked perfectly. Looks like drivers to me.

Cas :)

kKdH

Hi,

I got exactly the same problem on several laptops, but I figured out that the problem only exists on ATI graphics cards; on Intel and Nvidia cards it works fine.

Greetings, kKdH


csam12

I can verify that this:

- Does not work on a 6770
- Does not work on a 7950
- Does not work on the 3000 Series

I've tried Eyefinity enabled and disabled, single display and multiple displays, and every version of the drivers from 11.1 to 12.9.

This DOES work on an Nvidia 93XX.
This DOES work on the integrated Intel HD 4000 of an i7 3770K.



princec

Good old ATI. Still trying to drive themselves out of business after all these years, but they just can't quite manage it, no matter how inept they seem to be!

Cas :)

spasi

There's nothing wrong with the AMD drivers, the wiki code has a bug: the 2 calls to glBindAttribLocation need to happen before glLinkProgram. From the spec:

Quote: BindAttribLocation specifies that the attribute variable named name in program program should be bound to generic vertex attribute index when the program is next linked.

It just so happens that NV/Intel drivers bind in_Position to 0 and in_Color to 1 by default, while the AMD one does the opposite. This is not a driver bug in any way; if you let the driver decide the bindings, you're supposed to use glGetAttribLocation after linking the shader program.
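For anyone fixing their copy of the tutorial, here is a minimal sketch of the corrected order. The handle names (pId, vsId, fsId) and attribute names follow the wiki code; this is illustrative, not a drop-in patch, and it assumes a valid OpenGL context is already current:

```java
import org.lwjgl.opengl.GL20;

// ... vsId and fsId are the compiled vertex/fragment shader handles ...
int pId = GL20.glCreateProgram();
GL20.glAttachShader(pId, vsId);
GL20.glAttachShader(pId, fsId);

// Bind the attribute locations BEFORE linking: glBindAttribLocation
// only takes effect when the program is next linked.
GL20.glBindAttribLocation(pId, 0, "in_Position");
GL20.glBindAttribLocation(pId, 1, "in_Color");

GL20.glLinkProgram(pId);
GL20.glValidateProgram(pId);

// Alternative: let the driver choose the bindings, then query them
// AFTER linking and use the returned indices in glVertexAttribPointer
// and glEnableVertexAttribArray:
// int posLoc = GL20.glGetAttribLocation(pId, "in_Position");
// int colLoc = GL20.glGetAttribLocation(pId, "in_Color");
```

Either approach works; the bug in the wiki code is that it calls glBindAttribLocation after glLinkProgram, so the bindings never take effect and the driver's defaults are used instead.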

Quote from: princec on October 21, 2012, 10:17:35
Good old ATI. Still trying to drive themselves out of business after all these years, but they just can't quite manage it, no matter how inept they seem to be!

In my experience AMD drivers have been rock solid for the past 4 years (at least) and that includes Linux drivers. Only the latest GL features are sometimes problematic but that's equally true (or even worse) for NV drivers. AMD drivers have also been an absolute pleasure to develop on since AMD_debug_output was released.

princec

I have more trouble supporting ATI than any other chipset, even Intel :( And I only do low end stuff!

Cas :)

davidstvz

Thanks for answering despite the OP being 6 weeks old!  I am lucky I only tried this code yesterday myself.

Can someone with access update the wiki so others with AMD cards don't run into this?

bloodsquirrel

Quote from: spasi on October 21, 2012, 16:12:58
There's nothing wrong with the AMD drivers, the wiki code has a bug: the 2 calls to glBindAttribLocation need to happen before glLinkProgram. From the spec:

Quote: BindAttribLocation specifies that the attribute variable named name in program program should be bound to generic vertex attribute index when the program is next linked.

It just so happens that NV/Intel drivers bind in_Position to 0 and in_Color to 1 by default, while the AMD one does the opposite. This is not a driver bug in any way; if you let the driver decide the bindings, you're supposed to use glGetAttribLocation after linking the shader program.

Quote from: princec on October 21, 2012, 10:17:35
Good old ATI. Still trying to drive themselves out of business after all these years, but they just can't quite manage it, no matter how inept they seem to be!

In my experience AMD drivers have been rock solid for the past 4 years (at least) and that includes Linux drivers. Only the latest GL features are sometimes problematic but that's equally true (or even worse) for NV drivers. AMD drivers have also been an absolute pleasure to develop on since AMD_debug_output was released.

Thanks for the explanation. I was having a hard time believing a bug affecting such basic functionality could be in ATI's drivers.