Hey! I've been writing a game and engine with LWJGL 2 for a while, and since LWJGL 3 recently came out I've decided to switch over. The port has gone smoothly so far, but while moving the framebuffer code across I've run into two issues that look to me like library or driver bugs, although I could be wrong (I usually solve problems via Google, so this is my first time asking on a forum about a bug).
Computer specs (note that both issues also appear on the same computer's integrated Intel HD 4000 series graphics):
(461345897236914)(Game Thread)> - OS: Windows 7 (windows) (amd64)
(461346247664222)(Render Thread)> - OpenGL Version: 3.2.12874 Core Profile Context 14.100.0.0
(461346247827474)(Render Thread)> - - Vendor: ATI Technologies Inc. (AMD Radeon HD 7900 Series)
(461346247955326)(Render Thread)> - - Supported GLSL Version: 4.30
Issue 1:
When creating the texture for a framebuffer with glTexImage2D(...), supplying a format parameter of GL30.GL_DEPTH_STENCIL immediately causes glGetError to return GL_INVALID_ENUM, regardless of the other parameters. Because the texture upload fails, glCheckFramebufferStatus then reports GL_FRAMEBUFFER_INCOMPLETE_ATTACHMENT when the framebuffer is assembled. The offending call:
GL11.glTexImage2D(GL11.GL_TEXTURE_2D, 0, GL30.GL_DEPTH32F_STENCIL8, width, height, 0, GL30.GL_DEPTH_STENCIL, GL11.GL_FLOAT, (ByteBuffer)null);
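For what it's worth, my reading of the spec is that when format is GL_DEPTH_STENCIL, only the packed types GL_UNSIGNED_INT_24_8 and GL_FLOAT_32_UNSIGNED_INT_24_8_REV are allowed (and that mismatch is supposed to produce GL_INVALID_OPERATION, not GL_INVALID_ENUM). An untested sketch of the spec-mandated combination for GL_DEPTH32F_STENCIL8, in case the driver is enforcing this strictly:

```java
// Sketch: pair GL_DEPTH32F_STENCIL8 with the packed type the spec requires.
// GL_FLOAT_32_UNSIGNED_INT_24_8_REV is exposed in GL30 in LWJGL 3.
GL11.glTexImage2D(
    GL11.GL_TEXTURE_2D, 0,
    GL30.GL_DEPTH32F_STENCIL8,               // internal format: 32-bit float depth + 8-bit stencil
    width, height, 0,
    GL30.GL_DEPTH_STENCIL,                   // pixel format
    GL30.GL_FLOAT_32_UNSIGNED_INT_24_8_REV,  // packed type matching the internal format
    (ByteBuffer) null);                      // no initial data, just allocate storage
```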
Issue 2:
When reading gl_FragCoord in a shader while rendering to a framebuffer, it contains nonsense values instead of window coordinates: see the attached image, generated with this fragment shader:
#version 150
uniform vec2 renderSize;
out vec4 fragColor;
void main() {
    fragColor = vec4(gl_FragCoord.xy / renderSize, 1, 1);
}
Note that I have checked, and renderSize is definitely correct.
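In case the surrounding setup matters, the render pass is bound roughly like this (a simplified sketch; fboId, fboWidth, fboHeight, and program are placeholder names, not my actual code):

```java
// Sketch: bind the FBO, size the viewport to it, and upload renderSize.
GL30.glBindFramebuffer(GL30.GL_FRAMEBUFFER, fboId);
GL11.glViewport(0, 0, fboWidth, fboHeight); // gl_FragCoord is in window space, so the viewport must match the FBO size
GL20.glUseProgram(program);
GL20.glUniform2f(GL20.glGetUniformLocation(program, "renderSize"), fboWidth, fboHeight);
```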