[CLOSED] drawArrays crashes java (OpenGL/CL interoperation)

Started by Jens, January 10, 2012, 21:53:17


Jens

My program is supposed to render the results of an OpenCL computation using a custom shader.
But when I call glDrawArrays after setting up the attribute with glVertexAttribPointer and type GL_INT, javaw.exe crashes.

Here is some code that reproduces the crash:
import static org.lwjgl.opencl.CL10.*;
import static org.lwjgl.opengl.GL11.*;
import static org.lwjgl.opengl.GL15.*;
import static org.lwjgl.opengl.GL20.*;

import java.io.*;
import org.lwjgl.opencl.*;
import org.lwjgl.opengl.*;

public class Program {

	private static CLContext clContext;
	private static int glProgram;
	private static int glProperties;
	private static int aProperty;

	public static void main(String[] args) {
		try {	
			Display.create();
			
			// Create a CL context shared with the GL drawable, then release it
			// again right away; the crash happens even though the CL context
			// is already gone by the time we render.
			CL.create();
			CLPlatform platform = CLPlatform.getPlatforms().get(0);
			clContext = CLContext.createFromType(platform, CL_DEVICE_TYPE_GPU, null, Display.getDrawable(), null);
			clReleaseContext(clContext);
			CL.destroy();
		
			loadShaderProgram("vshader.gl", "fshader.gl");
		} catch (Exception e) {
			e.printStackTrace();
		}
		// Allocate a 4-byte buffer (room for a single GL_INT) for the attribute.
		glProperties = glGenBuffers();
		glBindBuffer(GL_ARRAY_BUFFER, glProperties);
		glBufferData(GL_ARRAY_BUFFER, 4, GL_DYNAMIC_COPY);
			
		while(!Display.isCloseRequested()) {
			glClear(GL_COLOR_BUFFER_BIT);
			glBindBuffer(GL_ARRAY_BUFFER, glProperties);
			glVertexAttribPointer(aProperty, 1, GL_INT, false, 0, 0); // works with GL_FLOAT, crashes with GL_INT
			glDrawArrays(GL_POINTS, 0, 1);
			Display.update();
		}
		Display.destroy();
		
		//FIXME EVIL CODE !!! (this belongs to the other bug I posted)
		try {
			Runtime.getRuntime().exec("TASKKILL /F /FI \"MODULES eq lwjgl64.dll\"");
		} catch (Exception e) {
			e.printStackTrace();
		}
		//FIXME EVIL CODE !!!
	}

	private static void loadShaderProgram(String vsfile, String fsfile)
			throws FileNotFoundException, IOException {
		int vshader = glCreateShader(GL_VERTEX_SHADER);
		glShaderSource(vshader, "attribute int aProperty;" +
							    "varying int vProperty;" +
							    "void main(){"  +
							    "	vProperty = aProperty;" +
							    "	gl_Position = vec4(0.5, 0.5, 0.5, 1);" +
							    "}");
		glCompileShader(vshader);

		int fshader = glCreateShader(GL_FRAGMENT_SHADER);
		glShaderSource(fshader, "varying int vProperty;" +
							    "void main(){" +
							    "	if(vProperty == 0) {" +
							  	"		gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);" +
							  	"	} else {" +
							  	"		gl_FragColor = vec4(0.0, 1.0, 0.0, 1.0);" +
							  	"	}" +
							  	"}");
		glCompileShader(fshader);

		glProgram = glCreateProgram();
		glAttachShader(glProgram, vshader);
		glAttachShader(glProgram, fshader);
		glLinkProgram(glProgram);
		glValidateProgram(glProgram);
		
		aProperty = glGetAttribLocation(glProgram, "aProperty");
		glEnableVertexAttribArray(aProperty);
		
		glUseProgram(glProgram);
	}
}


If I comment out the clContext creation, the program works as expected.
If I pass GL_FLOAT instead of GL_INT to glVertexAttribPointer in the render loop, the program also works as expected.
But I don't think that's the right way.
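From what I've read, glVertexAttribPointer with GL_INT still converts the values to float on their way into the shader; true integer attributes seem to need glVertexAttribIPointer (OpenGL 3.0, org.lwjgl.opengl.GL30) together with #version 130 shaders. A rough sketch of what I mean (untested, assuming the driver exposes OpenGL 3.0):

import static org.lwjgl.opengl.GL30.glVertexAttribIPointer;

// vertex shader would become:   #version 130
//                               in int aProperty;
//                               flat out int vProperty;  // integer varyings must be flat
// fragment shader would become: #version 130
//                               flat in int vProperty;

glBindBuffer(GL_ARRAY_BUFFER, glProperties);
// note the extra I: there is no "normalized" parameter, the values stay integers
glVertexAttribIPointer(aProperty, 1, GL_INT, 0, 0);
glDrawArrays(GL_POINTS, 0, 1);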

When Java crashes, I only get the Windows notification that the program has stopped working.

This bug may be related to my other bug (http://lwjgl.org/forum/index.php/topic,4374.0.html).
In both cases the clContext influences the OpenGL part of the program, even though it has already been released.
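One thing I can at least verify from code is that the release itself succeeds, since clReleaseContext returns a CL error code. A minimal sketch, assuming the CL10 error constants (I haven't found it to change anything):

int err = clReleaseContext(clContext);
if (err != CL_SUCCESS)
	System.err.println("clReleaseContext failed with error " + err);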

I am using:
Windows 7 (64-bit)
Java 7 (64-bit)
LWJGL 2.8.2

Intel Pentium Dual-Core CPU
ATI Mobility Radeon HD 4300 Series graphics card

I hope you can help me.

spasi

Hey Jens,

I've tried compiling and running both code samples you reported and I cannot reproduce any crash or error. I'm on Win7, Radeon 5870, Java 7u2, latest drivers, latest LWJGL nightly build, tried both x86 and x64 VMs.

Most likely it's a driver problem on your setup. Check your Catalyst Control Center; what driver version are you running?
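If you want to double-check from code, the strings the GL context itself reports are a quick way to see which driver you're actually running on. A minimal sketch (call it anywhere after Display.create()):

import static org.lwjgl.opengl.GL11.*;

// prints the vendor, GPU and driver version string of the active GL context
System.out.println("GL_VENDOR:   " + glGetString(GL_VENDOR));
System.out.println("GL_RENDERER: " + glGetString(GL_RENDERER));
System.out.println("GL_VERSION:  " + glGetString(GL_VERSION));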

Jens

OK, I updated the driver and now it works.

Thanks for your help.