Application throws a fatal error when trying to render a rectangle

Started by smokethepot, May 09, 2017, 02:48:04

smokethepot

I have been programming in Java for over 8 years and have been learning game-related programming for the better part of that time, but I'm still new to OpenGL (or to using any external library beyond the Java standard library, actually). I'm using LWJGL 3.1.1, BTW.

My first problem with LWJGL was when I tried to create a window and it crashed. I pinpointed the problem to the createWindow() method itself. I somehow solved that problem by updating the driver on my graphics card.

Now I'm facing a new problem when trying to render just a rectangle on my screen, using VAOs and VBOs (which I only learned about a week ago). This time, I've pinpointed the problem to the glGenVertexArrays() method. The tutorial I was following calls GL30.glGenVertexArrays(); I used that and it didn't work, so I did a little research and found that LWJGL 3 has classes (the ARB classes) that serve a similar function, so I decided to use those instead, calling ARBVertexArrayObject.glGenVertexArrays(). That didn't work either. The following is the console error it spits out.

#
# A fatal error has been detected by the Java Runtime Environment:
#
#  EXCEPTION_ACCESS_VIOLATION (0xc0000005) at pc=0x00007ff83169f137, pid=10276, tid=0x000000000000281c
#
# JRE version: Java(TM) SE Runtime Environment (8.0_131-b11) (build 1.8.0_131-b11)
# Java VM: Java HotSpot(TM) 64-Bit Server VM (25.131-b11 mixed mode windows-amd64 compressed oops)
# Problematic frame:
# C  [lwjgl_opengl.dll+0xf137]
#
# Failed to write core dump. Minidumps are not enabled by default on client versions of Windows
#
# An error report file with more information is saved as:
# C:\Users\smoke\Desktop\Java\LWJGL Engine\hs_err_pid10276.log
#
# If you would like to submit a bug report, please visit:
#   http://bugreport.java.com/bugreport/crash.jsp
# The crash happened outside the Java Virtual Machine in native code.
# See problematic frame for where to report the bug.
#


For anybody who's interested, I've uploaded the error log file as an attachment. Please help!

spasi

Looks like you're calling an OpenGL function without a context current in the current thread (or without a call to GL.createCapabilities()).
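
If it helps, the required order looks like this (a minimal sketch, not your code; the window size, title and `window` variable are placeholders):

long window = glfwCreateWindow(800, 600, "Demo", NULL, NULL);

glfwMakeContextCurrent(window); // bind the context to THIS thread first
GL.createCapabilities();        // then initialize the OpenGL bindings for it

glClearColor(0f, 0f, 0f, 1f);   // only now are GL calls safe on this thread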

smokethepot

First things first, thanks for the reply.

Secondly, I'm calling GL.createCapabilities() at the beginning of every frame, and I think I am making the context current by calling GLFW.glfwMakeContextCurrent(long) during the initialization of my program. Is that right, or should I be doing something else?

As a more elaborate description, I'm posting a snippet of my main class as follows:

private void init(int width, int height){
		glfwSetErrorCallback(GLFWErrorCallback.createPrint(System.err));
		
		//initialize GLFW; most GLFW functions will not work before this is done
		if(!glfwInit())
			throw new IllegalStateException("Unable to initialise GLFW");
		//Configure GLFW
		glfwWindowHint(GLFW_VISIBLE, GLFW_FALSE);
		//glfwWindowHint(GLFW_RESIZABLE, GLFW_TRUE);
		
		//create the window
		window = glfwCreateWindow(width, height, title, (isFullscreen ? glfwGetPrimaryMonitor() : 0), 0);
		if(window == 0)
			throw new RuntimeException("Failed to create Window");
		
		//set up a key callback; it will be called every time a key is pressed, repeated or released
		glfwSetKeyCallback(window, (window, key, scancode, action, mods) ->{
			if(key == GLFW_KEY_ESCAPE && action == GLFW_PRESS)
				glfwSetWindowShouldClose(window, true); //we will detect this in the rendering loop
		});
		
		//get the thread stack and push a new frame
		try(MemoryStack stack = stackPush()){
			//creates int*
			IntBuffer pWidth = stack.mallocInt(1), pHeight = stack.mallocInt(1);
			
			//get the window size passed to glfwCreateWindow
			glfwGetWindowSize(window, pWidth, pHeight);
			
			//get the resolution of the primary monitor
			GLFWVidMode vidMode = glfwGetVideoMode(glfwGetPrimaryMonitor());
			
			//center the window
			glfwSetWindowPos(window, (vidMode.width() - pWidth.get(0))/2, (vidMode.height() - pHeight.get(0))/2);
		} //stack frame is popped automatically
		
		//make the OpenGL context current
		glfwMakeContextCurrent(window);
		//Enable v-sync
		glfwSwapInterval(1);
		
		//make the window visible
		glfwShowWindow(window);
	}
	
	public void run(){
		GL.createCapabilities();
		
		//set the clear color
		glClearColor(red, green, blue, alpha);
		
		while(!glfwWindowShouldClose(window) && running){
			glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
			//swap the color buffers
			glfwSwapBuffers(window);
			
			glfwPollEvents();
			
			if(!paused)
				update();
			else pausedUpdate();
		}
		dispose();
		System.exit(0);
	}


spasi

Based on the log you attached, your program crashes when it calls glGetString, which is not present in the code above. Maybe you are calling it before the init method? Btw, from the log:

siginfo: ExceptionCode=0xc0000005, reading address 0x00000000000015e0


0x15e0 is the offset of glGetString in the function address buffer that LWJGL uses when a context is made current: glGetString is function number 700, and 700 * 8 bytes per pointer on x64 = 5600 = 0x15E0.

Quote from: smokethepot on May 09, 2017, 16:51:52
I'm calling GL.createCapabilities() at the beginning of every frame

GL.createCapabilities() is an expensive method and should only be called when a new context is created (and made current with glfwMakeContextCurrent). After that, the returned GLCapabilities instance should be cached and reused with GL.setCapabilities(). But generally you do not need to do this unless you're dealing with multiple contexts and/or multiple threads.
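
A sketch of that pattern (the second make-current is hypothetical, for illustration):

// once, right after the context has been created:
glfwMakeContextCurrent(window);
GLCapabilities caps = GL.createCapabilities(); // expensive: resolves every GL function pointer

// later, whenever the same context is made current again (e.g. on another thread):
glfwMakeContextCurrent(window);
GL.setCapabilities(caps); // cheap: just reuses the cached instance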

smokethepot

Thanks for the GL.createCapabilities() performance advice. I'll keep that in mind.

I commented out the glGetString line and it didn't help. Actually, I had put that line in to diagnose this same problem: I read somewhere that it sometimes occurs if you don't have OpenGL 3.0 or greater, and it was advised to check which version I was using with that command.

BTW, the problem is only with rendering the rectangle. If I comment that code out, everything else works: the window appears, I can change the clear color, and I can switch between windowed and fullscreen. All of that works perfectly.

I'll include the relevant rectangle code below.

First, a snippet of the main class that defines and initializes the relevant objects:
public RawLoader loader;
public MasterRenderer renderer;
public float[] vertices = {
		//left bottom triangle
		-0.5F, 0.5F, 0F,
		-0.5F, -0.5F, 0F,
		0.5F, -0.5F, 0F,
		//right top triangle
		0.5F, -0.5F, 0F,
		0.5F, 0.5F, 0F,
		-0.5F, 0.5F, 0F};
public RawModel model;

...
loader = new RawLoader();
renderer = new MasterRenderer();
model = loader.loadToVAO(vertices);


RawModel class
public class RawModel {
	
	private int vaoId, vertexCount;
	
	public RawModel(int vaoId, int vertexCount){
		this.vaoId = vaoId;
		this.vertexCount = vertexCount;
	}

	public int getVaoId() {
		return vaoId;
	}

	public int getVertexCount() {
		return vertexCount;
	}
	
}


Loader class
public class RawLoader {
	
	private List<Integer> vaos = new ArrayList<Integer>(), vbos = new ArrayList<Integer>();
	
	public RawModel loadToVAO(float[] positions){
		int vaoID = createVAO();
		storeDataInAttributeList(0, positions);
		unbindVAO();
		return new RawModel(vaoID, positions.length/3);
	}
	
	private int createVAO(){
		/*int vaoID = GL30.glGenVertexArrays();
		vaos.add(vaoID);
		GL30.glBindVertexArray(vaoID);
		return vaoID;*/
		int vaoID = ARBVertexArrayObject.glGenVertexArrays();
		vaos.add(vaoID);
		ARBVertexArrayObject.glBindVertexArray(vaoID);
		return vaoID;
	}
	
	private void storeDataInAttributeList(int attributeNumber, float[] data){
		/*int vboID = GL15.glGenBuffers();
		vbos.add(vboID);
		GL15.glBindBuffer(GL15.GL_ARRAY_BUFFER, vboID);
		FloatBuffer buffer = storeDataInFloatBuffer(data);
		GL15.glBufferData(GL15.GL_ARRAY_BUFFER, buffer, GL15.GL_STATIC_DRAW);
		GL20.glVertexAttribPointer(attributeNumber, 3, GL11.GL_FLOAT, false, 0, 0);
		GL15.glBindBuffer(GL15.GL_ARRAY_BUFFER, 0);*/
		int vboID = ARBVertexBufferObject.glGenBuffersARB();
		vbos.add(vboID);
		ARBVertexBufferObject.glBindBufferARB(ARBVertexBufferObject.GL_ARRAY_BUFFER_ARB, vboID);
		FloatBuffer buffer = storeDataInFloatBuffer(data);
		ARBVertexBufferObject.glBufferDataARB(ARBVertexBufferObject.GL_ARRAY_BUFFER_ARB, buffer, ARBVertexBufferObject.GL_STATIC_DRAW_ARB);
		//glVertexAttribLPointer is for double (64-bit) attributes; for float data use glVertexAttribPointer (core since GL 2.0)
		GL20.glVertexAttribPointer(attributeNumber, 3, GL11.GL_FLOAT, false, 0, 0);
		ARBVertexBufferObject.glBindBufferARB(ARBVertexBufferObject.GL_ARRAY_BUFFER_ARB, 0);
	}
	
	private void unbindVAO(){
		//GL30.glBindVertexArray(0);
		ARBVertexArrayObject.glBindVertexArray(0);
	}
	
	private FloatBuffer storeDataInFloatBuffer(float[] data){
		FloatBuffer buffer = BufferUtils.createFloatBuffer(data.length);
		buffer.put(data);
		buffer.flip();
		return buffer;
	}
	
	public void dispose(){
		for(int i = 0; i < vaos.size(); i++)
			//GL30.glDeleteVertexArrays(vaos.get(i));
			ARBVertexArrayObject.glDeleteVertexArrays(vaos.get(i));
		
		for(int i = 0; i < vbos.size(); i++)
			//GL15.glDeleteBuffers(vbos.get(i));
			ARBVertexBufferObject.glDeleteBuffersARB(vbos.get(i));
	}
	
}


And finally, the MasterRenderer class
public class MasterRenderer {
	
	public void render(RawModel model){
		/*GL30.glBindVertexArray(model.getVaoId());
		GL20.glEnableVertexAttribArray(0);
		GL11.glDrawArrays(GL11.GL_TRIANGLES, 0, model.getVertexCount());
		GL20.glDisableVertexAttribArray(0);
		GL30.glBindVertexArray(0);*/
		ARBVertexArrayObject.glBindVertexArray(model.getVaoId());
		GL20.glEnableVertexAttribArray(0);
		GL11.glDrawArrays(GL11.GL_TRIANGLES, 0, model.getVertexCount());
		GL20.glDisableVertexAttribArray(0);
		ARBVertexArrayObject.glBindVertexArray(0);
	}
}


As you can see from the code, I've commented out a lot of the core OpenGL method calls. The tutorial I followed used that approach, and when I ran into problems I found the ARB classes, which seemed to function as wrappers for the core OpenGL methods. I am unsure which is the correct approach. Can you also advise me on this? What's the difference between these two approaches?

spasi

Quote from: smokethepot on May 10, 2017, 02:36:00
I am unsure which is the correct approach. Can you also advise me on this? What's the difference between these two approaches?

ARB_vertex_array_object was released at the same time as OpenGL 3.0. It exists for hardware/drivers that are able to support/implement it, but cannot support the full OpenGL 3.0 feature set. So, you use it when GL30 is not available. Example code:

// create window/context, make current
GLCapabilities caps = GL.createCapabilities();

if (caps.OpenGL30) {
    GL30.glBindVertexArray(vao);
} else if (caps.GL_ARB_vertex_array_object) {
    ARBVertexArrayObject.glBindVertexArray(vao);
} else {
    throw new UnsupportedOperationException("Vertex array objects are not available in this context!");
}


That's the general idea, of course; the above is usually abstracted away in the renderer implementation.

At a higher level, you need to have a minimum requirement for the base OpenGL version. Say, you want your game to run on OpenGL 2.1 or higher, or OpenGL 3.2 core or higher, etc. So, you use the corresponding hints when creating the GLFW window/context. If context creation fails, you abort and display a useful message to the user. If it succeeds, you can use all functionality that is available up to that version, unconditionally. For anything else, you have to guard it with GLCapabilities flags and think about fallbacks that must be implemented when a particular feature is not available.
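
For example, requesting a minimum of OpenGL 3.2 core could look like this (a sketch; the hints must be set before glfwCreateWindow, and the size/title are placeholders):

glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 2);
glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
glfwWindowHint(GLFW_OPENGL_FORWARD_COMPAT, GLFW_TRUE); // required for core contexts on macOS

long window = glfwCreateWindow(800, 600, "Demo", NULL, NULL);
if (window == NULL)
    throw new RuntimeException("This system does not support OpenGL 3.2 core");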

smokethepot

Thanks, that's great. My caps.OpenGL30 check came back as true. That means that GL30.glGenVertexArrays() should work on my computer, right? Then why does it fail? Do you have any leads from the source code I attached, or am I missing something? Thanks for all the help so far.

I have another question. If caps.OpenGL30 returns true, does that directly mean that my GPU supports and can run OpenGL 3.0?

spasi

Quote from: smokethepot on May 12, 2017, 00:52:18
That means that GL30.glGenVertexArrays() should work on my computer, right? Then why does it fail?

So far you have not produced evidence that this is the function that fails. Generally, this is how you should approach OpenGL debugging:

- Run your program with -Dorg.lwjgl.util.Debug=true
- Use glfwWindowHint(GLFW_OPENGL_DEBUG_CONTEXT, GLFW_TRUE)
- Call GLUtil.setupDebugMessageCallback() after GL.createCapabilities()

This will ensure you're seeing OpenGL errors as soon as possible. You may have a bug early in your program that triggers a crash at a later time. If that doesn't help, try to simplify your code until you can pinpoint the exact issue.
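
Putting those steps together, a sketch (window creation elided; debugProc is just a name chosen here):

// 1) launch the JVM with: -Dorg.lwjgl.util.Debug=true

// 2) before glfwCreateWindow:
glfwWindowHint(GLFW_OPENGL_DEBUG_CONTEXT, GLFW_TRUE);

// 3) after glfwMakeContextCurrent(window):
GL.createCapabilities();
Callback debugProc = GLUtil.setupDebugMessageCallback(); // prints GL errors as they occur

// on shutdown (it is null if no debug output facility is available):
if (debugProc != null)
    debugProc.free();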

Quote from: smokethepot on May 12, 2017, 00:52:18
Do you have any leads from the source code I attached, or am I missing something?

Sorry, but I have a personal rule not to read code posted on the forum. It's not that I don't want to help, but I have limited time and it has been very time-consuming in the past. Most users have trouble with correct API usage, not with LWJGL itself. There are enough resources available (OpenGL tutorials, OpenGL forums, etc.) for such users to get help. There are very nice people who frequent this forum and may help, but keep in mind that teaching users OpenGL or Vulkan or rendering techniques is not a goal of LWJGL.

With that said, I don't mind running simple programs that reproduce an issue; the problem is usually found much more quickly that way. The source of a standalone class that I can simply copy/paste and run, without any dependencies other than LWJGL itself, is best.
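
For illustration only (a sketch, not code from this thread; the class name is made up), such a reproducer for the crash discussed here could look like:

import org.lwjgl.opengl.GL;

import static org.lwjgl.glfw.GLFW.*;
import static org.lwjgl.opengl.GL30.*;
import static org.lwjgl.system.MemoryUtil.NULL;

public class VAORepro {
    public static void main(String[] args) {
        if (!glfwInit())
            throw new IllegalStateException("Unable to initialise GLFW");
        glfwWindowHint(GLFW_VISIBLE, GLFW_FALSE); // the window never needs to be shown

        long window = glfwCreateWindow(640, 480, "repro", NULL, NULL);
        if (window == NULL)
            throw new RuntimeException("Failed to create window");

        glfwMakeContextCurrent(window);
        GL.createCapabilities();

        int vao = glGenVertexArrays(); // the call under suspicion
        System.out.println("VAO id: " + vao);

        glfwDestroyWindow(window);
        glfwTerminate();
    }
}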

Quote from: smokethepot on May 12, 2017, 00:52:18
If caps.OpenGL30 returns true, does that directly mean that my GPU supports and can run OpenGL 3.0?

Yes.

smokethepot

Hi again.
Quote from: spasi on May 12, 2017, 06:55:46
So far you have not produced evidence that this is the function that fails.

To find out where the problem was, I simply followed the natural flow of the application and added a print statement after every other line. If a print statement executed, the line preceding it had not crashed. The program runs fine until it reaches the GL30.glGenVertexArrays() line; any print statement before it runs fine, but the same statement after that line prints nothing.

As far as the rule about not reading source code goes, I'm completely respectful of your time.

Quote from: spasi on May 12, 2017, 06:55:46
Generally, this is how you should approach OpenGL debugging:

- Run your program with -Dorg.lwjgl.util.Debug=true
- Use glfwWindowHint(GLFW_OPENGL_DEBUG_CONTEXT, GLFW_TRUE)
- Call GLUtil.setupDebugMessageCallback() after GL.createCapabilities()

I have used this approach and it comes up with the following message:

[LWJGL] A function that is not available in the current context was called. The JVM will abort execution. Inspect the crash log to find the responsible Java frames.


and it crashes the same way as before, with the following fatal error message:

#
# A fatal error has been detected by the Java Runtime Environment:
#
#  EXCEPTION_ACCESS_VIOLATION (0xc0000005) at pc=0x00007ff9c775248d, pid=5316, tid=0x00000000000006c8
#
# JRE version: Java(TM) SE Runtime Environment (8.0_131-b11) (build 1.8.0_131-b11)
# Java VM: Java HotSpot(TM) 64-Bit Server VM (25.131-b11 mixed mode windows-amd64 compressed oops)
# Problematic frame:
# C  [lwjgl.dll+0x1248d]
#
# Failed to write core dump. Minidumps are not enabled by default on client versions of Windows
#
# An error report file with more information is saved as:
# C:\Users\smoke\Desktop\Java\LWJGL Engine\hs_err_pid5316.log
#
# If you would like to submit a bug report, please visit:
#   http://bugreport.java.com/bugreport/crash.jsp
# The crash happened outside the Java Virtual Machine in native code.
# See problematic frame for where to report the bug.
#

Kai

Did you request a GL >= 3.2 core context? If so, then immediate mode GL functions, such as glBegin/glEnd, are not available.
Please read the OpenGL 3.2 Core Profile specification (or the specification for whichever OpenGL context version you are requesting with GLFW).
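
If you are unsure what the driver actually gave you, you can query the created context through GLFW; a sketch, assuming your window handle:

int major   = glfwGetWindowAttrib(window, GLFW_CONTEXT_VERSION_MAJOR);
int minor   = glfwGetWindowAttrib(window, GLFW_CONTEXT_VERSION_MINOR);
int profile = glfwGetWindowAttrib(window, GLFW_OPENGL_PROFILE);
System.out.println("Got GL " + major + "." + minor
        + (profile == GLFW_OPENGL_CORE_PROFILE ? " core" : ""));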

smokethepot

Yes, I was requesting OpenGL 3.2 core and was also calling immediate-mode GL functions, but only because the programmable pipeline wasn't working and was producing the error I originally posted. I have removed those fixed-function calls since then, but I'm still getting the same errors.

I will go over the specification you mentioned. Thanks for that.

I am following this tutorial (https://www.youtube.com/watch?v=WMiggUPst-Q) to create a rectangle on my screen. He's using LWJGL 2.x, but I'm not using any of his window-creation code; that comes straight from the LWJGL guide (https://www.lwjgl.org/guide). Could this be creating any conflicts?

smokethepot

Thank you everybody for everything. I seem to have fixed the problem on my own; it was an extremely stupid oversight on my part. Thanks for all the perseverance you've shown with my question. I will definitely stay active on this site.

mudlee

Spasi, it's a little bit off-topic, but do you have any documentation about the function addresses? It would be great to be able to use a table like that to parse siginfos :)

Quote from: spasi on May 09, 2017, 19:22:59
Based on the log you attached, your program crashes when it calls glGetString, which is not present in the code above. Maybe you are calling it before the init method? Btw, from the log:

siginfo: ExceptionCode=0xc0000005, reading address 0x00000000000015e0


0x15e0 is the offset of glGetString in the function address buffer that LWJGL uses when a context is made current: glGetString is function number 700, and 700 * 8 bytes per pointer on x64 = 5600 = 0x15E0.

Quote from: smokethepot on May 09, 2017, 16:51:52
I'm calling GL.createCapabilities() at the beginning of every frame

GL.createCapabilities() is an expensive method and should only be called when a new context is created (and made current with glfwMakeContextCurrent). After that, the returned GLCapabilities instance should be cached and reused with GL.setCapabilities(). But generally you do not need to do this unless you're dealing with multiple contexts and/or multiple threads.