Disabling checkGLError()

Started by Rene, August 19, 2009, 22:35:34


Rene

Hi everyone,

While I was profiling my application, I noticed that a large part of the CPU time was spent checking for GL errors. Here's a screenshot of the top of the profiler's list:

[profiler screenshot]
When I look at the stack traces, it turns out that checkGLError(), which calls glGetError(), is invoked after every OpenGL command.

Together, glGetError() and checkGLError() take more than 10% of the CPU time in my test application. That's fine during development, but it would be nice if I could disable the error checking in a distribution build: it would mean a significant performance gain.
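
For reference, this is roughly the kind of switchable check I have in mind, done at the application level. It is only a sketch: the GLDebug class, the CHECK_ERRORS flag and checkError() are placeholder names of my own, not part of LWJGL.

[code]
import org.lwjgl.opengl.GL11;

/** Sketch of an application-side GL error check; names are placeholders, not LWJGL API. */
public final class GLDebug {

    /** Set to false for a release/distribution build to skip the glGetError() calls entirely. */
    public static final boolean CHECK_ERRORS = true;

    private GLDebug() {
    }

    /** Throws if the GL error flag has been set since the last check. */
    public static void checkError(String location) {
        if (!CHECK_ERRORS) {
            return;
        }
        int error = GL11.glGetError();
        if (error != GL11.GL_NO_ERROR) {
            throw new IllegalStateException(
                    "OpenGL error 0x" + Integer.toHexString(error) + " at " + location);
        }
    }
}
[/code]

Something like GLDebug.checkError("after glDrawArrays") sprinkled after the interesting calls would give the same diagnostics during development, while a release build with the flag set to false skips the glGetError() calls completely.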

When searching the forum, I found the following (old) thread, which describes plans for such functionality:
http://lwjgl.org/forum/index.php/topic,369.msg2870.html#msg2870

However, the functionality discussed in that thread isn't available in the current LWJGL version, and I couldn't find anything similar elsewhere in the library. Does anybody know whether it is possible to disable the error checking?

Rene
When I am king, they shall not have bread and shelter only, but also teachings out of books, for a full belly is little worth where the mind is starved - Mark Twain


Rene

Now that was a quick reply :)
I've made two different configurations now, one release and one debug config. Everything works perfectly, and the release config gets a performance gain of about 5%. Thanks for your help.
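
In case it is useful to anyone who finds this thread later: the difference between my two configurations boils down to a flag like the one below. The property name and the wrapper class are my own invention, not an LWJGL feature, and I'm assuming the checkGLError() mentioned above is org.lwjgl.opengl.Util.checkGLError().

[code]
import org.lwjgl.opengl.Util;

public final class GLChecks {

    // Read once at class initialization; the debug configuration launches the JVM
    // with -Dmyapp.debugGL=true, the release configuration leaves it unset.
    private static final boolean DEBUG_GL = Boolean.getBoolean("myapp.debugGL");

    private GLChecks() {
    }

    /** Calls Util.checkGLError() in the debug config; a no-op in the release config. */
    public static void check() {
        if (DEBUG_GL) {
            Util.checkGLError();
        }
    }
}
[/code]

Since DEBUG_GL is a static final boolean, the JIT can treat the disabled branch as dead code, so the check costs essentially nothing in the release configuration.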

Rene
When I am king, they shall not have bread and shelter only, but also teachings out of books, for a full belly is little worth where the mind is starved - Mark Twain