[SOLVED] Detecting max. supported anti-aliasing samples

Started by Wasserleiche, July 25, 2011, 12:32:23


Wasserleiche

Hi

I want to automatically detect the maximum supported AA samples on the machine my game runs on. I searched here and found the suggestion to create a Pbuffer and check whether its constructor throws an exception. Here is my code:
// bpp = 32, alpha = 0, depth = 8, stencil = 4, samples = 0
PixelFormat format = new PixelFormat(32, 0, 8, 4, 0);
Pbuffer buffer = new Pbuffer(width, height, format, null);


This is executed before any window is created (I'm using JME2 and the latest LWJGL version, btw.). Here the samples parameter (the last constructor argument) is set to zero and it runs fine. But if I set it to any other value, such as 2, I always get this:
org.lwjgl.LWJGLException: Failed to find ARB pixel format 1 0

	at org.lwjgl.opengl.WindowsPbufferPeerInfo.nCreate(Native Method)
	at org.lwjgl.opengl.WindowsPbufferPeerInfo.<init>(WindowsPbufferPeerInfo.java:47)
	at org.lwjgl.opengl.WindowsDisplay.createPbuffer(WindowsDisplay.java:619)
	at org.lwjgl.opengl.Pbuffer.createPbuffer(Pbuffer.java:234)
	at org.lwjgl.opengl.Pbuffer.<init>(Pbuffer.java:219)
	at org.lwjgl.opengl.Pbuffer.<init>(Pbuffer.java:190)
	at org.lwjgl.opengl.Pbuffer.<init>(Pbuffer.java:166)

I know for certain that my graphics card (GeForce 9600 GT) supports AA up to 8x, so what is going wrong here?
If I set AA to 8 and let JME2 use it, it works fine.
Is there perhaps another way to find out the maximum supported AA sample count?

spasi

Not sure what's happening here; probably driver weirdness. Try a more standard PixelFormat, like (32, 0, 24, 8, samples), i.e. a 24-bit depth buffer and an 8-bit stencil buffer.
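For reference, a rough sketch of that brute-force probing approach (create a Pbuffer per sample count and catch the exception); this is untested, and probeMaxSamples is just an illustrative name:

import org.lwjgl.LWJGLException;
import org.lwjgl.opengl.Pbuffer;
import org.lwjgl.opengl.PixelFormat;

// Tries decreasing sample counts until Pbuffer creation succeeds.
static int probeMaxSamples() {
    for (int samples = 16; samples > 0; samples >>= 1) {
        try {
            // 32 bpp, no alpha, 24-bit depth, 8-bit stencil, `samples` MSAA samples
            PixelFormat format = new PixelFormat(32, 0, 24, 8, samples);
            Pbuffer pb = new Pbuffer(8, 8, format, null);
            pb.destroy();
            return samples; // creation succeeded at this sample count
        } catch (LWJGLException e) {
            // not supported at this count, try the next lower one
        }
    }
    return 0; // no multisampled format available
}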

An alternative would be to create a single-sampled context and query GL_MAX_SAMPLES (requires GL 3.0+, ARB_framebuffer_object, or EXT_framebuffer_multisample).

Wasserleiche

Thanks for your response.
I tried the different PixelFormat, but it's the same story there.
Could you explain how to implement your second suggestion? I'm fairly new to OpenGL programming, so a quick guide would help me a lot :)

Another short question: how can I check whether anti-aliasing is supported by the graphics card at all? I understand that GLContext.getCapabilities().GL_ARB_multisample holds this information once you've created a Display, but how can you determine it before starting the game? That is where I want to do this check.

spasi

The problem with OpenGL is that you can't query the context capabilities before actually creating the context. So, the only way is to create a dummy context, query, destroy the dummy context, then create the real context you want to do rendering with. This is what actually happens internally in LWJGL whenever you do Display.create or new Pbuffer.

So, with that in mind, querying GL_MAX_SAMPLES is as simple as:

Pbuffer pb = new Pbuffer(...whatever works...);
pb.makeCurrent(); // the Pbuffer's context becomes the dummy context
int maxSamples = glGetInteger(GL_MAX_SAMPLES); // GL11.glGetInteger, GL30.GL_MAX_SAMPLES
pb.destroy();
// now you can do Display.create with a proper multisampled PixelFormat

Wasserleiche

Wow thanks, this works great!
For anyone who wants to see the complete code:
// bpp = 32, alpha = 0, depth = 24, stencil = 8, samples = 0 (single-sampled dummy context)
PixelFormat format = new PixelFormat(32, 0, 24, 8, 0);
Pbuffer pb = new Pbuffer(width, height, format, null);
pb.makeCurrent();
// check whether multisampling is supported at all before querying GL_MAX_SAMPLES
boolean supported = GLContext.getCapabilities().GL_ARB_multisample;
if (supported) MAX_SAMPLES = GL11.glGetInteger(GL30.GL_MAX_SAMPLES);
pb.destroy();


Here I first check whether AA is supported at all. MAX_SAMPLES is a global variable I use; if AA isn't supported, it stays at its default of zero.
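To then create the actual window with that sample count in plain LWJGL (JME2 does this internally), something like the following should work; a minimal sketch, where the 800x600 display mode is just an example and both calls throw LWJGLException:

import org.lwjgl.opengl.Display;
import org.lwjgl.opengl.DisplayMode;
import org.lwjgl.opengl.PixelFormat;

// create the real, multisampled context with the detected sample count
Display.setDisplayMode(new DisplayMode(800, 600));
Display.create(new PixelFormat(32, 0, 24, 8, MAX_SAMPLES));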
Thanks for your help!