[FIXED] add OpenGL 3.2 Core support for Mac OS X Lion

Started by void256, July 22, 2011, 21:41:30

Previous topic - Next topic

void256

Apple finally woke up and allows at least OpenGL 3.2 Core Profile on Mac OS X Lion!  :D

It must be explicitly enabled when creating the NSOpenGLPixelFormat using the following two NSOpenGLPixelFormatAttributes:

NSOpenGLPFAOpenGLProfile, NSOpenGLProfileVersion3_2Core


I was able to confirm that this works on my Mac Pro with an ATI Radeon HD 4870 using the following code:

NSOpenGLPixelFormatAttribute attributes[] = {
  NSOpenGLPFAOpenGLProfile, NSOpenGLProfileVersion3_2Core, // enable OpenGL 3.2 Core
  NSOpenGLPFADoubleBuffer,                                 // double buffered
  NSOpenGLPFADepthSize, (NSOpenGLPixelFormatAttribute)16,  // 16-bit depth buffer
  (NSOpenGLPixelFormatAttribute)nil
};
return [[[NSOpenGLPixelFormat alloc] initWithAttributes:attributes] autorelease];


Without the Core Profile I get:

GL_VERSION = "2.1 ATI-7.2.9"
GL_SHADING_LANGUAGE_VERSION = "1.20"
GL_RENDERER = "ATI Radeon HD 4870 OpenGL Engine"


and with the Core Profile enabled:

GL_VERSION = "3.2 ATI-7.2.9"
GL_SHADING_LANGUAGE_VERSION = "1.50"
GL_RENDERER = "ATI Radeon HD 4870 OpenGL Engine"


;D

LWJGL should support this as well.

When the Core Profile is requested from LWJGL, the two magic NSOpenGLPixelFormatAttributes should be added somewhere in http://java-game-lib.svn.sourceforge.net/viewvc/java-game-lib/trunk/LWJGL/src/native/macosx/context.m?revision=3598&view=markup.

The only problem I see so far is that this would only work on Mac OS X 10.7, so there would be an additional #ifdef:

#if MAC_OS_X_VERSION_MAX_ALLOWED >= MAC_OS_X_VERSION_10_7


necessary to compile it correctly.

Would that be possible? Please? Pretty Please?  ;)

spasi

This is unfortunate. They made it a PixelFormat attribute instead of something similar to ARB_create_context. It will require some hackery to add support for this.

Estraven

Hi,

I can't tell you how happy I am to see someone actually getting an OpenGL 3.2 Core context enabled on Lion. Thank you, void256!
I would definitely be happier if LWJGL could enable it.  ;D

As void256 said: "Would that be possible? Please? Pretty please?"  ;)

Estraven

kappa

Quote from: spasi on July 22, 2011, 23:00:25
This is unfortunate. They made it a PixelFormat attribute instead of something similar to ARB_create_context. It will require some hackery to add support for this.

Yeah, agreed. It also means that the nightly build server will need to be updated to OS X Lion 10.7 to use the new APIs; poor old Endolf just recently finished updating it to 10.6. This also sets a silly trend of requiring new APIs for future OpenGL releases, which will again require the nightly server to be updated.

I do wonder, though: if the new APIs are just constants, then we can just plug the values in without requiring dependencies on the new APIs.

spasi

Afaict adding the constants would be enough. Sorry that I can't be of more help, I don't have access to (or experience with) a Mac.

Estraven


I've got a Mac (NVIDIA 330M) updated to Lion, but I've never compiled LWJGL.
Is there any doc on how to compile the LWJGL natives on Mac OS X?

Would any of you be available to help me through this?
Otherwise, I can do tests if you compile with the new required constants (I guess you can compile on 10.6 and I'll test on 10.7)

Estraven

kappa

It's not difficult to compile on Mac: just grab the Xcode SDK from the Apple site (the Xcode 3 one is free). Once it's downloaded and installed, run Eclipse, check LWJGL out from SVN, and run the Ant build.xml; it should compile LWJGL for you.

Quote from: Estraven on August 01, 2011, 14:47:49
Would any of you be available to help me through this?
Otherwise, I can do tests if you compile with the new required constants (I guess you can compile on 10.6 and I'll test on 10.7)

What would help is if you could get the values of the new constants (NSOpenGLPFAOpenGLProfile, NSOpenGLProfileVersion3_2Core) from the *.h header files.

@Spasi - as I understand it, OpenGL 3.2 needs to be requested explicitly before you can use it, but since this is a PixelFormat attribute, how would users enable it? Do we need to add some new Mac-specific API to LWJGL or something?

Estraven

OK, I'm just having trouble with Xcode (I had to reinstall it after the Lion upgrade, and something went wrong...).

I'll keep you posted.

Estraven

spasi

@kappa: Yes, it has to be explicit. There's only support for the core profile so LWJGL cannot automatically enable it, even if it detected 10.7+.

I think the cleanest option API-wise would be to continue using ContextAttribs. We should require users to use new ContextAttribs(3, 2).withProfileCore(true). When LWJGL sees this on a Mac, it should enable a code path that passes the right value to the PixelFormat query. We should also add checks for any other usage patterns (withDebug, withProfileCompatibility, etc) and throw an exception on Macs. We may allow withForwardCompatible though, the pseudo-fc mode should work fine on a Mac.

Estraven

@Spasi: For what it's worth, I agree with what you described.

Concerning the jnilib modification: I finally succeeded in recompiling the library.

I had some trouble and had to modify the Ant build.xml for the Mac OS native lib (to match the Xcode 4.1 libs).
Anyway, I then added this to the choosePixelFormat method:

    
    FILE *f = fopen("temp.txt", "w");
    fprintf(f, "%d %d", NSOpenGLPFAOpenGLProfile, NSOpenGLProfileVersion3_2Core);
    fclose(f);


Sorry for the old & crappy C code... it gives me: 99 12800

I then tried to add:

putAttrib(&attribs, NSOpenGLPFAOpenGLProfile);
putAttrib(&attribs, NSOpenGLProfileVersion3_2Core);


to the attributes list...

but when I run my project in Eclipse using it, I get:

2011-08-01 22:23:07.908 java[5893:10403] invalid pixel format attribute

What's next ? ;)


Estraven

Hum:

I brutally replaced the choosePixelFormat method with this code:

Quote
    NSOpenGLPixelFormatAttribute attributes[] = {
        NSOpenGLPFAOpenGLProfile, NSOpenGLProfileVersion3_2Core,
        NSOpenGLPFADoubleBuffer,
        NSOpenGLPFADepthSize, (NSOpenGLPixelFormatAttribute)16,
        (NSOpenGLPixelFormatAttribute)nil
    };
    NSOpenGLPixelFormat *fmt = [[NSOpenGLPixelFormat alloc] initWithAttributes:attributes];

    if (fmt == nil) {
        throwException(env, "Could not create pixel format");
        return NULL;
    }
    return fmt;

GOT THE OPENGL 3.2 CORE CONTEXT =) =)

   

Estraven

Hum... except that I can't get anything rendered. The GLCanvas gets the color defined by glClearColor, but no geometry is rendered...

I'll try to find a correct pixel format initialisation. If you have any suggestions...

Estraven

Estraven

Very, very strange... I have an OpenGL 3.2-compliant performance test program.
It's supposed to run at 50 FPS on a 330M (as it does on Windows 7).

It turns out it does run at 60 FPS, and when mouse grabbing (I do some additional offscreen rendering) it decreases to 45 FPS (as it does on Windows...).
This suggests that the geometry is processed correctly by the shaders; I just don't see anything on screen...

Shader compilation logs and GPU loading logs are almost identical (Windows vs. Mac OS).
I just get an additional warning on Mac when validating the shaders (Warning: no vertex array object bound.)

Regarding the context.m file, I ended up with this:

Quote

NSOpenGLPixelFormat *choosePixelFormat(JNIEnv *env, jobject pixel_format, bool use_display_bpp, bool support_window, bool support_pbuffer, bool double_buffered) {
   
   
    int bpp;
    jclass cls_pixel_format = (*env)->GetObjectClass(env, pixel_format);
    //if (use_display_bpp)
    //    bpp = CGDisplayBitsPerPixel(kCGDirectMainDisplay);
    //else
        bpp = (int)(*env)->GetIntField(env, pixel_format, (*env)->GetFieldID(env, cls_pixel_format, "bpp", "I"));
   
    int alpha = (int)(*env)->GetIntField(env, pixel_format, (*env)->GetFieldID(env, cls_pixel_format, "alpha", "I"));
    int depth = (int)(*env)->GetIntField(env, pixel_format, (*env)->GetFieldID(env, cls_pixel_format, "depth", "I"));
    int stencil = (int)(*env)->GetIntField(env, pixel_format, (*env)->GetFieldID(env, cls_pixel_format, "stencil", "I"));
    int samples = (int)(*env)->GetIntField(env, pixel_format, (*env)->GetFieldID(env, cls_pixel_format, "samples", "I"));
    int num_aux_buffers = (int)(*env)->GetIntField(env, pixel_format, (*env)->GetFieldID(env, cls_pixel_format, "num_aux_buffers", "I"));
    int accum_bpp = (int)(*env)->GetIntField(env, pixel_format, (*env)->GetFieldID(env, cls_pixel_format, "accum_bpp", "I"));
    int accum_alpha = (int)(*env)->GetIntField(env, pixel_format, (*env)->GetFieldID(env, cls_pixel_format, "accum_alpha", "I"));
    bool stereo = (bool)(*env)->GetBooleanField(env, pixel_format, (*env)->GetFieldID(env, cls_pixel_format, "stereo", "Z"));
    bool floating_point = (bool)(*env)->GetBooleanField(env, pixel_format, (*env)->GetFieldID(env, cls_pixel_format, "floating_point", "Z"));
    // TODO: Add floating_point_packed attribute below
       bool floating_point_packed = (bool)(*env)->GetBooleanField(env, pixel_format, (*env)->GetFieldID(env, cls_pixel_format, "floating_point_packed", "Z"));
    // TODO: Add sRGB attribute below
       bool sRGB = (bool)(*env)->GetBooleanField(env, pixel_format, (*env)->GetFieldID(env, cls_pixel_format, "sRGB", "Z"));

    attrib_list_t attribs;
    jboolean allow_software_acceleration = getBooleanProperty(env, "org.lwjgl.opengl.Display.allowSoftwareOpenGL");
    initAttribList(&attribs);
   
    putAttrib(&attribs, NSOpenGLPFAOpenGLProfile);      // ADDED LINE
    putAttrib(&attribs, NSOpenGLProfileVersion3_2Core); // ADDED LINE

    if (support_window) // DISPLACED HERE -
        putAttrib(&attribs, NSOpenGLPFAWindow);
   
   
    if (support_pbuffer)  // DISPLACED HERE -
        putAttrib(&attribs, NSOpenGLPFAPixelBuffer);

   
    if (!allow_software_acceleration)
        putAttrib(&attribs, NSOpenGLPFAAccelerated);
    if (double_buffered)
        putAttrib(&attribs, NSOpenGLPFADoubleBuffer);
    putAttrib(&attribs, NSOpenGLPFAMinimumPolicy);
   
    putAttrib(&attribs, NSOpenGLPFAColorSize); putAttrib(&attribs, bpp);
    putAttrib(&attribs, NSOpenGLPFAAlphaSize); putAttrib(&attribs, alpha);
    putAttrib(&attribs, NSOpenGLPFADepthSize); putAttrib(&attribs, depth);
    putAttrib(&attribs, NSOpenGLPFAStencilSize); putAttrib(&attribs, stencil);
    putAttrib(&attribs, NSOpenGLPFAAccumSize); putAttrib(&attribs, accum_bpp + accum_alpha);
   
    putAttrib(&attribs, NSOpenGLPFASampleBuffers); putAttrib(&attribs, samples > 0 ? 1 : 0);
    putAttrib(&attribs, NSOpenGLPFASamples); putAttrib(&attribs, samples);
    putAttrib(&attribs, NSOpenGLPFAAuxBuffers); putAttrib(&attribs, num_aux_buffers);
   

   
   
    if (stereo)
        putAttrib(&attribs, NSOpenGLPFAStereo);
   
    if (floating_point)
        putAttrib(&attribs, NSOpenGLPFAColorFloat);
   
    putAttrib(&attribs, 0);
   
    NSOpenGLPixelFormat* fmt = [[NSOpenGLPixelFormat alloc] initWithAttributes:(NSOpenGLPixelFormatAttribute *)attribs.attribs];

    if (fmt == nil) {
        throwException(env, "Could not create pixel format");
        return NULL;
    }
    return fmt;
}

Very few modifications.

I can't think of any more tests to do... Please, anyone, help me.

Estraven

Estraven

Oops... my bad. OpenGL 3.2 Core REQUIRES the use of VAOs (not just VBOs...).
I added the VAO instructions, and now it works perfectly!

Thus, the context.m I posted above works on Lion.

I'll be using these tweaked native libs until you come up with a more flexible release of LWJGL.
Meanwhile, if any other Mac OS user wants this lib, I can provide it. I only changed "liblwjgl.jnilib".

Thanks for your help.

Estraven

kappa

Nice work.

Would it be possible for you to get the constant values of both NSOpenGLPFAOpenGLProfile and NSOpenGLProfileVersion3_2Core? They can be found in the *.h file they come from; just open it and copy and paste the two lines that define them. This would allow adding support for them without requiring Lion to compile LWJGL.