[FIXED] OSX retina support?

Started by krausest, July 07, 2013, 18:16:09

Previous topic - Next topic

krausest

Is it possible to use the retina resolution with LWJGL? (If not with the LWJGL Display, maybe with an AWT or SWT frame?)

Yours,
Stefan

Fool Running

The "retina resolution" is just a high-resolution setting. As long as it's in the available display modes, it shouldn't be a problem.
Programmers will, one day, rule the world... and the world won't notice until it's too late. Just testing the marquee option ;D

krausest

Thanks. You're right - it works that way for full screen apps.
Is there also a way when the app runs in a window?

Fool Running

I'm not sure I completely understand your question. The "resolution" of a windowed application is just the size of the window. As long as the desktop is at the retina resolution, the windowed application can take up most of the desktop when maximized, but it will never be at the full resolution.

krausest

The display of my MacBook Pro reports a width of 1440 points in the "best for retina" display mode, though it actually has twice as many pixels horizontally. If I want to create a window that covers the upper-left quarter of the screen, it has a dimension of 720 x 450 points. If you use e.g. SWT and draw a line into the window, it will be rendered at retina resolution (4 pixels are used for each point of the window); for images, one has to render the image into a target rectangle with half of the image's width and height to use the retina resolution.
Some details about opengl and retina are here: https://developer.apple.com/library/mac/#documentation/GraphicsAnimation/Conceptual/HighResolutionOSX/CapturingScreenContents/CapturingScreenContents.html
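The point-vs-pixel arithmetic above can be sketched as a small helper (plain Java, no SWT; the class and method names are made up for illustration):

```java
public class RetinaRect {
    /**
     * Size in points of the target rectangle that makes an image render
     * at 1 image pixel per device pixel, given the display's scale factor.
     */
    static int targetSizePoints(int imagePixels, float scale) {
        return Math.round(imagePixels / scale);
    }

    public static void main(String[] args) {
        // A 200x100 pixel image on a 2x retina screen should be drawn into
        // a 100x50 point rectangle to use the full backing resolution.
        System.out.println(targetSizePoints(200, 2.0f) + "x" + targetSizePoints(100, 2.0f));
    }
}
```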

krausest

I managed to hack it:
Adding [window_info->view setWantsBestResolutionOpenGLSurface:YES]; in org_lwjgl_opengl_Display.m (line 70, after window_info->view = [[MacOSXOpenGLView alloc] initWithFrame:view_rect pixelFormat:peer_info->pixel_format];) and using twice the width and height for glViewport "retinizes" all OpenGL rendering.

Is there a simpler way?

Fool Running

I don't think there is support for whatever that extension is. However, you can create the same effect by creating the display at the full resolution (say 1440x900) and setting up your viewport for that resolution. When creating your ortho view (with glOrtho), give it half the resolution (say 720x450). That should create the same effect and not require any extra extensions.
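The trick above boils down to a 2:1 ratio between viewport pixels and ortho units. A minimal sketch of the arithmetic (plain Java; the GL calls appear only in comments since they need a live context):

```java
public class OrthoScale {
    /** Device pixels covered by one logical (ortho) unit. */
    static float pixelsPerUnit(int viewportPx, float orthoUnits) {
        return viewportPx / orthoUnits;
    }

    public static void main(String[] args) {
        // glViewport(0, 0, 1440, 900) combined with glOrtho(0, 720, 450, 0, -1, 1):
        // every logical unit covers 2 device pixels, i.e. retina pixel density.
        System.out.println(pixelsPerUnit(1440, 720f)); // horizontal: 2.0
        System.out.println(pixelsPerUnit(900, 450f));  // vertical:   2.0
    }
}
```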

kappa

I've had a look into getting this implemented. High DPI mode can now be switched on using the org.lwjgl.opengl.Display.enableHighDPI switch:

e.g. as a VM parameter using:
-Dorg.lwjgl.opengl.Display.enableHighDPI=true

or in java code before creating the Display:
System.setProperty("org.lwjgl.opengl.Display.enableHighDPI", "true");


However, it seems that the way it's implemented on OS X may not be usable without at least adding some new LWJGL API or making internal modifications to LWJGL. Windows 8 also seems to have taken the same approach with its implementation of high DPI resolutions.

On retina screens, any app that does not explicitly opt into high DPI mode will be scaled up (currently to twice the size of retina apps, i.e. 4 pixels represent one scaled-up pixel) and will work as normal (as it did on non-retina screens). If you do opt into high DPI mode, the OpenGL frame buffer uses the retina resolution, but the mouse position and window sizes continue to use the scaled-up sizes (currently half the size), i.e. Display.getWidth()/Display.getHeight() will return sizes half that of the frame buffer's pixel size. This behaviour is probably intentional, to stop apps from getting smaller at higher resolutions and instead just make them look sharper.

The problem is that methods such as glViewport, glScissor, glReadPixels, glLineWidth, glRenderbufferStorage and glTexImage2D, which often get their pixel information from Display and Mouse, can no longer use those values without them first being scaled.

Possible solutions:

1) Internally scale the Mouse and Display.getWidth()/getHeight() values to match the real retina resolution and OpenGL frame buffer size. This seems easy to do, but all apps not adapted for this behaviour will appear tiny (half their normal size) on retina displays.

2) Catch values sent to all pixel-dependent OpenGL methods and scale them appropriately. This seems ugly to do (with the LWJGL generator), and I'm not sure it'd be workable for stuff like shaders.

3) Push the burden onto developers with some new LWJGL API so they can do the calculations themselves when opting into high DPI mode.
E.g. give them the size of the OpenGL frame buffer with Display.getFrameBufferWidth()/getFrameBufferHeight(), or something like Display.getFrameBufferScaleFactor(), which would return 2.0f when in retina mode (or whatever the DPI ratio is) and 1.0f when not, so devs can do something like
glViewport(0, 0, Display.getWidth() * Display.getFrameBufferScaleFactor(), Display.getHeight() * Display.getFrameBufferScaleFactor())
when passing values to pixel-dependent OpenGL methods.
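The scaling step in option 3 is just a multiply-and-round. A minimal sketch of the helper a developer would end up writing (plain Java; Display.getFrameBufferScaleFactor() is only a proposal at this point, so the scale factor is passed in directly here):

```java
public class DpiScale {
    /** Convert a scaled (point) size from Display/Mouse into frame-buffer pixels. */
    static int toPixels(int points, float scaleFactor) {
        return Math.round(points * scaleFactor);
    }

    public static void main(String[] args) {
        // e.g. Display.getWidth() == 720 with a scale factor of 2.0f on retina:
        // glViewport(0, 0, toPixels(720, 2.0f), toPixels(450, 2.0f));
        System.out.println(toPixels(720, 2.0f) + "x" + toPixels(450, 2.0f));
        // On a non-retina screen the factor is 1.0f and values pass through unchanged.
        System.out.println(toPixels(720, 1.0f));
    }
}
```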

Any ideas on a good direction we can take on this? More info here.

Cornix

I'd say the last option is the cleanest. It gives more power to the individual developer instead of hiding functionality; in my opinion, the worst thing an API can do is perform magic tricks behind the developer's back.
On the other hand, it is probably going to become more complicated for beginners now.

kappa

Added a new LWJGL API, Display.getPixelScaleFactor().

JavaDoc for the method:
Quote/**
        * @return the pixel scale factor of the Display window.
        *
        * This method should be used when running in high DPI mode. In such modes, operating
        * systems will scale the Display window to avoid the window shrinking due to high
        * resolutions. The OpenGL frame buffer will, however, use the higher resolution and
        * not be scaled to match the Display window size.
        *
        * OpenGL methods that require pixel-dependent values, e.g. glViewport, glTexImage2D,
        * glReadPixels, glScissor, glLineWidth, glRenderbufferStorage, etc., can convert the
        * scaled Display and Mouse coordinates to the correct high-resolution values by
        * multiplying them by the pixel scale factor.
        *
        * e.g. Display.getWidth() * Display.getPixelScaleFactor() will return the high DPI
        * width of the OpenGL frame buffer, whereas in non-high-DPI mode Display.getWidth()
        * will be the same as the OpenGL frame buffer width.
        *
        * Where high DPI mode is not available, this method will simply return 1.0f and
        * therefore have no effect on values that are multiplied by it.
        */

example usage:
glViewport(0,0,Display.getWidth()*Display.getPixelScaleFactor(), Display.getHeight()*Display.getPixelScaleFactor())


Remember to opt into high DPI mode with the org.lwjgl.opengl.Display.enableHighDPI switch.
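Putting the pieces together, opt-in plus viewport setup might look like the sketch below. This is illustrative only: the Display and GL calls are commented out because they need the LWJGL natives, and the scale factor is hard-coded to the retina value that Display.getPixelScaleFactor() would return.

```java
public class HighDpiExample {
    public static void main(String[] args) {
        // The switch must be set before the Display is created.
        System.setProperty("org.lwjgl.opengl.Display.enableHighDPI", "true");

        // Display.setDisplayMode(new DisplayMode(720, 450));
        // Display.create();

        // On a retina screen Display.getPixelScaleFactor() returns 2.0f;
        // hard-coded here so the sketch runs without LWJGL.
        float scale = 2.0f;
        int viewportWidth  = Math.round(720 * scale); // Display.getWidth()  * scale
        int viewportHeight = Math.round(450 * scale); // Display.getHeight() * scale

        // GL11.glViewport(0, 0, viewportWidth, viewportHeight);
        System.out.println(viewportWidth + "x" + viewportHeight);
    }
}
```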

Quote from: Cornix on November 11, 2013, 07:12:16
On the other hand, it is probably going to become more complicated for beginners now.
You only need to use the API if you enable high DPI mode; otherwise everything will work as normal.

That should wrap up support for retina resolutions on OS X. The same API can probably be used if we decide to implement high DPI support on Windows (which became available from Windows 8). However, high DPI monitors haven't become mainstream on the PC yet and are pretty rare.

Above can be tested in the latest nightly builds of LWJGL 2.9.1.
