I strongly suspect this is a bug, most likely in GLFW itself. I have filed an issue there as well: https://github.com/glfw/glfw/issues/1800
However, my thinking is it doesn't hurt to ask around here too.
I'm rendering a bunch of stuff to an FBO, and when I'm done I blit to the default framebuffer.
This works smoothly, except for some Mac users, where the behavior seems a bit erratic.
The problem is that glfwGetFramebufferSize does not give me the correct framebuffer size; the values are doubled.
glBlitFramebuffer(0, 0, offlineW, offlineH, 0, 0, defaultFBW, defaultFBH, GL_COLOR_BUFFER_BIT, GL_NEAREST);
where defaultFBW and defaultFBH are the values from glfwGetFramebufferSize.
On my iMac, listed as (Retina 4K, 21.5-inch, 2017), the display is stated to be 4096x2304.
glfwGetVideoModes(glfwGetPrimaryMonitor()) returns resolutions from 640x480 up to 4096x2304.
If I create a windowed window at any resolution, the framebuffer has the same size as the window, and the blit works as expected.
If I create a full screen window, glfwGetFramebufferSize gives me double the numbers: window = 4096x2304 gives FB = 8192x4608, window = 2304x1296 gives FB = 4096x2304, etc. The result of the blit is that the lower-left quadrant of the offline FBO covers the whole screen, which is exactly what you'd expect if the FB size were doubled.
I can solve this by: glfwWindowHint(GLFW_COCOA_RETINA_FRAMEBUFFER, GL11.GL_FALSE);
Sadly, though, this doesn't fix it for some other Mac users.
I can also solve it by dividing the returned framebuffer size by 2, or by the values from glfwGetWindowSize, but again this doesn't work on all Macs; in fact, it has the reverse effect on some.
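For what it's worth, the ratio between the framebuffer size and the window size seems to be what makes this so hard to patch generically: on a correctly behaving Retina setup that ratio is legitimately 2, so an unconditional divide-by-2 breaks exactly the machines that were fine. A minimal sketch of computing that ratio (content_scale is a hypothetical helper of mine, not a GLFW function):

```c
#include <assert.h>

/* Hypothetical helper: the integer scale between the framebuffer size
 * reported by glfwGetFramebufferSize and the window size reported by
 * glfwGetWindowSize. On macOS this is normally 1 (non-Retina, or with
 * GLFW_COCOA_RETINA_FRAMEBUFFER disabled) or 2 (Retina). The trouble is
 * that a buggy doubled framebuffer on one machine and a correct Retina
 * framebuffer on another can both produce the same ratio, so the value
 * alone doesn't tell you whether dividing is safe. */
static int content_scale(int fbSize, int windowSize)
{
    return windowSize > 0 ? fbSize / windowSize : 0;
}

```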
So, I'm thinking, as a layman...
Should glfwGetVideoModes(glfwGetPrimaryMonitor()) really return display modes larger than 2304x1296 for me?
Maybe it should, but I'm still getting framebuffer sizes that are too big. The 2x factor is consistent for every full screen resolution I pick.
I'm also interested in whether there is a potential workaround. I've tried to figure out how to query the framebuffer size myself via OpenGL, but there doesn't seem to be any method for it.
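One possible angle, assuming I'm reading the spec right: the OpenGL spec says the initial viewport of a freshly created context is set to the dimensions of the window it renders into, so querying GL_VIEWPORT right after glfwMakeContextCurrent, before any glViewport call of your own, should report the true default-framebuffer size. A sketch of the idea (the helper below is mine, and the viewport array in the comment is assumed to come from that early query):

```c
#include <assert.h>

/* Sketch: immediately after glfwMakeContextCurrent(window), and BEFORE
 * any glViewport() call of your own, the GL spec initializes the viewport
 * to the default framebuffer's dimensions. Something like:
 *
 *     GLint vp[4];
 *     glGetIntegerv(GL_VIEWPORT, vp);
 *
 * would fill vp with {x, y, width, height}. This helper just pulls the
 * size out of such an array so it can be compared against what
 * glfwGetFramebufferSize reports. */
static void default_fb_size_from_viewport(const int vp[4], int *w, int *h)
{
    *w = vp[2];
    *h = vp[3];
}

```

If the two disagree, the viewport-derived size would be the one to feed into glBlitFramebuffer.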