Display.getAvailableDisplayModes() sporadic on Ubuntu?

Started by Shad, May 10, 2013, 20:06:37

Previous topic - Next topic

Shad

Apologies to any of you who watch StackOverflow.com, since I've posted this question there as well (and gotten little response).  http://stackoverflow.com/questions/16452784/java-and-libgdx-lwjgl-game-fullscreen-wrong-size-for-multiple-monitors-on-ubun.

I've searched through this forum for similar issues, but didn't find any that really fit what I'm seeing.

On Ubuntu 12.04 LTS with OpenJDK 6, on a desktop with two displays (1920x1080 and 1280x1024) and an nVidia GTS 450 video card, libGDX's Gdx.graphics.getDisplayModes() call into LWJGL's Display.getAvailableDisplayModes() returns what I thought was supposed to be a complete list of the available display modes, as reported by the JRE's hardware interface (or other native code).

Instead of getting a consistent list, I find that from one execution of my game.jar file to the next, my preferred resolution is sometimes present, sometimes not. Interestingly, I always get 43 modes back from the call, but the modes themselves are different.

On repeated run cycles (running the game.jar file, closing the game, then doing it all again) on the same box, 1920x1080 may be in the list of modes, even multiple times with different refresh rates, or it may not be present at all...

What I'm after is to get back a consistently good resolution/refresh rate to go full-screen with.  
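
In case it helps with diagnosis, here's roughly the kind of dump I've been running between launches to compare the lists (a minimal sketch against LWJGL 2's Display API; treat it as illustrative rather than exactly what's in my game):

    import org.lwjgl.LWJGLException;
    import org.lwjgl.opengl.Display;
    import org.lwjgl.opengl.DisplayMode;

    public class ModeDump {
        public static void main(String[] args) throws LWJGLException {
            // Print the desktop mode plus every mode LWJGL reports,
            // so the output of successive runs can be diffed.
            System.out.println("Desktop mode: " + Display.getDesktopDisplayMode());
            for (DisplayMode mode : Display.getAvailableDisplayModes()) {
                System.out.println(mode.getWidth() + "x" + mode.getHeight()
                        + " @" + mode.getFrequency() + "Hz, "
                        + mode.getBitsPerPixel() + "bpp, fullscreen-capable="
                        + mode.isFullscreenCapable());
            }
        }
    }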


How I got here:

My original code, which worked fine full-screen on a Windows 7 box with two monitors, and an Ubuntu laptop with single monitor, choked on the above-mentioned Ubuntu box with two monitors.  As I mentioned in the StackOverflow posting, when I would start my app with the following code (libGdx calls):
    import java.awt.Dimension;
    import java.awt.Toolkit;
    import com.badlogic.gdx.backends.lwjgl.LwjglApplication;
    import com.badlogic.gdx.backends.lwjgl.LwjglApplicationConfiguration;

    LwjglApplicationConfiguration cfg = new LwjglApplicationConfiguration();
    // getScreenSize() turns out to span the whole virtual desktop on this box
    Dimension screenDimension = Toolkit.getDefaultToolkit().getScreenSize();
    cfg.width = screenDimension.width;
    cfg.height = screenDimension.height;
    cfg.fullscreen = true;
    new LwjglApplication(new Game1(), cfg);


this results in a width/height of 3200x1080 being selected (both monitors added together), but the game then goes full-screen on only the 1920x1080 screen, so there's a scaling problem: libGDX thinks it's dealing with 3200x1080, but it only has 1920x1080 of real estate to work with.
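
One thing I've since noticed: Toolkit.getScreenSize() reports the combined virtual desktop, while the default GraphicsDevice should report just the primary monitor. Something like the following (plain AWT; a sketch I haven't fully verified on the TwinView box yet, and cfg is the same LwjglApplicationConfiguration as above) would give the single-monitor size instead:

    import java.awt.GraphicsDevice;
    import java.awt.GraphicsEnvironment;

    // Query the primary monitor only, rather than the combined virtual desktop.
    GraphicsDevice primary = GraphicsEnvironment.getLocalGraphicsEnvironment()
            .getDefaultScreenDevice();
    java.awt.DisplayMode current = primary.getDisplayMode();
    cfg.width = current.getWidth();   // 1920 on this box, rather than 3200
    cfg.height = current.getHeight(); // 1080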

So that's why I started looking into how to explicitly select an appropriate display mode, and that's when I discovered the inconsistency in the available display modes returned by Display.getAvailableDisplayModes() (via libGDX's wrapper call).
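
For reference, the selection logic I've been experimenting with looks roughly like this, using the libGDX wrapper (a sketch only; getDesktopDisplayMode()/setDisplayMode() are the names in the libGDX version I'm on, and of course the whole point is that the mode list itself isn't stable):

    import com.badlogic.gdx.Gdx;
    import com.badlogic.gdx.Graphics.DisplayMode;

    // Prefer a listed mode that matches the desktop resolution, taking the
    // highest refresh rate; fall back to the desktop mode itself if none match.
    DisplayMode desktop = Gdx.graphics.getDesktopDisplayMode();
    DisplayMode chosen = desktop;
    for (DisplayMode mode : Gdx.graphics.getDisplayModes()) {
        if (mode.width == desktop.width && mode.height == desktop.height
                && mode.refreshRate >= chosen.refreshRate) {
            chosen = mode;
        }
    }
    Gdx.graphics.setDisplayMode(chosen);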

Any help appreciated.  If I'm going about this all the wrong way, please enlighten me!

Thanks,
Shad

Cornix

I once had a similar problem on my Windows 7 machine.
I went to my graphics card's properties and reset all settings to default, and the problem was gone forever.
Might it be that you changed something in your settings that could be the cause of this?

Shad

That's a thought, but it's a fairly new install and I haven't changed any settings.

The same problem happens on a friend's similarly configured Ubuntu machine (same video card, but he has two 1920x1080 monitors) when he tries the game. I would suspect the nVidia driver, but I suspect there are enough users of it that it would've popped up as a problem before (though maybe not, since it's on Linux...).
Thanks for the response.

Shad

*bump*
I'm just trying to figure out if I'm doing something wrong, or if I'm fighting an environment problem.

So, can someone at least confirm whether my expectation is valid: that the call to get the display modes should always return the same list between successive runs on the same hardware (i.e. no changes to the system between runs)?

Is anyone else using Ubuntu with an nVidia GPU running two monitors? I'd love to hear whether you see these issues when running full-screen LWJGL apps (or, even better, full-screen libGDX apps).

Thanks,
Shad