Losing Drawing surface when component is hidden

Started by Aisaaax, February 25, 2020, 03:54:37



Due to the AWT/Swing nature of the project I'm working on, I have to render directly onto an AWT component, in my case a Canvas. Here's the Init2() method of my module, as well as some other methods that give an idea of how I handle rendering.

    public Component canvas;
    private JAWTDrawingSurface ds;
    public long display;
    private long eglDisplay;
    public long drawable;
    private long surface;

    public long context = 0;

    public static final JAWT awt;
    static {
        awt = JAWT.calloc();
        if (!JAWT_GetAWT(awt))
            throw new AssertionError("GetAWT failed");
    }
    synchronized public void lock() throws AWTException {
        int lock = JAWT_DrawingSurface_Lock(ds, ds.Lock());
        if ((lock & JAWT_LOCK_ERROR) != 0)
            throw new AWTException("JAWT_DrawingSurface_Lock() failed");
        locked = true;
    }

    synchronized public void unlock() throws AWTException {
        JAWT_DrawingSurface_Unlock(ds, ds.Unlock());
        locked = false;
    }
    synchronized public void Init2() {
        System.err.println("Window init2() started");
        boolean b = canvas.isVisible();

        try {
                this.ds = JAWT_GetDrawingSurface(canvas, awt.GetDrawingSurface());
                JAWTDrawingSurfaceInfo dsi = JAWT_DrawingSurface_GetDrawingSurfaceInfo(ds, ds.GetDrawingSurfaceInfo());

                JAWTX11DrawingSurfaceInfo dsiWin = JAWTX11DrawingSurfaceInfo.create(dsi.platformInfo());

                int depth = dsiWin.depth();
                this.display = dsiWin.display();
                this.drawable = dsiWin.drawable();

                System.err.printf("EGL Display %d, drawable: %d%n", display, drawable);

                eglDisplay = eglGetDisplay(display);

                EGLCapabilities egl;
                try (MemoryStack stack = stackPush()) {
                    IntBuffer major = stack.mallocInt(1);
                    IntBuffer minor = stack.mallocInt(1);

                    if (!eglInitialize(eglDisplay, major, minor)) {
                        throw new IllegalStateException(String.format("Failed to initialize EGL [0x%X]", eglGetError()));
                    }

                    egl = EGL.createDisplayCapabilities(eglDisplay, major.get(0), minor.get(0));
                }
                System.err.println("EGL caps created");

                // Even an empty attribute list must be terminated with EGL_NONE
                IntBuffer attrib_list = BufferUtils.createIntBuffer(16);
                attrib_list.put(EGL_NONE).flip();

                PointerBuffer fbConfigs = BufferUtils.createPointerBuffer(10);
                IntBuffer numConfigs = BufferUtils.createIntBuffer(1);

                eglChooseConfig(eglDisplay, attrib_list, fbConfigs, numConfigs);

                if (numConfigs.get(0) == 0) {
                    // No framebuffer configurations supported!
                    System.err.println("No supported framebuffer configurations found");
                } else {
                    int num = numConfigs.get(0);
                    System.err.printf("Number of found FrameBuffer Configs: %d%n", num);
                    for (int i = 0; i < num; i++) {
                        int[] val = new int[1];
                        eglGetConfigAttrib(eglDisplay, fbConfigs.get(i), EGL_DEPTH_SIZE, val);
                        System.err.printf("Config %d Depth Size: %d%n", i + 1, val[0]);
                    }
                }

                // Context attributes omitted here; the list must end with EGL_NONE
                IntBuffer context_attrib_list = BufferUtils.createIntBuffer(18);
                context_attrib_list.put(EGL_NONE).flip();

                context = eglCreateContext(eglDisplay, fbConfigs.get(0), EGL_NO_CONTEXT, context_attrib_list);

                int err = eglGetError();

                surface = eglCreateWindowSurface(eglDisplay, fbConfigs.get(0), drawable, (int[]) null);

                int err1 = eglGetError();

                // The context must be made current before creating the GLES capabilities
                eglMakeCurrent(eglDisplay, surface, surface, context);

                int err2 = eglGetError();

                GLESCapabilities gles = GLES.createCapabilities();
                System.err.println("GLES caps created");

                JAWT_DrawingSurface_FreeDrawingSurfaceInfo(dsi, ds.FreeDrawingSurfaceInfo());
                System.err.println("Unlock");

        } catch (Exception e) {
            System.err.println("JAWT Failed: " + e.getMessage());
        }

        // Render with OpenGL ES

        int[] range = new int[2];
        int[] precision = new int[2];
        glGetShaderPrecisionFormat(GL_FRAGMENT_SHADER, GL_HIGH_FLOAT, range, precision);

        System.err.println("Window context Initialized");
    }

    public void update() {
        if (!paused && canvas.isVisible()) {
            try {
                eglSwapBuffers(eglDisplay, surface);
            } catch (Exception e) {
            }
        }
    }

My problem: it works fine the first time, when I create a new canvas and then initialize OpenGL with it in mind.
But if the canvas is hidden for any reason, either set to not visible or with the window minimized, then I get errors.

Basically, any attempt to use ds (for example in lock()) results in an exception being thrown.

I tried pausing the 3D loop while the canvas is hidden: basically, telling it not to update() the window, since that's the only place where such calls happen, and then resuming as soon as the canvas is visible again.
But that doesn't work. Even though the context still seems to be logically working (I can add and update meshes, etc.), the call to lock() still throws an exception.
The ds is still there, the same one as before, but for some reason it can't be worked with anymore.

So, to render again, I have to re-create the DS, and then re-create the context, and that means re-making all the objects in the scene anew before I can render them. I tried going without lock()/unlock() in the update method, which is not ideal because it can cause exceptions in multithreaded apps. But even a simple eglSwapBuffers(eglDisplay, surface); fails.
Interestingly, if I check eglGetError() after swapping buffers, it returns 12288 (EGL_SUCCESS), i.e. that everything's fine. But it runs maybe twice after resuming, and then the thread just STOPS, without any exceptions or errors.
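For what it's worth, a check along these lines (a sketch, reusing the eglDisplay/surface fields from my code above; EGL_BAD_SURFACE and EGL_BAD_NATIVE_WINDOW are standard EGL error constants) would at least detect the lost surface instead of the swap silently going nowhere:

```java
// Sketch: detect a lost/invalid surface on swap instead of assuming success.
// eglSwapBuffers returns false on failure; only then is eglGetError meaningful.
if (!eglSwapBuffers(eglDisplay, surface)) {
    int err = eglGetError();
    if (err == EGL_BAD_SURFACE || err == EGL_BAD_NATIVE_WINDOW) {
        // The native drawable was invalidated while the canvas was hidden;
        // the EGL window surface must be re-created before rendering again.
    }
}
```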

How do I get around that? What am I missing or doing wrong?


Hard to say what might be wrong. Please see the JAWT demo; does it work better if you port it to use EGL/GLES? One detail that might be important (though I don't remember why it does this) is resetting the current context after rendering. See this line for example.
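In EGL terms, that detail would translate to something like this sketch (field names taken from your code above; EGL_NO_SURFACE/EGL_NO_CONTEXT are the standard EGL constants):

```java
// Sketch: unbind the context after presenting a frame, mirroring what the
// JAWT demo does with its GL context. This avoids holding the EGL surface
// current while AWT may invalidate the underlying native drawable.
eglSwapBuffers(eglDisplay, surface);
eglMakeCurrent(eglDisplay, EGL_NO_SURFACE, EGL_NO_SURFACE, EGL_NO_CONTEXT);
```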


So I checked it out, and noticed that while they re-create the window surface and resolve the display dynamically, they keep the old context. I tried that and it works! What strikes me as strange is that they do it on every paint() call, while in reality you only need it when the canvas was hidden. Or perhaps it's just over-vigilance on their part, in case something unexpected happens that isn't detected by the app.
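For anyone finding this later, the pattern that works for me looks roughly like this (a sketch only; `config` stands for the EGLConfig chosen during init, and lock()/unlock() are the helper methods from my first post):

```java
// Sketch: re-acquire the drawing surface info and re-create only the EGL
// window surface each frame, while reusing the existing context so scene
// objects (meshes, textures, shaders) survive the canvas being hidden.
synchronized void paintFrame() throws AWTException {
    lock();
    try {
        JAWTDrawingSurfaceInfo dsi = JAWT_DrawingSurface_GetDrawingSurfaceInfo(ds, ds.GetDrawingSurfaceInfo());
        JAWTX11DrawingSurfaceInfo dsiWin = JAWTX11DrawingSurfaceInfo.create(dsi.platformInfo());

        // The native drawable may have changed while the canvas was hidden,
        // so create a fresh window surface for it...
        long frameSurface = eglCreateWindowSurface(eglDisplay, config, dsiWin.drawable(), (int[]) null);
        // ...but keep the old context
        eglMakeCurrent(eglDisplay, frameSurface, frameSurface, context);

        // ... render the scene here ...

        eglSwapBuffers(eglDisplay, frameSurface);

        // Release and destroy the per-frame surface, as the JAWT demo does
        eglMakeCurrent(eglDisplay, EGL_NO_SURFACE, EGL_NO_SURFACE, EGL_NO_CONTEXT);
        eglDestroySurface(eglDisplay, frameSurface);
        JAWT_DrawingSurface_FreeDrawingSurfaceInfo(dsi, ds.FreeDrawingSurfaceInfo());
    } finally {
        unlock();
    }
}
```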