eglCreatePBufferSurface returns EGL_BAD_CONFIG

Started by kyasbal, October 16, 2019, 09:33:30

Previous topic - Next topic

kyasbal

Hi, I'm trying to use EGL to render several images on a headless server.
I initialize EGL and create a pbuffer surface in order to create an EGL context, using the following code (written in Kotlin, with LWJGL):

    
    val display = eglGetDisplay(EGL_DEFAULT_DISPLAY)
    if(display == EGL_NO_DISPLAY){
        error("Failed to eglGetDisplay")
    }
    val major = stack.mallocInt(1)
    val minor = stack.mallocInt(1)
    val eglStatus = eglInitialize(display,major,minor)
    if(!eglStatus){
        error("Failed to eglInitialize")
    }

    val eglConfigs = stack.ints(
        EGL_SURFACE_TYPE, EGL_PBUFFER_BIT,
        EGL_BLUE_SIZE, 8,
        EGL_GREEN_SIZE, 8,
        EGL_RED_SIZE, 8,
        EGL_DEPTH_SIZE, 8,
        EGL_RENDERABLE_TYPE, EGL_OPENGL_BIT,
        EGL_NONE)
    val pBufferAttribs = stack.ints(EGL_WIDTH,8, EGL_HEIGHT,8, EGL_NONE)
    val numConfig = stack.mallocInt(1)
    check(
        eglChooseConfig(
            display,
            eglConfigs,
            null,
            numConfig
        )
    ) { String.format("eglChooseConfig failed [0x%X]", eglGetError()) }

    val configs = stack.mallocPointer(numConfig[0])
    val numConfig2 = stack.mallocInt(1)
    check(eglChooseConfig(display, eglConfigs, configs, numConfig2)) {
        "Failed to choose config with EGL"
    }
    val eglSurf = eglCreatePbufferSurface(display,configs.address(),pBufferAttribs)
    if(eglSurf == EGL_NO_SURFACE){
        error("Failed to initialize surface" + eglGetError())
    }


Even though each eglChooseConfig call returns true, eglCreatePbufferSurface returns EGL_NO_SURFACE and eglGetError reports EGL_BAD_CONFIG.
I suspect this means the EGL config I get back from eglChooseConfig is wrong. How can I correct my code to use LWJGL?
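One thing I'm unsure about is the config argument: `configs` is a PointerBuffer, and I pass `configs.address()`, which is the address of the buffer itself rather than the first EGLConfig handle stored in it. A sketch of what I suspect the call should look like instead (reusing the `display`, `configs`, and `pBufferAttribs` variables from the code above):

    
    // configs.address() is the memory address of the PointerBuffer itself;
    // configs.get(0) reads the first EGLConfig handle that eglChooseConfig wrote into it.
    val eglSurf = eglCreatePbufferSurface(display, configs.get(0), pBufferAttribs)
    if (eglSurf == EGL_NO_SURFACE) {
        error(String.format("Failed to create pbuffer surface [0x%X]", eglGetError()))
    }

Is passing the dereferenced handle like this the correct usage with the LWJGL bindings?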