
Author Topic: [Solved] Doesn't render anything with no errors on Arch Linux  (Read 128 times)


  • Newbie
  • *
  • Offline
  • Posts: 5

So I recently moved from Win10 to Arch and migrated my projects and stuff.
And on Arch my app simply shows a GLFW window with the background from glClearColor, and that's it.
To test what I might have broken, I took the basic example from the lwjgl.org page and added a simple quad to it (my actual project is big, with wrappers and shaders, and written in Kotlin rather than Java, so I wanted to find the source of the problem in a much simpler example), and it also didn't render anything.

Here it is:

Code: (Test.java)
import org.lwjgl.Version;
import org.lwjgl.glfw.GLFWErrorCallback;
import org.lwjgl.glfw.GLFWVidMode;
import org.lwjgl.opengl.GL;
import org.lwjgl.system.MemoryStack;

import java.nio.IntBuffer;

import static org.lwjgl.glfw.Callbacks.glfwFreeCallbacks;
import static org.lwjgl.glfw.GLFW.*;
import static org.lwjgl.opengl.GL11.*;
import static org.lwjgl.system.MemoryStack.stackPush;
import static org.lwjgl.system.MemoryUtil.NULL;

public class Test {

    // The window handle
    private long window;

    public void run() {
        System.out.println("Hello LWJGL " + Version.getVersion() + "!");


        init();
        loop();

        // Free the window callbacks and destroy the window
        glfwFreeCallbacks(window);
        glfwDestroyWindow(window);

        // Terminate GLFW and free the error callback
        glfwTerminate();
        glfwSetErrorCallback(null).free();
    }

    private void init() {
        // Setup an error callback. The default implementation
        // will print the error message in System.err.
        GLFWErrorCallback.createPrint(System.err).set();

        // Initialize GLFW. Most GLFW functions will not work before doing this.
        if ( !glfwInit() )
            throw new IllegalStateException("Unable to initialize GLFW");

        // Configure GLFW
        glfwDefaultWindowHints(); // optional, the current window hints are already the default
        glfwWindowHint(GLFW_VISIBLE, GLFW_FALSE); // the window will stay hidden after creation
        glfwWindowHint(GLFW_RESIZABLE, GLFW_TRUE); // the window will be resizable

        // Create the window
        window = glfwCreateWindow(300, 300, "Hello World!", NULL, NULL);
        if ( window == NULL )
            throw new RuntimeException("Failed to create the GLFW window");

        // Setup a key callback. It will be called every time a key is pressed, repeated or released.
        glfwSetKeyCallback(window, (window, key, scancode, action, mods) -> {
            if ( key == GLFW_KEY_ESCAPE && action == GLFW_RELEASE )
                glfwSetWindowShouldClose(window, true); // We will detect this in the rendering loop
        });

        // Get the thread stack and push a new frame
        try ( MemoryStack stack = stackPush() ) {
            IntBuffer pWidth = stack.mallocInt(1); // int*
            IntBuffer pHeight = stack.mallocInt(1); // int*

            // Get the window size passed to glfwCreateWindow
            glfwGetWindowSize(window, pWidth, pHeight);

            // Get the resolution of the primary monitor
            GLFWVidMode vidmode = glfwGetVideoMode(glfwGetPrimaryMonitor());

            // Center the window
            glfwSetWindowPos(
                window,
                (vidmode.width() - pWidth.get(0)) / 2,
                (vidmode.height() - pHeight.get(0)) / 2
            );
        } // the stack frame is popped automatically

        // Make the OpenGL context current
        glfwMakeContextCurrent(window);
        // Enable v-sync
        glfwSwapInterval(1);

        // Make the window visible
        glfwShowWindow(window);
    }

    private void loop() {
        // This line is critical for LWJGL's interoperation with GLFW's
        // OpenGL context, or any context that is managed externally.
        // LWJGL detects the context that is current in the current thread,
        // creates the GLCapabilities instance and makes the OpenGL
        // bindings available for use.
        GL.createCapabilities();

        glViewport(0, 0, 300, 300);

        // Set the clear color
        glClearColor(0.0F, 0.0F, 0.0F, 0.0F);

        // Run the rendering loop until the user has attempted to close
        // the window or has pressed the ESCAPE key.
        while ( !glfwWindowShouldClose(window) ) {
            glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT); // clear the framebuffer

            glColor3d(0.0, 0.0, 1.0);

            glVertex2d(0.0, 0.0);
            glVertex2d(0.5, 0.0);
            glVertex2d(0.5, 0.5);
            glVertex2d(0.0, 0.5);

            glfwSwapBuffers(window); // swap the color buffers

            // Poll for window events. The key callback above will only be
            // invoked during this call.
            glfwPollEvents();
        }
    }

    public static void main(String[] args) {
        new Test().run();
    }
}

And the log output with -Dorg.lwjgl.util.Debug=true shows no errors or anything suspicious:

Code: (log output)
Hello LWJGL 3.1.2 build 29!
[LWJGL] Version: 3.1.2 build 29
[LWJGL] OS: Linux v4.11.4-1-ARCH
[LWJGL] JRE: 1.8.0_131 amd64
[LWJGL] JVM: OpenJDK 64-Bit Server VM v25.131-b11 by Oracle Corporation
[LWJGL] Loading library (system): lwjgl
[LWJGL] Found at: /tmp/lwjglnecauqua/3.1.2-build-29/liblwjgl.so
[LWJGL] Loaded from org.lwjgl.librarypath: /tmp/lwjglnecauqua/3.1.2-build-29/liblwjgl.so
[LWJGL] MemoryUtil accessor: MemoryAccessorUnsafe
[LWJGL] Loading library: jemalloc
[LWJGL] Found at: /tmp/lwjglnecauqua/3.1.2-build-29/libjemalloc.so
[LWJGL] Loaded from org.lwjgl.librarypath: /tmp/lwjglnecauqua/3.1.2-build-29/libjemalloc.so
[LWJGL] MemoryUtil allocator: JEmallocAllocator
[LWJGL] Loading library: glfw
[LWJGL] Found at: /tmp/lwjglnecauqua/3.1.2-build-29/libglfw.so
[LWJGL] Loaded from org.lwjgl.librarypath: /tmp/lwjglnecauqua/3.1.2-build-29/libglfw.so
[LWJGL] Loading library (system): lwjgl_opengl
[LWJGL] Found at: /tmp/lwjglnecauqua/3.1.2-build-29/liblwjgl_opengl.so
[LWJGL] Loaded from org.lwjgl.librarypath: /tmp/lwjglnecauqua/3.1.2-build-29/liblwjgl_opengl.so
[LWJGL] Loading library: libGL.so.1
[LWJGL] libGL.so.1 not found in org.lwjgl.librarypath=/tmp/lwjglnecauqua/3.1.2-build-29
[LWJGL] Loaded from system paths

So, what might I have done wrong, and why isn't it rendering anything? Please help.

Turns out that it for some reason didn't like me reusing the same array to upload uniform matrices (which completely broke the projection/transformation of my vertices). So I switched to reusing the same float buffer, which is actually more accurate anyway.

Also, yeah, in the example above I simply forgot to call glBegin/glEnd, silly me :)


  • Nerdus Imperius
  • *****
  • Offline
  • Posts: 903
Re: Doesn't render anything with no errors on Arch Linux
« Reply #1 on: June 19, 2017, 13:22:26 »

The thing is, when people say "with no errors" they usually _only_ refer to:
1. no compile errors (of course)
2. no Java exceptions being thrown at runtime

However, this is not sufficient for saying "with no errors" in an OpenGL application, because OpenGL reports errors differently. You either have to explicitly query for an error whenever you think one might have occurred (GL11.glGetError()), or you opt in to error/debug callback notifications. If you do neither, you will likely just get a black screen while, in fact, a myriad of errors is happening in the background in OpenGL that you never notice.
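A sketch of the explicit-query approach: glGetError() only reports one error per call, so you drain it in a loop. The helper names here are made up, and the hex values are the standard GL error codes hardcoded so the snippet compiles without a context; with LWJGL you would use the GL11 constants and pass GL11::glGetError:

```java
import java.util.function.IntSupplier;

// Sketch: drain all pending GL errors and turn the codes into
// readable names. The hex values are the standard OpenGL error
// codes; with LWJGL you would use the GL11 constants instead.
public class GLErrorNames {

    static final int GL_NO_ERROR = 0;

    // Map a GL error code to its symbolic name.
    static String glErrorName(int code) {
        switch (code) {
            case 0x0500: return "GL_INVALID_ENUM";
            case 0x0501: return "GL_INVALID_VALUE";
            case 0x0502: return "GL_INVALID_OPERATION";
            case 0x0503: return "GL_STACK_OVERFLOW";
            case 0x0504: return "GL_STACK_UNDERFLOW";
            case 0x0505: return "GL_OUT_OF_MEMORY";
            case 0x0506: return "GL_INVALID_FRAMEBUFFER_OPERATION";
            default:     return "unknown (0x" + Integer.toHexString(code) + ")";
        }
    }

    // glGetError() reports one error per call, so loop until
    // GL_NO_ERROR comes back. The supplier stands in for
    // GL11::glGetError so the sketch is testable without a context.
    static void checkErrors(String where, IntSupplier glGetError) {
        int code;
        while ((code = glGetError.getAsInt()) != GL_NO_ERROR) {
            System.err.println("GL error at " + where + ": " + glErrorName(code));
        }
    }
}
```

You would sprinkle checkErrors("after draw", GL11::glGetError) around suspicious calls while debugging, then remove it again.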

So, the best option would be to do this:
Code:
// Before glfwCreateWindow():
glfwWindowHint(GLFW_OPENGL_DEBUG_CONTEXT, GLFW_TRUE);

// After glfwMakeContextCurrent() and createCapabilities():
Callback debugProc = GLUtil.setupDebugMessageCallback();

Then you will notice errors popping up even with your posted code, because you are calling immediate-mode vertex specification methods without being inside a glBegin()/glEnd() pair.
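For completeness, a correct immediate-mode version of that quad would look like this (fragment only; it assumes a current OpenGL context, so it is not runnable standalone):

```java
// Immediate-mode vertex calls are only valid between glBegin()/glEnd().
glBegin(GL_QUADS);
glColor3d(0.0, 0.0, 1.0);
glVertex2d(0.0, 0.0);
glVertex2d(0.5, 0.0);
glVertex2d(0.5, 0.5);
glVertex2d(0.0, 0.5);
glEnd();
```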


  • Newbie
  • *
  • Offline
  • Posts: 5
Re: Doesn't render anything with no errors on Arch Linux
« Reply #2 on: June 19, 2017, 14:05:02 »

Aha. Thanks, now I'll know.
I had completely forgotten how immediate mode works, lol. The simple example works now, thanks again.
I've even managed to get a VAO working in the simple example. However, the main application still displays a black screen with no errors, even with the debug callback. I don't know what's wrong this time, and to ask here I'd need to cut down and reformat lots of stuff (it's also still Kotlin), so I will investigate further.
I do also fetch and print some errors explicitly, like shader compilation status, framebuffer status and so on.


  • Newbie
  • *
  • Offline
  • Posts: 5
Re: Doesn't render anything with no errors on Arch Linux
« Reply #3 on: June 19, 2017, 14:19:24 »

So, in my sprite vertex shader I have this line of code:
Code:
gl_Position = vec4((projection * transform * vec3(rect_vertices[gl_VertexID], 1)).xy, 0, 1);
where projection and transform are uniform 3x3 matrices.
If I replace it with this:
Code:
gl_Position = vec4(rect_vertices[gl_VertexID], 0, 1);
then it renders the texture with the given UVs at a quarter of the screen (as you would expect).
Now I have to find out what's wrong, since on Windows it worked fine :)


  • Nerdus Imperius
  • *****
  • Offline
  • Posts: 450
Re: Doesn't render anything with no errors on Arch Linux
« Reply #4 on: June 19, 2017, 22:40:02 »

It's probably not a Windows vs. Linux thing but rather different driver implementations. Some drivers are more lenient and accept code that is not 100% formally correct. I've run into this trouble before. It's a mess.


  • Newbie
  • *
  • Offline Offline
  • Posts: 5
Re: Doesn't render anything with no errors on Arch Linux
« Reply #5 on: June 19, 2017, 22:46:13 »

Finally found out why my matrix uniforms weren't uploading properly.
Turns out that on Linux it for some reason didn't like me reusing the same array to upload the matrix.
When I changed it to create a new one every time, it worked.
So I switched to a float buffer, and finally everything works as it should.
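For anyone hitting the same thing, the reusable-FloatBuffer version can look roughly like this (class and method names are made up for the sketch; OpenGL expects column-major data for a mat3 uniform, and with a live context the upload itself would be glUniformMatrix3fv, shown only in a comment):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;

// Sketch: reuse one direct FloatBuffer for all 3x3 uniform uploads
// instead of allocating a fresh float[] per call. A row-major
// float[3][3] is transposed into column-major order while filling.
public class MatrixUpload {

    // One reusable, natively-ordered direct buffer (9 floats for a mat3).
    private static final FloatBuffer MAT3 =
            ByteBuffer.allocateDirect(9 * Float.BYTES)
                      .order(ByteOrder.nativeOrder())
                      .asFloatBuffer();

    // Fill the shared buffer column by column, then rewind for reading.
    static FloatBuffer fillColumnMajor(float[][] m) {
        MAT3.clear();
        for (int col = 0; col < 3; col++)
            for (int row = 0; row < 3; row++)
                MAT3.put(m[row][col]);
        MAT3.flip();
        return MAT3;
        // With a current context the upload would then be:
        // glUniformMatrix3fv(location, false, MAT3);
    }
}
```

Since one buffer is shared, this only works if all uploads happen on the same thread before the buffer is refilled.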


  • Newbie
  • *
  • Offline
  • Posts: 5
Re: Doesn't render anything with no errors on Arch Linux
« Reply #6 on: June 19, 2017, 22:47:56 »

Also, yeah, the first time around it was not even compiling the shader, because the Linux driver didn't like C-style array definitions :)
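For reference, the portable way to write such an array in GLSL is the constructor form (the contents here are just an example rect; curly-brace initializer lists only became core in GLSL 4.20, which is likely why the stricter Linux driver rejected them):

```glsl
// C-style initializer list - only core since GLSL 4.20:
//   vec2 rect_vertices[4] = { vec2(0.0, 0.0), vec2(1.0, 0.0),
//                             vec2(1.0, 1.0), vec2(0.0, 1.0) };

// GLSL array-constructor syntax, accepted from GLSL 1.20 on:
const vec2 rect_vertices[4] = vec2[4](
    vec2(0.0, 0.0), vec2(1.0, 0.0), vec2(1.0, 1.0), vec2(0.0, 1.0)
);
```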