
Recent Posts

1. OpenGL / Re: Lwjgl and Mac Mojave
« Last post by nbilyk on November 13, 2018, 22:52:17 »
Ah, thank you! Is there a way to have one version for OpenGL ES and one version for desktop OpenGL? (That part I can figure out on my own now that I have the right direction.)
2. OpenGL / Re: Lwjgl and Mac Mojave
« Last post by KaiHH on November 13, 2018, 21:03:57 »
There has never been an actual GLSL #version 100 in desktop OpenGL. What you are referring to from WebGL is GLSL for OpenGL ES (OpenGL for Embedded Systems), from which WebGL derives; OpenGL ES 2.0 uses #version 100. The first GLSL version in desktop OpenGL was introduced as #version 110 with OpenGL 2.0.
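Following up on that, one common way to target both worlds is to keep the shader body version-free and prepend the appropriate #version directive at load time. This is just a sketch of the idea, not LWJGL API — the helper name is made up:

```java
public class ShaderPreamble {
    // Hypothetical helper: prepend the correct GLSL version directive
    // depending on whether we target OpenGL ES / WebGL or desktop OpenGL.
    public static String withVersion(String body, boolean embedded) {
        // OpenGL ES 2.0 (and WebGL 1.0) use "#version 100";
        // desktop GLSL starts at "#version 110" (introduced with OpenGL 2.0).
        String version = embedded ? "#version 100" : "#version 110";
        return version + "\n" + body;
    }

    public static void main(String[] args) {
        String body = "void main() { gl_FragColor = vec4(1.0); }";
        System.out.println(withVersion(body, true));
        System.out.println(withVersion(body, false));
    }
}
```

The same shader body then compiles under both contexts, as long as it sticks to the common subset of the two dialects.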
3. OpenGL / Lwjgl and Mac Mojave
« Last post by nbilyk on November 13, 2018, 20:59:20 »
I'm using LWJGL, and after updating to Mojave it seems that GLSL shaders with #version 100 no longer work.

In my logs I get "Supported GLSL language version: 1.20"
It's a new laptop; it worked before Mojave, and the same shaders compile in WebGL, so I'm wondering if anybody else has hit this, and whether it might be part of Apple's deprecation of OpenGL.

-Nick
4. OpenGL / Re: GLFW Multi-Monitor Setup
« Last post by spasi on November 11, 2018, 20:40:29 »
Have a look at the following GLFW APIs:

- glfwSetFramebufferSizeCallback (the framebuffer size can be different from the window size on HiDPI monitors)
- glfwGetMonitorContentScale/glfwGetWindowContentScale
- The new GLFW_SCALE_TO_MONITOR hint in LWJGL 3.2.1 snapshots. This fixes a lot of issues and has much better behavior when a window crosses monitors with different DPI settings.

I would also recommend testing Windows/Linux and macOS separately; the latter handles HiDPI differently (see GLFW_COCOA_RETINA_FRAMEBUFFER). Finally, make sure to run your application on Java 9 or newer on Windows; Java 8 binaries are not configured with per-monitor DPI awareness.
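To keep UI elements the same physical size across monitors with different DPI, the usual pattern is to define sizes in logical units and multiply by the content scale for the current monitor. A minimal sketch, assuming the scale factor comes from glfwGetWindowContentScale (the helper itself is illustrative, not GLFW API):

```java
public class DpiScale {
    // Convert a size in logical (DPI-independent) units to pixels,
    // given the content scale reported for the window's monitor.
    // A 100% monitor reports 1.0f; a 150% laptop panel reports 1.5f
    // (example values -- actual scales come from GLFW at runtime).
    public static int toPixels(int logicalSize, float contentScale) {
        return Math.round(logicalSize * contentScale);
    }

    public static void main(String[] args) {
        System.out.println(toPixels(24, 1.0f)); // 24 px on a 100% monitor
        System.out.println(toPixels(24, 1.5f)); // 36 px on a 150% monitor
    }
}
```

Re-querying the scale from the content-scale callback (or via GLFW_SCALE_TO_MONITOR) keeps sizes consistent when the window moves between the FHD and WQHD monitors.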
5. OpenGL / GLFW Multi-Monitor Setup
« Last post by ShadowDragon on November 10, 2018, 00:00:05 »
Hi there,

I'm wondering how one can handle multiple monitors in LWJGL 3 while also making sure the resolution and DPI settings are correct. I've got a 24" FHD monitor and a 15" WQHD monitor, and I want to be able to launch the application on a specified monitor depending on whether it is connected, while also making sure everything looks the same size.

Note: it should work whether or not the window is fullscreen (or what is left of fullscreen mode in Win10 1803 :P ).

That's the code I've got in place currently:
Code: (java)
        // Create the window
        this.window = glfwCreateWindow(this.width, this.height, this.title, NULL, NULL);
        if(this.window == NULL) {
            glfwTerminate();
            glfwSetErrorCallback(null).free();
            throw new RuntimeException("Failed to create the GLFW window");
        }
       
        // Get the thread stack and push a new frame
        try(MemoryStack stack = stackPush()) { // thread-local lookup
            IntBuffer pWidth = stack.mallocInt(1);  // int*
            IntBuffer pHeight = stack.mallocInt(1); // int*

            // Get the window size passed to glfwCreateWindow
            glfwGetWindowSize(this.window, pWidth, pHeight);

            // Get the resolution of the primary monitor
            // (glfwGetVideoMode may return null if no monitor is found)
            GLFWVidMode vidmode = glfwGetVideoMode(glfwGetPrimaryMonitor());

            // Center the window
            if(vidmode != null) {
                glfwSetWindowPos(this.window,
                        (vidmode.width() - pWidth.get(0)) / 2,
                        (vidmode.height() - pHeight.get(0)) / 2
                    );
            }
        } // the stack frame is popped automatically
       
        // Make the OpenGL context current
        glfwMakeContextCurrent(this.window);
        GL.createCapabilities();
        glfwShowWindow(this.window);
I don't know if this is relevant to this question, but I'm using OpenGL 4.6.

Thx in advance.
ShadowDragon
6. OpenGL / Re: Improving Performance of Rendering 2D Tiled Terrain
« Last post by Setlock on November 07, 2018, 15:30:32 »
Quote from: mudlee
frequently = all thousands of objects, 60 times per second? :) Note that when KaiHH writes that calling glDraw* is expensive, it also means calling gl* in general is expensive. Whenever you update data in the GPU's memory, it's expensive.

Quote from: mudlee
Why do you update the transform every frame? Also, why do you even create a new one every frame? Once data is in the GPU's memory, it should only be updated when it has actually changed. Here is my rendering logic. Note that updateBatchDataInGPU gets called ONLY if anything was changed.

https://gist.github.com/mudlee/94f2bd3bed5e1f1234a8dccf1a962c38

Quote from: Setlock
Well, I update the transform for each object every frame since objects are frequently moving on screen.

Well, actually I only call a gl* function once per texture, and there are really only two textures (which will be reduced by using a texture atlas in the future). For each object I edit a float array, then change the data in the VAO after editing that array for all objects. So I only call a gl* function once per texture.
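The "only upload when something changed" pattern mudlee describes can be sketched with a dirty flag. The class and method names here are illustrative, not the actual API from the linked gist:

```java
import java.util.Arrays;

public class Batch {
    private final float[] cpuData;   // staging copy edited on the CPU
    private float[] gpuData;         // stands in for the VBO contents
    private boolean dirty = false;
    private int uploads = 0;         // how many "uploads" actually happened

    public Batch(int size) {
        this.cpuData = new float[size];
        this.gpuData = new float[size];
    }

    // Edit the CPU-side array; mark the batch dirty only if the value changed.
    public void set(int index, float value) {
        if (cpuData[index] != value) {
            cpuData[index] = value;
            dirty = true;
        }
    }

    // Called once per frame: copies to the "GPU" only when something changed,
    // which is where a real renderer would issue glBufferSubData.
    public void flush() {
        if (dirty) {
            gpuData = Arrays.copyOf(cpuData, cpuData.length);
            uploads++;
            dirty = false;
        }
    }

    public int uploads() { return uploads; }
}
```

With this structure, frames where nothing moved cost zero buffer uploads, while frames where objects did move still pay for exactly one.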
7. OpenGL / Re: Improving Performance of Rendering 2D Tiled Terrain
« Last post by mudlee on November 07, 2018, 09:43:01 »
frequently = all thousands of objects, 60 times per second? :) Note that when KaiHH writes that calling glDraw* is expensive, it also means calling gl* in general is expensive. Whenever you update data in the GPU's memory, it's expensive.

Quote from: mudlee
Why do you update the transform every frame? Also, why do you even create a new one every frame? Once data is in the GPU's memory, it should only be updated when it has actually changed. Here is my rendering logic. Note that updateBatchDataInGPU gets called ONLY if anything was changed.

https://gist.github.com/mudlee/94f2bd3bed5e1f1234a8dccf1a962c38

Quote from: Setlock
Well, I update the transform for each object every frame since objects are frequently moving on screen.
8. OpenGL / Re: Improving Performance of Rendering 2D Tiled Terrain
« Last post by Setlock on November 06, 2018, 20:11:01 »
Quote from: mudlee
Why do you update the transform every frame? Also, why do you even create a new one every frame? Once data is in the GPU's memory, it should only be updated when it has actually changed. Here is my rendering logic. Note that updateBatchDataInGPU gets called ONLY if anything was changed.

https://gist.github.com/mudlee/94f2bd3bed5e1f1234a8dccf1a962c38

Well, I update the transform for each object every frame since objects are frequently moving on screen.
9. OpenGL / Re: Improving Performance of Rendering 2D Tiled Terrain
« Last post by mudlee on November 06, 2018, 16:56:28 »
Why do you update the transform every frame? Also, why do you even create a new one every frame? Once data is in the GPU's memory, it should only be updated when it has actually changed. Here is my rendering logic. Note that updateBatchDataInGPU gets called ONLY if anything was changed.

https://gist.github.com/mudlee/94f2bd3bed5e1f1234a8dccf1a962c38
10. Lightweight Java Gaming Library / Re: glEnableVertexAttribArray black background
« Last post by begin on November 06, 2018, 16:54:08 »
Thanks, it helped me.
Sorry for the long wait.
I followed your advice and am now studying the library from the ground up.