Recent Posts

1
OpenGL / Re: Shader fails to validate on MacOS with no error log
« Last post by KaiHH on January 25, 2020, 00:06:23 »
Well, according to the specification of glValidateProgram:
Quote
This function mimics the validation operation that OpenGL implementations must perform when rendering commands are issued while programmable shaders are part of current state.
So the validation performed is essentially the same as the one performed when a glDraw... call is invoked. In your case, make sure that at the point where you call glValidateProgram you could also have issued a draw call with the current state of the pipeline, and that this draw call would have succeeded (see the sketch after this list). In particular this means:
- the texture unit of the sampler in your fragment shader is bound to a valid texture object
- a valid Vertex Array Object is currently bound
- all three vertex attributes are enabled and bound to a vertex source (such as a buffer object)
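
A minimal sketch of putting the pipeline into a draw-ready state before validating (assuming static imports of the LWJGL GL11/GL13/GL20/GL30 classes; vaoId, textureId and programId are placeholders for whatever handles your application created earlier, not names from your code):

Code:
// bind a valid VAO whose three attributes are enabled and backed by buffer objects
glBindVertexArray(vaoId);
glEnableVertexAttribArray(0); // in_vertex
glEnableVertexAttribArray(1); // in_tex_coord
glEnableVertexAttribArray(2); // in_colour

// bind a valid, complete texture to the unit the sampler uniform refers to
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, textureId);

glUseProgram(programId);
glValidateProgram(programId);
if (glGetProgrami(programId, GL_VALIDATE_STATUS) != GL_TRUE) {
    System.err.println(glGetProgramInfoLog(programId));
}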
2
OpenGL / Re: Shader fails to validate on MacOS with no error log
« Last post by Danjb on January 24, 2020, 22:19:45 »
Thanks for the reply, but I just tried that and there was no improvement. Any other ideas?
3
OpenGL / Re: Shader fails to validate on MacOS with no error log
« Last post by KaiHH on January 24, 2020, 14:13:28 »
Try using "#version 330 core" as the GLSL version in both shaders. Mac OS only supports the core profile and not the compatibility profile.
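
For example, the first line of both the vertex and the fragment shader would change from "#version 330" to:

Code:
#version 330 core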
4
OpenGL / Shader fails to validate on MacOS with no error log
« Last post by Danjb on January 24, 2020, 09:59:01 »
Hi,

I have a very simple vertex / fragment shader:

Code:
#version 330

uniform mat4 view_proj_matrix;

layout(location = 0) in vec2 in_vertex;
layout(location = 1) in vec2 in_tex_coord;
layout(location = 2) in vec4 in_colour;

out Data {
    vec4 colour;
    vec2 tex_coord;
} DataOut;

void main(void) {
    gl_Position = view_proj_matrix * vec4(in_vertex, 0, 1);
   
    DataOut.colour = in_colour;
    DataOut.tex_coord = in_tex_coord;
}

Code:
#version 330

uniform sampler2D tex;

uniform bool use_override_colour;
uniform vec3 override_colour;

in Data {
    vec4 colour;
    vec2 tex_coord;
} DataIn;

out vec4 frag_colour;

void main(void) {
    vec4 tex_colour = texture(tex, DataIn.tex_coord);
    if (use_override_colour) {
        frag_colour = vec4(override_colour.r, override_colour.g, override_colour.b, tex_colour.a);
    } else {
        frag_colour = DataIn.colour * tex_colour;
    }
}

On Windows all is fine, but on MacOS I get an error after validating the program, with an empty error log, e.g. "Error validating shader program: ".

Code:
        public Builder linkAndValidate() throws ShaderException {

            glLinkProgram(programId);
            int success = glGetProgrami(programId, GL_LINK_STATUS);
            if (success != GL_TRUE) {
                glDeleteProgram(programId);
                throw new ShaderException("Error linking shader program: "
                        + glGetProgramInfoLog(programId));
            }

            // Now that we have our program, we can safely delete the shader
            // objects
            glDeleteShader(vsId);
            glDeleteShader(fsId);

            glValidateProgram(programId);
            success = glGetProgrami(programId, GL_VALIDATE_STATUS);
            if (success != GL_TRUE) {
                glDeleteProgram(programId);
                throw new ShaderException("Error validating shader program: "
                        + glGetProgramInfoLog(programId));
            }

            return this;
        }

I am not sure how to debug this further given I have no information about why it failed. Does anyone have any suggestions?

Thanks,
Dan
5
OpenGL / Re: Libretro porting of LWJGL (and Libgdx)
« Last post by msx on January 24, 2020, 09:55:37 »
Holy cow, it worked! You were right, huge thanks!

Here's (probably) the first time a Java Core runs on RetroArch:

6
OpenGL / Re: Libretro porting of LWJGL (and Libgdx)
« Last post by msx on January 24, 2020, 09:45:18 »
Thanks, I'll try that GL.createCapabilities() right away.

Yesterday I made some progress with this code, which worked correctly:

Code:
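// resolve the raw glClearColor pointer from the current context's function provider,
// then call it through LWJGL's low-level JNI dispatcher to set the clear color to green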
funcAddr = APIUtil.apiGetFunctionAddress(GL.getFunctionProvider(), "glClearColor");
JNI.callV(funcAddr, 0f, 1f, 0f, 0f);

I was already dreading rewriting all the GLxx classes with that style of calls.
I'm not sure I understood the thread-local lookup issue. All the OpenGL calls I made via "apiGetFunctionAddress" and "JNI" keep working even if I store the address in a field. Probably the context never changes and it's always the same thread.
7
OpenGL / Re: Libretro porting of LWJGL (and Libgdx)
« Last post by spasi on January 24, 2020, 09:00:57 »
Hey msx,

You're likely missing a call to GL.createCapabilities(). This will detect the OpenGL context that is current in the current thread and configure the LWJGL side of things accordingly. You should then be able to do normal OpenGL calls without issues.

Also note that GL.createCapabilities() is a heavyweight operation and should only be done once per context, not every time you enter from C to Java. If you're moving the context between threads, the returned GLCapabilities object can be reused by calling GL.setCapabilities().
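
A minimal sketch of that pattern (hypothetical class and method names, not code from either codebase), assuming the Libretro-owned context is already current in the thread that calls into Java:

Code:
import org.lwjgl.opengl.GL;
import org.lwjgl.opengl.GL11;
import org.lwjgl.opengl.GLCapabilities;

public final class GLBootstrap {

    private static GLCapabilities caps;

    /** Call once after the context is current, before any other GL call. */
    public static void init() {
        caps = GL.createCapabilities(); // heavyweight: do this once per context
    }

    /** Call only if the same context later becomes current in another thread. */
    public static void rebind() {
        GL.setCapabilities(caps); // reuse the cached instance, no re-detection
    }

    /** Normal OpenGL calls work once LWJGL has been configured. */
    public static void clearFrame() {
        GL11.glClearColor(0f, 1f, 0f, 0f);
        GL11.glClear(GL11.GL_COLOR_BUFFER_BIT);
    }
}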

Quote
WGL seems to be looking at function pointers and calling native "callPPI" style functions (this is black magic to me), while GL stuff seems to be using regular native functions like "static native void glClearColor" in GL11C.

OpenGL functions are called using static methods. However, OpenGL function addresses are specific to the context that is bound to the current thread. Normally, this would require a thread-local lookup on each method call, which is not terribly expensive, but it adds up and also affects the optimization of surrounding code. Unfortunately, Hotspot cannot hoist the lookup outside of loops, etc. That's why LWJGL handles OpenGL calls differently, to avoid the thread-local lookup completely without sacrificing correctness. This is indeed black magic (i.e. nasty hack that messes with JVM internals), see the ThreadLocalUtil class for more details.
8
OpenGL / Re: Libretro porting of LWJGL (and Libgdx)
« Last post by msx on January 23, 2020, 13:40:14 »
Oops, I did quote instead of modify and created a new reply, sorry :P
9
OpenGL / Libretro porting of LWJGL (and Libgdx)
« Last post by msx on January 23, 2020, 13:39:15 »
Hi there, I'm trying to implement a Core for Libretro/RetroArch based on libgdx (eventually LWJGL). I asked for help on the libgdx forum and they suggested posting here too.

Premise:

Quote
Libretro/RetroArch is a generic "game system" that can execute "cores". Each core is an emulator or a virtual machine (like there's a "Mame" core, a "dosbox" core, etc.; it doesn't need to be an emulator either, some games are released as "cores"). The infrastructure provides all the window and input management; the core has to specify its desired resolution and has a callback method to render the scene (simplifying a lot). Audio, input, etc. are unified for all cores. Cores are actually just dll/so files with specific callback functions (obviously they can rely on external files). The most important callbacks are "init" (which initializes the core) and "run" (which is called every frame and should render the scene).

Also, cores come in two variants: the classic variant draws graphics to a generic "RAM buffer" that is then sent to the screen (like an old 8-bit system), while the OpenGL variant is OpenGL-aware and receives an OpenGL framebuffer onto which to render (this is because OpenGL is so ubiquitous that they consider it worth skipping a layer of abstraction for performance).
The main gotcha of the OpenGL core is that you have to render not to the screen but to a framebuffer provided by the libretro infrastructure (so that it can render its own GUI on top, etc.). This shouldn't be a huge deal since, once bound, rendering to the framebuffer should be pretty transparent.


So what I want to do is develop a Core that can run "Java games". It would be a C dll/so program that starts an embedded JVM, loads some Java "middleware" to set up the system, and then loads a jar with a to-be-defined "game format", wiring all callbacks from libretro to methods on a Java object.

Now, as a proof of concept, I was able to do most of the stuff: I took a libretro gl-core demo, added JVM management, loading a jar and calling methods during libretro's various callbacks, including the "run" call that is made on each frame.

The problem is: any call to OpenGL functions crashes :)

Now, inside the Java callback we should already be in a context where we can draw: there's a framebuffer bound and rendering ongoing. For reference, this is the C portion of the libretro demo that makes the Java call:

Code:
   ...
   ...
   glBindFramebuffer(RARCH_GL_FRAMEBUFFER, hw_render.get_current_framebuffer());

   glClearColor(0.3, 0.4, 0.5, 1.0);
   glViewport(0, 0, width, height);
   glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT | GL_STENCIL_BUFFER_BIT);

   doJavaCall(); // here, the java side is called.

   glUseProgram(prog);

   glEnable(GL_DEPTH_TEST);
   ...
   ...

For the Java side, I've tried calling glClearColor and glClear again, with a different color so there's something to actually verify that it's working. But as I said, it crashed.

Investigating, I ended up with the following code on the Java side:

Code:
      System.out.println("****************CHIAMATO DOCALL***************");
      Configuration.DEBUG.set(true);
      System.out.println("Context: "+WGL.wglGetCurrentContext());
      System.out.println("Device: "+WGL.wglGetCurrentDC());
      System.out.println("lolTest: "+org.lwjgl.opengl.GL11.class); // ensure class has initializers done
      System.out.println("OK");
     
      String s = org.lwjgl.opengl.GL11.glGetString(org.lwjgl.opengl.GL11.GL_VERSION);
      System.out.println("Version is "+s);

Now this code, when I run RetroArch, load the core and execute it, produces the following output:

Code:
calling doCall
****************CHIAMATO DOCALL***************
[LWJGL] Version: 3.2.3 SNAPSHOT
[LWJGL] OS: Windows 10 v10.0
[LWJGL] JRE: 13.0.1 amd64
[LWJGL] JVM: OpenJDK 64-Bit Client VM v13.0.1+9 by AdoptOpenJDK
[LWJGL] Loading JNI library: lwjgl
[LWJGL] Module: org.lwjgl
[LWJGL] Loaded from org.lwjgl.librarypath: C:\Users\niclugat\AppData\Local\Temp\lwjglniclugat\3.2.3-SNAPSHOT\lwjgl.dll
[LWJGL] Loading JNI library: lwjgl_opengl
[LWJGL] Module: org.lwjgl.opengl
[LWJGL] Loaded from org.lwjgl.librarypath: C:\Users\niclugat\AppData\Local\Temp\lwjglniclugat\3.2.3-SNAPSHOT\lwjgl_opengl.dll
[LWJGL] Loading library: opengl32
[LWJGL] Module: org.lwjgl.opengl
[LWJGL] opengl32.dll not found in org.lwjgl.librarypath=C:\Users\niclugat\AppData\Local\Temp\lwjglniclugat\3.2.3-SNAPSHOT
[LWJGL] Loaded from system paths: C:\WINDOWS\SYSTEM32\OPENGL32.dll
[LWJGL] Loading library: jemalloc
[LWJGL] Module: org.lwjgl.jemalloc
[LWJGL] Loaded from org.lwjgl.librarypath: C:\Users\niclugat\AppData\Local\Temp\lwjglniclugat\3.2.3-SNAPSHOT\jemalloc.dll
[LWJGL] MemoryUtil allocator: JEmallocAllocator
Context: 131072
Device: 251730347
lolTest: class org.lwjgl.opengl.GL11
OK


Now there's some output from LWJGL telling that it's loading the library, and it looks OK.
Then the two WGL calls actually work, and seem to indicate that on the Java side we actually ARE in a valid OpenGL context. After the "OK" log I've tried many OpenGL calls and they all crash.

Now, since you guys are the wizards of dll loading, perhaps you can help me.

My wild guess is that (since libretro is already up and running and LWJGL found all its natives) the Java side is loading its own OpenGL dll and there's some kind of mismatch with what's already loaded and running in Libretro.

I also found it strange that the WGL calls do indeed work, so I went to look at the sources and I'm under the impression that in LWJGL, WGL and the GLxx classes use two different approaches: WGL seems to be looking at function pointers and calling native "callPPI" style functions (this is black magic to me), while GL stuff seems to be using regular native functions like "static native void glClearColor" in GL11C. (Can someone confirm this supposition? Also, is there some documentation on how all that magic works?)

If so, perhaps the first approach somehow bypasses whatever issue is blocking the GL side.

So, if anybody has some idea... Does a "function pointer" styled GL wrapper exist? Or can one be created, perhaps just for glClearColor and glClear?
10
OpenGL / Flickering visual artifacts on some machines
« Last post by Danjb on January 22, 2020, 08:08:01 »
Hi,

I see some strange flickering visual artifacts on some machines, regardless of the resolution / fullscreen / vsync settings.

Is this likely due to the graphics driver not supporting vsync, or could it be that my rendering code is running too slowly to keep up with the refresh rate, and needs optimising?

Is there a way around this problem for graphics drivers that do not support vsync?

Thanks,
Dan