Can someone give me a super basic example of how to render images to an HTC Vive using LWJGL, GLFW and OpenVR?
I get the VRCompositor to show the default grid with the controllers and base stations, but I'm not able to render anything of my own into it.
I found an example where the GLFW window is rendered to the Vive. The author used
GraphicsEnvironment.getLocalGraphicsEnvironment().getScreenDevices();
to identify the HTC Vive as a monitor. In a comment he wrote that LWJGL 3 doesn't support this anymore, so I tried to identify the Vive with the GLFW method
glfwGetMonitors();
The result is an array of size 1, though, so it only finds the default monitor attached to the computer.
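For completeness, my enumeration attempt looks roughly like this (a sketch; it assumes LWJGL 3 with the GLFW natives on the classpath, and the class name is just a placeholder):

```java
import org.lwjgl.PointerBuffer;
import static org.lwjgl.glfw.GLFW.*;

public class ListMonitors {
    public static void main(String[] args) {
        if (!glfwInit()) throw new IllegalStateException("GLFW init failed");
        // glfwGetMonitors() returns a PointerBuffer of monitor handles.
        PointerBuffer monitors = glfwGetMonitors();
        int count = monitors == null ? 0 : monitors.limit();
        System.out.println(count + " monitor(s) found:");
        for (int i = 0; i < count; i++) {
            System.out.println("  " + i + ": " + glfwGetMonitorName(monitors.get(i)));
        }
        glfwTerminate();
    }
}
```

On my machine this prints only the desktop monitor; the Vive never appears (possibly because SteamVR drives the headset in direct mode, hiding it from the OS display list).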
In a C++ OpenVR example I found this:
vr::Texture_t leftEyeTexture = {(void*)leftEyeTex, vr::API_OpenGL, colorSpace};
vr::Texture_t rightEyeTexture = {(void*)rightEyeTex, vr::API_OpenGL, colorSpace};
vr::VRCompositor()->Submit(vr::Eye_Left, &leftEyeTexture);
vr::VRCompositor()->Submit(vr::Eye_Right, &rightEyeTexture);
leftEyeTex and rightEyeTex were created with OpenGL.
So I tried to do the same in Java:
Texture texture = Texture.create();
texture.set(modelTexture.getID(), VR.ETextureType_TextureType_OpenGL, VR.EColorSpace_ColorSpace_Auto);
VRCompositor.VRCompositor_Submit(EVREye_Eye_Left, texture, null, VR.EVRSubmitFlags_Submit_GlRenderBuffer);
VRCompositor.VRCompositor_Submit(EVREye_Eye_Right, texture, null, VR.EVRSubmitFlags_Submit_GlRenderBuffer);
Here, texture is an OpenVR Texture and modelTexture was created with OpenGL.
This didn't work either.
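For reference, here is a fuller per-frame sketch of what I'm attempting, modeled on the C++ example above (assuming LWJGL's openvr bindings; leftEyeTexId and rightEyeTexId are placeholders for the GL color attachments of my own eye framebuffers, and I'm not sure whether Submit_Default or Submit_GlRenderBuffer is the right flag for a plain GL texture):

```java
import org.lwjgl.openvr.Texture;
import org.lwjgl.openvr.TrackedDevicePose;
import org.lwjgl.openvr.VR;
import org.lwjgl.openvr.VRCompositor;
import org.lwjgl.system.MemoryStack;

// Per-frame: hand one GL texture per eye to the compositor.
// poses must be a TrackedDevicePose.Buffer of VR.k_unMaxTrackedDeviceCount entries.
void submitEyes(int leftEyeTexId, int rightEyeTexId, TrackedDevicePose.Buffer poses) {
    // The compositor expects WaitGetPoses once per frame before Submit.
    VRCompositor.VRCompositor_WaitGetPoses(poses, null);
    try (MemoryStack stack = MemoryStack.stackPush()) {
        Texture left = Texture.calloc(stack)
                .set(leftEyeTexId, VR.ETextureType_TextureType_OpenGL, VR.EColorSpace_ColorSpace_Gamma);
        Texture right = Texture.calloc(stack)
                .set(rightEyeTexId, VR.ETextureType_TextureType_OpenGL, VR.EColorSpace_ColorSpace_Gamma);
        int errL = VRCompositor.VRCompositor_Submit(
                VR.EVREye_Eye_Left, left, null, VR.EVRSubmitFlags_Submit_Default);
        int errR = VRCompositor.VRCompositor_Submit(
                VR.EVREye_Eye_Right, right, null, VR.EVRSubmitFlags_Submit_Default);
        // The returned EVRCompositorError codes might hint at what's wrong.
        if (errL != VR.EVRCompositorError_VRCompositorError_None
                || errR != VR.EVRCompositorError_VRCompositorError_None) {
            System.err.println("Compositor Submit error: " + errL + " / " + errR);
        }
    }
}
```

Is something like this the right overall shape, or am I missing a step (framebuffer setup, pose handling, flags)?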
Thanks in advance!