How can I stream an OpenGL framebuffer?

Started by TheMode, February 26, 2020, 23:20:08


TheMode

My current project needs to support several kinds of rendering, including streaming. My plan is basically to use H.264 for that, but I do not know how to do it in a cross-platform way with LWJGL. NvPipe from NVIDIA represents perfectly what I want: https://github.com/NVIDIA/NvPipe (or even CUDA directly, can I use NvEncoder?). It lets me compress a texture on the GPU and decompress it back into a texture without a lot of round trips through the CPU. The problem is that it is NVIDIA-only (which I don't mind for the encoding part).
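
For reference, the CPU fallback I can picture is reading the framebuffer back with glReadPixels and handing it to JavaCV's FFmpegFrameRecorder for H.264 encoding. This is only a rough, untested sketch; the RTP URL, resolution, frame rate and encoder options are placeholders:

```java
import org.bytedeco.ffmpeg.global.avcodec;
import org.bytedeco.ffmpeg.global.avutil;
import org.bytedeco.javacv.FFmpegFrameRecorder;
import org.bytedeco.javacv.Frame;

import java.nio.ByteBuffer;

import static org.lwjgl.opengl.GL11.*;

public class FramebufferStreamer {

    // Placeholder resolution and output URL, just for illustration.
    static final int WIDTH = 1280, HEIGHT = 720;
    static final String OUTPUT = "rtp://127.0.0.1:5004";

    public static void main(String[] args) throws Exception {
        FFmpegFrameRecorder recorder = new FFmpegFrameRecorder(OUTPUT, WIDTH, HEIGHT);
        recorder.setVideoCodec(avcodec.AV_CODEC_ID_H264);
        recorder.setFormat("rtp");                      // container/protocol is a guess
        recorder.setFrameRate(60);
        recorder.setVideoOption("preset", "ultrafast"); // x264 options aimed at low latency
        recorder.setVideoOption("tune", "zerolatency");
        recorder.start();

        // The Frame owns a direct buffer that glReadPixels can write into.
        Frame frame = new Frame(WIDTH, HEIGHT, Frame.DEPTH_UBYTE, 4);
        ByteBuffer pixels = (ByteBuffer) frame.image[0];

        // Per rendered frame (inside the render loop, after drawing,
        // with the GL context current):
        pixels.clear();
        glReadPixels(0, 0, WIDTH, HEIGHT, GL_RGBA, GL_UNSIGNED_BYTE, pixels);
        // Note: glReadPixels returns the image bottom-up, so the stream ends up
        // vertically flipped unless it is flipped before or during encoding.
        recorder.record(frame, avutil.AV_PIX_FMT_RGBA);

        recorder.stop();
        recorder.release();
    }
}
```

The obvious downside is the glReadPixels stall every frame, which is exactly the CPU travel I would like to avoid (a pixel buffer object could hide part of it).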

What are my options for the decoding side? JavaCV seems interesting, but will it work with LWJGL? (Unfortunately, I guess I will lose some performance by having to upload the decoded frames to the texture manually.)
Obviously I'm also open to changing the encoding method (even the format); in the end I just want to be able to stream with 20-30 ms of latency if possible.
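
On the decoding side, the JavaCV path I have in mind would look roughly like this: grab decoded frames with FFmpegFrameGrabber and copy each one into an existing texture with glTexSubImage2D. Again just a sketch with the same placeholder URL, and nothing here is benchmarked:

```java
import org.bytedeco.ffmpeg.global.avutil;
import org.bytedeco.javacv.FFmpegFrameGrabber;
import org.bytedeco.javacv.Frame;

import java.nio.ByteBuffer;

import static org.lwjgl.opengl.GL11.*;

public class StreamToTexture {

    public static void main(String[] args) throws Exception {
        // Placeholder input URL; anything FFmpeg can open should work.
        FFmpegFrameGrabber grabber = new FFmpegFrameGrabber("rtp://127.0.0.1:5004");
        grabber.setPixelFormat(avutil.AV_PIX_FMT_RGBA); // ask for RGBA so it maps to GL_RGBA
        grabber.setOption("fflags", "nobuffer");        // reduce demuxer buffering latency
        grabber.start();

        // Allocate the target texture once (GL context must be current).
        int texture = glGenTextures();
        glBindTexture(GL_TEXTURE_2D, texture);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, grabber.getImageWidth(),
                grabber.getImageHeight(), 0, GL_RGBA, GL_UNSIGNED_BYTE, (ByteBuffer) null);

        // Per frame: decode on the CPU, then upload into the existing texture.
        Frame frame = grabber.grabImage();
        if (frame != null && frame.image != null) {
            ByteBuffer rgba = (ByteBuffer) frame.image[0];
            // If the decoder pads rows (frame.imageStride != width * 4), the
            // unpack row length would need to be set before this upload.
            glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, frame.imageWidth, frame.imageHeight,
                    GL_RGBA, GL_UNSIGNED_BYTE, rgba);
        }

        grabber.stop();
        grabber.release();
    }
}
```

That per-frame glTexSubImage2D upload is the manual texture update I mentioned; it should be cheap compared to the decode itself, but it is still one more CPU-GPU copy than something like NvPipe would need.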