Wanted to share some experiments I have been doing lately on Pixel Streaming with Cinder through GStreamer and WebRTC.
The main motivation is the ability to have remote servers do all the heavyweight rendering, which can then be streamed and displayed in real time ( < 500ms delay ) to remote peers, independent of their devices' rendering capabilities.
To achieve real-time transmission, the stream is encoded on the GPU, when hardware support is available, through dedicated GStreamer elements ( e.g. nvenc, omx, vaapi etc. ) that can be plugged into the pipeline.
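As a rough illustration ( not the exact pipeline the block builds internally ), a hardware-encoded H.264 stream feeding GStreamer's webrtcbin element might be described like this. Element names are platform-specific assumptions: nvh264enc on desktop NVIDIA, nvv4l2h264enc / omxh264enc on Jetson, vaapih264enc on Intel, with x264enc as a software fallback.

```
videotestsrc is-live=true
  ! videoconvert
  ! nvh264enc                            # swap per platform / fall back to x264enc
  ! h264parse
  ! rtph264pay config-interval=-1
  ! application/x-rtp,media=video,encoding-name=H264,payload=96
  ! webrtcbin name=sendrecv              # handles ICE / DTLS / SRTP negotiation
```

In the block itself the source is of course the Cinder render output rather than videotestsrc, but the general shape — raw video, hardware encoder, RTP payloader, webrtcbin — is the same.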
You can find more info about the idea and how it works here https://github.com/PetrosKataras/Cinder-GstWebRTC in case you want to give it a try. Keep in mind that parts of the WebRTC specification are still very much in flux, so depending on browser, network settings, versions etc. things can produce different results.
Here is a short video of the block in action ( …twitter compressed… ), running a rendering server on an NVIDIA Jetson Nano and streaming the pixels to a macOS Firefox client over LAN.