Hi guys, my setup is very similar to several people's above: I'm streaming from a Ricoh Theta V connected to a Jetson Xavier via USB, using Nickel110's thetauvcsrc combined with a gstreamer pipeline based on this handy example.
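For anyone with a similar setup, the Jetson-side sender looks roughly like this. This is a sketch rather than my exact pipeline: the `mode` property value and the receiver address `192.168.1.100` are placeholder assumptions you'd swap for your own.

```shell
# Sketch of a Jetson-side sender (assumptions: thetauvcsrc is built and
# installed, the Theta V delivers H.264 over UVC, and 192.168.1.100 is
# the Windows machine's LAN address - all placeholders).
gst-launch-1.0 thetauvcsrc mode=4K \
  ! h264parse \
  ! rtph264pay config-interval=1 pt=96 \
  ! udpsink host=192.168.1.100 port=5000 sync=false
```

Because the Theta already encodes H.264 in-camera, the Jetson only parses and payloads here; no re-encode, which helps keep latency down.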
I can view the stream in 4K via DirectX on my WLAN-connected Windows machine using another simple gstreamer (for Windows) pipeline, and the latency is acceptably small (under 0.3 s, at a guess).
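In case it helps anyone, the Windows-side viewer pipeline is along these lines. Again a sketch: the port and payload type are assumptions that would need to match whatever the sender uses.

```shell
# Sketch of a Windows-side viewer (port 5000 and pt=96 are assumptions;
# they must match the sender). d3dvideosink is GStreamer's DirectX sink.
gst-launch-1.0 udpsrc port=5000 \
    caps="application/x-rtp,media=video,encoding-name=H264,payload=96" \
  ! rtph264depay ! h264parse ! avdec_h264 \
  ! videoconvert ! d3dvideosink sync=false
```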
What I actually want, though, is to render the stream into a skybox in Unity on the Windows machine, much like this example, except fed by the RTP stream rather than by a Theta connected directly to the Windows machine via USB. I'd still like to keep latency to a minimum.
Does anyone have this final piece of the puzzle? Or have any ideas about how to go about it?
My best ideas so far are:
- Use VLC, OBS or GStreamer to 'loopback' (if that's the right term) the RTP stream into a dummy Windows webcam device (the Windows equivalent of v4l2loopback), and then use a script identical to the Unity example above.
- Access and decode the H264 encoded RTP stream directly from a C# script in Unity and render that to the skybox texture - is that possible?
- Use a different pipeline on the Xavier side to make the Windows/Unity side simpler, or decode the stream on the Windows side using gstreamer and somehow pass that stream to Unity for rendering.
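For that last idea, one shape it could take (very much untested; all ports, caps and the handoff mechanism are placeholder assumptions on my part) is to let GStreamer on Windows do the depay/decode, then re-serve raw RGB frames over a local socket for a Unity C# script to read:

```shell
# Sketch only: GStreamer decodes the RTP stream on Windows and re-serves
# raw RGB frames on a local TCP port. Port numbers are placeholders;
# a Unity script would then read fixed-size frames from localhost:5001
# and copy them into the skybox texture.
gst-launch-1.0 udpsrc port=5000 \
    caps="application/x-rtp,media=video,encoding-name=H264,payload=96" \
  ! rtph264depay ! h264parse ! avdec_h264 \
  ! videoconvert ! video/x-raw,format=RGB \
  ! tcpserversink host=127.0.0.1 port=5001 sync=false
```

On the Unity side this would mean reading width x height x 3 bytes per frame from the socket and uploading them with something like Texture2D.LoadRawTextureData - no idea yet how well that holds up at 4K, so treat it as a starting point, not a recommendation.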
If you have any thoughts or ideas then let me know. I will also post an update if I manage to succeed!