For my university senior design project, my team has been tasked with creating an autonomous remote-control lawnmower that can also be piloted with a controller and a VR headset. My current goal is to stream from a Theta V to a Linux server, process the feed (to run OpenCV or add HUD elements), and push the final video to a VR headset in real time with reasonably low latency.
As of now, I've been able to retrieve an RTSP stream on my Linux laptop via the Theta's RTSP plugin and a Python script. Within that script I can also run OpenCV processing and display the result with low latency.
I'm currently stumped on how to livestream this processed feed to a VR headset. What would be a good way to approach this? The headset hasn't been finalized yet, so it could be an Oculus Rift, a Vive, a phone with Cardboard, or any other solution, whichever is easiest.
Any recommendations would be greatly appreciated.