For my university senior design project, my team has been tasked with creating an autonomous remote-controlled lawnmower that can also be piloted with a controller and a VR headset. My current goal is to stream from a Theta V to a Linux server, process the feed (to use OpenCV or add HUD elements), and push the final video to a VR headset in real time with reasonably low latency.
As of now, I’ve been able to retrieve an RTSP stream on my Linux laptop via the RTSP plugin and a Python script. With the script, I can also use OpenCV and display the final result with low latency.
I’m currently stumped on how I should livestream this feed to a VR headset. What would be a good way to approach this? The headset hasn’t been finalized yet, so it can be an Oculus Rift, a Vive, a Phone w/ Cardboard, or any other solution, whichever is easiest.
Regarding the first post: I had found it earlier, but I’m not sure it will work for my project. I tested the Amelia viewer and it displayed at a low framerate with high latency. The post also mentions they only achieved an 8 FPS stream. When I ran my Python script over the network, it ran much faster.
Perhaps I didn’t configure it correctly, but I don’t know whether it’s worth reverse engineering if it may not work for my use case. I don’t have much experience with Node.js, so I’m not entirely sure how to get their code working with OpenCV, since there doesn’t seem to be any documentation.
As for the second post, that does look promising. I’ll look into it, but my problem is that I cannot directly connect the Theta V to my PC via USB. I have to use Ubuntu 18.04 to run other ROS functionality, and it doesn’t support UVC 1.5. Because of this, I have to get the stream via RTSP. I also need to process it in Python first (although I could try porting it to C++ with OpenCV if that performs better). I’ll have to see if there is a way to output a stream from my program to Unity.
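One way to get processed frames out of Python and over to Unity is to pipe them into an `ffmpeg` subprocess that re-encodes and re-publishes the stream for a downstream player. This is only a sketch under assumptions: the output URL, resolution, and codec flags below are placeholders, and I haven’t verified end-to-end latency into Unity.

```python
# Hedged sketch: pipe raw OpenCV frames (BGR, uint8) into ffmpeg, which
# re-encodes them and publishes an MPEG-TS stream a Unity client could read.
# The destination URL and encoder settings are assumptions, not a tested pipeline.
import subprocess

WIDTH, HEIGHT, FPS = 1920, 960, 30  # match your processed equirectangular frames


def start_ffmpeg(out_url="udp://127.0.0.1:5000"):  # placeholder destination
    """Launch ffmpeg reading raw frames on stdin and streaming H.264/MPEG-TS."""
    cmd = [
        "ffmpeg", "-y",
        "-f", "rawvideo", "-pix_fmt", "bgr24",
        "-s", f"{WIDTH}x{HEIGHT}", "-r", str(FPS),
        "-i", "-",  # raw frames arrive on stdin
        "-c:v", "libx264", "-preset", "ultrafast", "-tune", "zerolatency",
        "-f", "mpegts", out_url,
    ]
    return subprocess.Popen(cmd, stdin=subprocess.PIPE)


def push_frame(proc, frame):
    """Write one HxWx3 uint8 BGR numpy frame to the encoder."""
    proc.stdin.write(frame.tobytes())
```

In the capture loop, each processed frame would go through `push_frame(proc, frame)` instead of (or in addition to) `cv2.imshow`. On the Unity side, something would still need to receive and decode the stream, e.g. a video player plugin that accepts a network URL.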
Did you ever find a solution for your project? I have a similar project at my university; my components are an HTC Vive, Unity, and a UGV for teleoperation. The plan is to buy a Theta V as soon as I find a solution.
You may want to collect more information before you buy the camera. There are latency issues with telepresence, and there are separate issues with viewing the stream in a headset.
You can stream RTSP or WebRTC either from the camera by itself or through a Linux relay such as a Jetson, and get it to a headset, but there is no single document covering this process. The closest is the Amelia Drone project, though Koen below is using it with a headset.