How to livestream a low-latency Theta V 360 feed from a Linux system to a VR headset?

For my university senior design project, my team has been tasked with creating an autonomous remote-control lawnmower that can also be piloted with a controller and a VR headset. My current goal is to stream from a Theta V to a Linux server, process the feed (to use OpenCV or add HUD elements), and push the final video to a VR headset in real time with reasonably low latency.

As of now, I’ve been able to retrieve an RTSP stream on my Linux laptop via the RTSP plugin and a Python script. With the script, I can also use OpenCV and display the final result with low latency.
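
For illustration, a capture loop along those lines might look like the following. This is a minimal sketch; the stream URL is a placeholder you would replace with whatever address your RTSP plugin actually reports:

```python
import cv2

# Placeholder URL -- substitute the address the RTSP plugin reports.
STREAM_URL = "rtsp://192.168.1.1:8554/live"

cap = cv2.VideoCapture(STREAM_URL)
if not cap.isOpened():
    raise RuntimeError("Could not open RTSP stream: " + STREAM_URL)

while True:
    ok, frame = cap.read()
    if not ok:
        break  # stream dropped

    # Example processing step: draw a simple HUD element on the frame.
    cv2.putText(frame, "SPEED: 0.0 m/s", (20, 40),
                cv2.FONT_HERSHEY_SIMPLEX, 1.0, (0, 255, 0), 2)

    cv2.imshow("Theta V feed", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```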

I’m currently stumped on how I should livestream this feed to a VR headset. What would be a good way to approach this? The headset hasn’t been finalized yet, so it could be an Oculus Rift, a Vive, a phone with Cardboard, or any other solution, whichever is easiest.

Any recommendations would be greatly appreciated.

Would this work with WebVR? You can easily implement WebVR with A-Frame.

You could also wire up an HTC Vive as the headset.
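
For what the A-Frame side could look like, here is an untested sketch: a tiny Python server that hands out a page with an <a-videosphere>, which wraps an equirectangular video around the viewer. The "/live.mp4" source is a placeholder; you would point it at whatever browser-playable stream (HLS, WebRTC, etc.) you end up producing:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Minimal A-Frame page. <a-videosphere> maps an equirectangular video
# onto the inside of a sphere. "/live.mp4" is a placeholder stream URL.
PAGE = b"""<!DOCTYPE html>
<html>
  <head>
    <script src="https://aframe.io/releases/1.4.0/aframe.min.js"></script>
  </head>
  <body>
    <a-scene>
      <a-assets>
        <video id="feed" src="/live.mp4" autoplay muted playsinline></video>
      </a-assets>
      <a-videosphere src="#feed"></a-videosphere>
    </a-scene>
  </body>
</html>"""

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(PAGE)

HTTPServer(("0.0.0.0", 8000), Handler).serve_forever()
```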

About the first post: I found that post earlier, but I’m not sure it will work for my project. I tested the Amelia viewer, and it displayed at a low framerate with high latency. The author also mentioned only achieving an 8 FPS stream. When I ran my Python script over the network, it was able to run much faster.

Perhaps I didn’t configure it correctly, but I don’t know if I should try to reverse engineer it when it may not work for my use case. I don’t have much experience with Node.js, so I’m not entirely sure how to get that code to work with OpenCV, as there didn’t seem to be any documentation.

For the second post: that does look promising. I’ll look into it, but my problem is that I cannot directly connect the Theta V to my PC via USB. I have to use Ubuntu 18.04 to run other ROS functionality, but it doesn’t support UVC 1.5, so I have to get the stream via RTSP. Plus, I need to process the feed in Python first (although I could try porting it to OpenCV with C++ if that works better). I’ll have to see if there is a way to output a stream from my program to Unity.
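
One hypothetical way to get processed frames back out of a Python script is to re-encode them with a GStreamer-backed cv2.VideoWriter and push H.264 over RTP/UDP, which Unity (or anything with a GStreamer/FFmpeg backend) could then pick up. This sketch assumes an OpenCV build compiled with GStreamer support; the URLs, host, and port are placeholders:

```python
import cv2

WIDTH, HEIGHT, FPS = 1920, 960, 30  # match these to the incoming feed

# GStreamer pipeline: raw frames in, low-latency H.264 over RTP/UDP out.
# Requires OpenCV built with GStreamer; host and port are placeholders.
pipeline = (
    "appsrc ! videoconvert "
    "! x264enc tune=zerolatency speed-preset=ultrafast bitrate=4000 "
    "! rtph264pay config-interval=1 pt=96 "
    "! udpsink host=192.168.1.50 port=5000"
)

cap = cv2.VideoCapture("rtsp://192.168.1.1:8554/live")  # placeholder URL
out = cv2.VideoWriter(pipeline, cv2.CAP_GSTREAMER, 0, FPS, (WIDTH, HEIGHT))

while True:
    ok, frame = cap.read()
    if not ok:
        break
    frame = cv2.resize(frame, (WIDTH, HEIGHT))
    # ... OpenCV processing / HUD drawing goes here ...
    out.write(frame)

cap.release()
out.release()
```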

Someone from the community is working on a user-mode Linux driver to support THETA streaming with UVC 1.5. I think it’s close to being made public.

I have not installed Unity for Linux, but it seems to be available.

This is a different approach using VP9 and a Unity embedded browser.

IMO, your best chance of quick success is to connect the HTC Vive to your computer with a USB cable.

Hello,

You should try the “Device WEB API Plug-in”; just have a look at the VR view (for the Japanese parts, use your phone with dynamic translation from the camera…).

A VR headset is of no use for my own project, but if I had to do it:
RTSP => WebRTC => play the stream in a browser inside the VR headset (see the sketch below)

Work with three.js and an HTML canvas to change the view (maybe copy some JavaScript code from the “Device WEB API Plug-in”).

This should work on any VR headset.

Hugues

https://en.device-webapi.org/
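
For the RTSP => WebRTC step Hugues describes above, here is an untested sketch using the Python aiortc library: the server pulls the RTSP feed with a MediaPlayer and answers a browser’s WebRTC offer with that video track. The camera URL is a placeholder, and the browser side (posting the offer, attaching the received track to a <video> element for three.js or A-Frame) is omitted:

```python
import asyncio

from aiohttp import web
from aiortc import RTCPeerConnection, RTCSessionDescription
from aiortc.contrib.media import MediaPlayer

RTSP_URL = "rtsp://192.168.1.1:8554/live"  # placeholder camera URL
pcs = set()

async def offer(request):
    # The browser POSTs its SDP offer as JSON: {"sdp": ..., "type": "offer"}
    params = await request.json()
    pc = RTCPeerConnection()
    pcs.add(pc)

    # Pull the RTSP feed (decoded by FFmpeg) and forward its video track.
    player = MediaPlayer(RTSP_URL)
    pc.addTrack(player.video)

    await pc.setRemoteDescription(
        RTCSessionDescription(sdp=params["sdp"], type=params["type"]))
    answer = await pc.createAnswer()
    await pc.setLocalDescription(answer)

    return web.json_response(
        {"sdp": pc.localDescription.sdp, "type": pc.localDescription.type})

async def on_shutdown(app):
    # Close any open peer connections on exit.
    await asyncio.gather(*(pc.close() for pc in pcs))

app = web.Application()
app.router.add_post("/offer", offer)
app.on_shutdown.append(on_shutdown)
web.run_app(app, port=8080)
```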

Hello @Emertxe!

Did you ever find a solution for your project? I have a similar project at my university; my components are an HTC Vive, Unity, and a UGV for teleoperation. The plan is to buy a Theta V as soon as I find a solution.

Looking forward to your response.

You may want to collect more information before you buy the camera. There are latency issues with telepresence and there are different issues with viewing the stream in a headset.

You can stream RTSP or WebRTC either from the camera by itself or through a Linux relay like a Jetson, and get it to a headset, but there is no single document covering the whole process. The closest is the Amelia Drone project, though Koen below is using it with a headset.
