Successful Theta V stream from drone to VR headset 0.25 miles away

livestreaming
drone
headset
#1

Just figured I would drop a little project update here letting people know about the senior design project I worked on that used a Ricoh Theta V. I know @codetricity and others on here would definitely be interested, since I have seen people talking about sending Ricoh Theta V video from a drone, but as far as I know no one has demonstrated anything yet.

We built a system that allows a ground operator to view a live video stream from a drone in a VR headset. The system sends the entire 360-degree video to the ground station, where it is recorded and streamed live to the pilot wearing a VR headset. During testing we were able to get 1920x960 video streamed to our ground station a quarter mile away with less than 200 ms of lag! The payload contains all the transmission and filming hardware and can be mounted onto any drone easily (so long as it can lift a ~3 kg payload).

Here are some demo videos. You will notice that FT1 is broken up; that is because, in our excitement, we forgot to keep the ground antenna pointed at the drone. But we were able to reconnect while it was in the air and continue streaming and recording:

System Overview
You can see a quick pic of the setup in this photo. The camera is mounted in a DJI Osmo Mobile 2 gimbal, which is then attached to our 3D printed payload. The white thing on the other side of the payload is the transmission antenna we used. The antenna is a Ubiquiti Omni antenna connected to a Ubiquiti Rocket 5AC Lite. An Omnicharge 13 (power for the hardware) and the PoE adapter for the Ubiquiti hardware are stuffed inside the gray payload.

image

We used Ubiquiti transmission hardware so we could transmit all the data we needed at long ranges. On the ground station we had a Ubiquiti NanoBeam 5AC that we would point at the drone to make sure we had good signal strength. These two pieces of hardware act as a point-to-point bridge that connects the ground station PC to the camera on a single network. We connected the Theta V to the Rocket 5AC using a USB-to-Ethernet adapter, as has been mentioned on here before. We used a setup similar to the one shown here so that we could charge the camera while using it and have effectively unlimited recording time.

On the ground station PC we used some software I wrote that is freely available on GitHub.

https://github.com/dirtshell/amelia_viewer

Here is a demo video of the software. Please disregard my messy room :sweat_smile:

The software does the following:

  • Communicate with the camera using the Ricoh API
  • Stream up to 1920x960 live from the camera
  • Record video while also displaying it
  • Play back recorded videos
  • Display live or pre-recorded videos in a VR headset (Oculus/Vive)
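For anyone curious about the plumbing behind the first two bullets: the THETA's live preview arrives from the OSC API (`camera.getLivePreview`) as one continuous MJPEG stream, so the viewer has to split it into individual JPEG frames before it can display or record them. Below is a minimal sketch of that splitting step, my own illustration rather than the actual amelia_viewer code, which simply scans for the JPEG start/end markers:

```javascript
// Sketch: splitting an MJPEG preview stream into individual JPEG frames.
// Frames are delimited by the JPEG start-of-image (0xFFD8) and
// end-of-image (0xFFD9) markers. Illustrative only, not amelia_viewer code.

function extractFrames(buffer) {
  const SOI = Buffer.from([0xff, 0xd8]); // JPEG start-of-image marker
  const EOI = Buffer.from([0xff, 0xd9]); // JPEG end-of-image marker
  const frames = [];
  let start = buffer.indexOf(SOI);
  while (start !== -1) {
    const end = buffer.indexOf(EOI, start + 2);
    if (end === -1) break; // incomplete frame; wait for more data
    frames.push(buffer.slice(start, end + 2));
    start = buffer.indexOf(SOI, end + 2);
  }
  return frames;
}
```

A real implementation would run this incrementally over chunks as they arrive from the socket, but the marker-scanning idea is the same.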

Future Work
TBH the drone platform I had to build is pretty rock solid, and so is the transmission system, but obviously better transmission hardware could be purchased and the payload's weight could be decreased. The current gimbal may also be overkill. Most of the remaining improvements involve the software.

The first big hiccup in the software is that while I originally built it on Electron, it turns out the fork of Chromium that Electron is based on did not support the OpenVR bindings for the Oculus Rift when I wrote it (I still think it doesn't). So even though you can run everything in Electron, I ended up hosting the UI from an NPM app started inside the Electron app and viewing the actual UI in Firefox, and then things worked. Unfortunately this is a major holdup, because some of the desktop-oriented features of Electron cannot be used in the browser. But when Electron supports the Oculus Rift I will port everything back into the Electron app so you don't have to open Firefox.

I also had to write this code in a very short amount of time (~2 weeks), so I had to settle for the official Ricoh API, which is buggy and has limited options. For instance, I can't stream in 4K at all, and my framerate for 1920x960 is limited to 8 FPS (bleh). I am also not a big fan of streaming using MJPEG, since my sketchy “player” doesn't handle dropped frames well and the format isn't optimized for limited-bandwidth conditions. Also, you will notice that in FT pt 2 the video is stuttery. That is because of a bug in the Ricoh API AFAIK, because you get the same weird stuttering in the official Ricoh preview app. But maybe their video player is just as bad as mine. So future work would involve writing a custom camera plugin that allows me to stream in 4K using H.265.

The UI is also less than stellar. When things occasionally fail, they fail silently; I currently just dump all my messages and warnings to the console log. I need to display pop-up messages to let the user know what is going on. The video pumped into the headset is also just the raw video from the camera. Ideally I would present the user with all sorts of telemetry data that helps them fly effectively and know what they are doing (drone battery percentage, heading, orientation, speed, range, etc.). I use A-Frame to get VR functionality, and it is relatively simple to add these kinds of things to the video stream AFAIK. I would also like to “auto orient” the camera's view using the camera's accelerometer. Right now you have to manually specify the orientation of the video, or else the video's orientation will correspond to the orientation of the camera (which in our case is sideways lol).
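The “auto orient” idea boils down to a bit of trigonometry: the accelerometer's gravity vector tells you how far the camera is tilted, and applying the opposite rotation to the videosphere levels the view. A sketch of that math (`levelFromAccel` is a hypothetical helper name, and the axis convention is an assumption; the project never implemented this):

```javascript
// Sketch: estimate camera pitch/roll from its accelerometer so the
// videosphere can be auto-leveled. Assumes (ax, ay, az) is the gravity
// vector in the camera frame, with +z "up" when the camera is level.
// Units don't matter, only direction. Hypothetical, not project code.

function levelFromAccel(ax, ay, az) {
  const rad2deg = 180 / Math.PI;
  const pitch = Math.atan2(ay, Math.sqrt(ax * ax + az * az)) * rad2deg;
  const roll = Math.atan2(-ax, az) * rad2deg;
  // Applying the negated angles to the A-Frame videosphere cancels the
  // tilt, e.g.: videosphere.setAttribute('rotation', `${-pitch} 0 ${-roll}`);
  return { pitch, roll };
}
```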

Conclusion
Overall the project was a great success though! I can't tell you how cool it was to fly this thing using the VR headset. It really felt like the future. There are no systems that do what ours does (at least none that have been publicly demonstrated). Some companies try to get a similar effect by using a gimbal synced up to a headset, but they aren't actually streaming and recording the entire 360 video, just a single FOV. This technology would be massively useful for search and rescue operations because multiple people can view the feed at the same time. It would also be great for surveying and surveillance. There were literally no other cameras that could've done this project. It was only possible because the Theta has a dev community and all this documentation. The only other camera on the market I could even find capable of streaming live video at all was a Garmin Virb, but that only did a single hemisphere, and there was no proof-of-concept code or footage showing it streaming live video over a network.

Unfortunately, I had to hand over all the hardware at the end of the semester, so I don't have any of it anymore and won't be able to write the new plugin. Maybe after I start working in October and have some cash I will pick up a Theta V. In the meantime I will probably work on including telemetry in the viewer.

I really think this is the future of drone flight, and I am really happy I found a cool community like this that had all the info and resources I needed to make this project a success :grinning:

Enjoy this video showing a recording we made in 4K. You can view the video in 360 on YouTube. This kind of video quality is the goal. With some more work I believe it is possible.

If you have any questions hit me up! I would be more than happy to answer any questions about the setup!

#2

This is fantastic! I put this out on the Facebook User Group to get more feedback on this wonderful project.

There’s some comments and discussion here.

image

For this future project:

So future work would involve writing a custom camera plugin that allows me to stream in 4K using H.265.

I would like to put this idea up on this ideas database:

http://ideas.theta360.guide/

#3

@Jake_Kenin, this is fantastic! Really fun to see the videos - especially the “2019-04-17 Flight Test: Recorded on board video” - and dig through all the details. I have some things that I’m curious about:

  1. Why did you choose to build this? You say it’s for your RIT Senior Design Project. Can you tell me a little bit more about that?

  2. Do you have more pictures of the rig? I'm interested in seeing how the THETA is housed in the gimbal. And also the base station antenna.

  3. Is it both recorded to the THETA and live streaming to the VR headset? Or is it one or the other at a time?

  4. You stated that it gets “less than 200ms of lag.” Is that mostly due to the Ubiquiti Omni antenna? What software components would you say are most important for reducing lag?

  5. Did you test with more than one person viewing at a time? It’s capable of that?

  6. In the flight test video, I noticed the guy in yellow (not controlling the drone, as far as I can tell) starts sitting/lying down at around 4:22. That's not related to the test in any way, correct? Or is it?

Thanks so much for posting the details and results and all the videos. Your enthusiasm in the conclusion is infectious. :slight_smile:

#4

@codetricity, my testing revealed that RTMP and RTSP could deliver smooth and high resolution video (tested using the WebAPI plugin and VLC), but it was too laggy (greater than 1 second) to be used safely with a drone moving at high speeds. Meanwhile, if I use MJPEG with the Ricoh API I can prioritize the most recent frame to make sure I minimize the delay (~250 ms at 1920x960 @ 8fps at 0.25 miles, ~100 ms at 1024x512 @ 30fps at 0.25 miles). Sorry I don’t have any actual benchmarks on hand. If I were to write the H.265 plugin I would make sure there was a way to synchronize the clocks so I could properly timestamp each frame and accurately measure the delay. Also, thanks for promoting this on the ideas page :grin:
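The clock-synchronization idea mentioned above is essentially the NTP handshake: exchange one timestamped round trip, estimate the camera clock's offset from the ground station's, then convert each frame's remote timestamp into local time. A sketch, with hypothetical function names (the project never implemented this):

```javascript
// Sketch: NTP-style offset estimate so per-frame one-way delay can be
// measured. t0/t3 are local send/receive times; t1/t2 are the remote
// (camera-side) receive/send times echoed back. Assumes a roughly
// symmetric link. Hypothetical helpers, not project code.

function clockOffset(t0, t1, t2, t3) {
  // Offset of the remote clock relative to the local clock (ms).
  return ((t1 - t0) + (t2 - t3)) / 2;
}

function frameDelay(frameStampRemote, arrivalLocal, offset) {
  // Convert the remote timestamp to local time, then subtract.
  return arrivalLocal - (frameStampRemote - offset);
}
```

With the offset in hand, each frame's true glass-to-glass-ish delay is just arrival time minus the corrected capture time, no synchronized wall clocks required.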

@jcasman

  1. Why did you choose to build this? You say it’s for your RIT Senior Design Project. Can you tell me a little bit more about that?

For our senior year, engineering students participate in a senior design project. I participated in the multidisciplinary senior design course, where students from different engineering majors work together on a sponsored project. Projects are proposed by either students, faculty members, or outside companies. Then the faculty assigns students to projects based on their co-op / internship experience. This project was proposed and sponsored by Lockheed Martin, where I am sure this project will sit on a shelf while smaller and faster companies perfect it.

  2. Do you have more pictures of the rig? I’m interested in seeing how the THETA is housed in the gimbal. And also the base station antenna.

Here is a GIF of the gimbal with the camera adapter in it.

Here it is from another angle.

image

Here is the exploded view of the payload.

The base station setup is really bare-bones. It consists solely of a NanoBeam 5AC connected to the base station via an Ethernet cable. We literally just had a team member aim the antenna at the drone. Since the entire setup was made to work offline, a simple DHCP server ran on the base station and assigned an IP to the camera.

  3. Is it both recorded to the THETA and live streaming to the VR headset? Or is it one or the other at a time?

Unfortunately no, you can only do one or the other. If you are streaming video via the Ricoh API, you cannot also be recording video to the camera’s storage. But that is another limitation of using the Ricoh API; you should be able to do both simultaneously with a custom plugin. That said, the stream that is seen in the VR headset is automatically recorded to the base station PC.
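Recording on the base station while also displaying is conceptually just a fan-out: every decoded frame goes to both the display and a recorder sink. A tiny sketch of that pattern (`makeFrameTee` is a hypothetical name; in practice the recorder sink could be an `fs.createWriteStream` appending MJPEG frames):

```javascript
// Sketch: fan each incoming frame out to multiple sinks (e.g. the live
// display and a file recorder) so viewing and recording happen at once.
// Illustrative pattern only, not the amelia_viewer implementation.

function makeFrameTee(sinks) {
  return (frame) => {
    for (const sink of sinks) sink(frame);
  };
}
```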

  4. You stated that it gets “less than 200ms of lag.” Is that mostly due to the Ubiquiti Omni antenna? What software components would you say are most important for reducing lag?

From my VERY amateur research there are two things that go a long way to reduce lag:

  • Protocol and implementation: This goes a long way. Some implementations have a lot of overhead. The usual tradeoff is video smoothness vs resolution vs framerate vs latency. I can’t think of any “specific” software components that would speed things up, because my software implementation is pretty basic: I just receive video from the Ricoh API, treat it as a black box, grab each frame, and then update an A-Frame videosphere. I haven’t done any specific testing regarding delays on the A-Frame side, since I was short on time and sort of stuck with the Ricoh API.
  • Transmission hardware: You want as much gain as possible regardless of the orientation of the drone with respect to the ground antenna (hence the omni antenna on the drone). The latency across the network was only ~1 ms. You want minimal network latency with as much bandwidth as possible.
    Getting this kind of signal strength at a distance on a moving target is difficult, so in practice we usually only saw a throughput of ~40 Mbps, and the latency would vary. Ubiquiti uses custom comms protocols to establish high throughput at range on stationary targets, which explains why the performance varied so much on our moving target.
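The “prioritize the most recent frame” trick mentioned earlier is the main software lever on latency: instead of queueing frames, keep a single slot that the renderer drains, and let anything stale be overwritten. A sketch of that idea (my own illustration of the tradeoff, not the project's code):

```javascript
// Sketch: a single-slot frame buffer that always serves the newest frame
// and silently drops older ones, trading smoothness for low latency.
// Illustrative only.

class LatestFrameBuffer {
  constructor() {
    this.frame = null;
    this.dropped = 0; // count of frames overwritten before being shown
  }
  push(frame) {
    if (this.frame !== null) this.dropped++;
    this.frame = frame;
  }
  take() {
    const f = this.frame;
    this.frame = null;
    return f;
  }
}
```

This is why MJPEG worked well here: each frame is independent, so dropping one costs nothing, whereas an inter-frame codec would have to wait for the missing data.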

I didn’t test any other transmission hardware unfortunately, but I would imagine most other back-haul network hardware would perform comparably to the Ubiquiti hardware. When I tested my streaming software using just the built in WiFi adapter on the Theta, I got similar performance to when I used the Ubiquiti hardware. That said, the built in WiFi on the Theta wouldn’t have been able to get the range we needed.

  5. Did you test with more than one person viewing at a time? It’s capable of that?

Nope. You should be able to, though; due to VR system limitations you won’t be able to do it all from one PC. I haven’t thought through the details.

  6. In the flight test video, I noticed the guy in yellow (not controlling the drone, as far as I can tell) starts sitting/lying down at around 4:22. That’s not related to the test in any way, correct? Or is it?

Nope, I think he was going to pick up a camera being used for filming lol.

#5

This project was selected as “Project of the Month”

It now has a special web page and is featured on the front page of the theta360.guide site. :slight_smile:

In addition, I’m planning to do a writeup on Code Project. Will provide full attribution to Jake.

We’ve been trying to solve this problem for a while. Thought it would be good to get the word out widely.

#6

HUZZAH! I am honored =)

I really have to say, it was only possible because the community was so open and the SDK was readily available and easy to use. I hope to inspire others to try this out. The cost of the payload and transmission system was ~$1k, so it isn’t cheap, but definitely within experimentation range.

image

#7

Great Job !

I’m using my Theta over USB because with Wi-Fi on it’s not possible to maintain the battery level. But I lose the MJPEG live-stream preview. I think I will try a Y cable and Ethernet.

#8

This is a really cool project!

Drone augmentation has been a secret passion of mine.

I know that the 5 GHz radio has a leaner signal than 2.4 GHz. Was this choice intentional?

There could be some range gained if you went with a directional antenna too. Sounds like you learned what I learned about omni antennas and making sure they are angled correctly!

The drone itself looks like a DJI Matrice. Is that your personal machine or does it belong to RIT?

#9

@Geospatial So we knew we were going to be doing all of our testing and POC in an open field, so we didn’t really need the robust signal strength you get from 2.4 GHz (no obstacles to worry about in an open field). Also, our control radio uses 2.4 GHz, and if we had opted to use an FPV camera as a backup for flying the drone, a lot of those use 2.4 GHz as well.

We actually ended up placing an omni-directional antenna on the drone itself and a directional antenna on the ground station. I thought about using two directional antennas, but since a drone’s orientation changes constantly, keeping the two directional antennas pointed at each other would be a difficult feat. I would have needed to set up at least a single-axis servo for rotating the antenna, plus a small microcontroller to control the servo based on the GPS locations of the ground station and the drone. We figured that setup would have been a pain to get working alongside all the other work we had to do.
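For anyone who does want to try the servo-aiming route: the heading the servo needs is just the initial great-circle bearing from the ground station's GPS fix to the drone's. A sketch (standard formula; purely illustrative, since our team aimed the antenna by hand):

```javascript
// Sketch: initial great-circle bearing from the ground station
// (lat1, lon1) to the drone (lat2, lon2), in degrees clockwise from
// north. This is the heading a pan servo would drive toward.
// Illustrative only, not project code.

function bearingDeg(lat1, lon1, lat2, lon2) {
  const toRad = Math.PI / 180;
  const phi1 = lat1 * toRad;
  const phi2 = lat2 * toRad;
  const dLon = (lon2 - lon1) * toRad;
  const y = Math.sin(dLon) * Math.cos(phi2);
  const x = Math.cos(phi1) * Math.sin(phi2) -
            Math.sin(phi1) * Math.cos(phi2) * Math.cos(dLon);
  return ((Math.atan2(y, x) * 180) / Math.PI + 360) % 360; // 0 = north
}
```

At the short ranges involved here (a quarter mile) a flat-earth approximation would work just as well, but the great-circle form costs nothing extra.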

Getting a drone was sort of a PITA. First I had to explain to our guide that $4k wasn’t enough to buy a beefy base station laptop, a heavy-lift drone, transmission hardware, a camera, controls for flying the drone, a VR headset, and still have some spare for repairs and other incidental costs. I finally managed to bump things up to $5k, but not before DJI officially discontinued the S900 and the market for them dried up. Our school program’s policy was that you needed to use an American retailer, meaning we couldn’t use the uber-cheap drones from China. So I actually had to build this one up using a cheap heavy-lift drone frame (GAUI 950H) from a small American RC shop. I bought the frame, then spent some time on GetFPV getting motors, props, ESCs, radios, a PX4 flight controller, the works. Once I had it assembled it worked great, and honestly I am glad I went with a PX4-based drone because it made things easier to configure and debug compared to a NAZA or some other controller. So I was pretty sad when I had to hand it over to RIT =(

#10

Article on CodeProject just came out. Already has 1.4K views. Strong interest in Jake Kenin’s cool project. :slight_smile:

#11

I gave it a 5 out of 5 stars!

If you get a chance, read @codetricity’s article, and give it your rating or a comment.
