Successful Theta V stream from drone to VR headset 0.25 miles away

Just figured I would drop a little project update here about the senior design project I worked on that used a Ricoh Theta V. I know @codetricity and others on here would definitely be interested, since I have seen people talking about sending Ricoh Theta V video from a drone, but as far as I know no one has demonstrated anything yet.

We built a system that allows a ground operator to view a live video stream from a drone in a VR headset. The system sends the entire 360-degree video to the ground station, where it is recorded and streamed live to the pilot wearing a VR headset. During testing we were able to stream 1920x960 video to our ground station a quarter mile away with less than 200ms of lag! The payload contains all the transmission and filming hardware and can be mounted onto any drone easily (so long as it can lift a ~3 kg payload).

Here are some demo videos. You will notice that FT1 is broken up. That is because in our excitement we forgot to keep the ground antenna pointed at the drone. But we were able to reconnect to it while it was in the air and continue streaming and recording:

System Overview
You can see a quick pic of the setup in this photo. The camera is mounted in a DJI Osmo Mobile 2 gimbal, which is then attached to our 3D-printed payload. The white thing on the other side of the payload is the transmission antenna we used: a Ubiquiti Omni antenna connected to a Ubiquiti Rocket 5AC Lite. An Omnicharge 13 (power for the hardware) and the PoE adapter for the Ubiquiti hardware are stuffed inside the gray payload.

image

We used Ubiquiti transmission hardware so we could transmit all the data we needed at long range. On the ground station we had a Ubiquiti NanoBeam 5AC that we would point at the drone to make sure we had good signal strength. These two pieces of hardware act as a point-to-point bridge that puts the ground station PC and the camera on a single network. We connected the Theta V to the Rocket 5AC using a USB-to-Ethernet adapter, as has been mentioned on here before. We used a setup similar to the one shown here so that we could charge the camera while using it and have basically unlimited recording time.

On the ground station PC we used some software I wrote that is freely available on GitHub.

Here is a demo video of the software. Please disregard my messy room :sweat_smile:

The software does the following:

  • Communicate with the camera using the Ricoh API
  • Stream up to 1920x960 live from the camera (see the sketch after this list)
  • Record video while also displaying it
  • Play back recorded videos
  • Display live or pre-recorded videos in a VR headset (Oculus/Vive)
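If you're curious what talking to the camera looks like, here is a minimal sketch (not the exact code from my repo) of starting the preview stream through the official Web API. The IP below is a placeholder for whatever your DHCP server assigns, and depending on your setup the camera may also want digest auth, which I'm skipping here:

```typescript
// Minimal sketch: start the THETA's MJPEG preview via the OSC Web API.
// The camera IP is a placeholder -- use whatever your DHCP server assigned.
const CAMERA = "http://192.168.1.100";

async function startLivePreview(): Promise<ReadableStream<Uint8Array>> {
  // camera.getLivePreview is part of the official RICOH THETA API v2.1.
  const res = await fetch(`${CAMERA}/osc/commands/execute`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ name: "camera.getLivePreview" }),
  });
  if (!res.ok || !res.body) {
    throw new Error(`Preview request failed: ${res.status}`);
  }
  // The response body is a multipart MJPEG stream: an endless sequence of
  // JPEG frames separated by part boundaries.
  return res.body;
}
```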

Future Work
TBH the drone platform I had to build is pretty rock solid and so is the transmission system, but obviously better transmission hardware could be purchased, and the payload’s weight could be decreased. The current gimbal may also be overkill. Most of the improvements that can be made involve the software.

The first big hiccup in the software is that while I originally built it on Electron, it turns out the fork of Chromium that Electron is based on did not support the OpenVR bindings for the Oculus Rift when I wrote it (I still think it doesn't). So even though everything can run inside Electron, I ended up hosting the UI from a small NPM server that the Electron app starts, and then viewing the actual UI in Firefox, where everything worked. Unfortunately this is sort of a major hold-up because some of the desktop-oriented features of Electron cannot be used in the browser. Once Electron supports the Oculus Rift I will port everything back into the Electron app so you don't have to open Firefox.
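For reference, the workaround boils down to something like this (a minimal sketch, not the exact code from my repo; the port and directory are placeholders): the Electron app spins up a tiny HTTP server and Firefox loads the WebVR UI from it.

```typescript
// Minimal sketch of the Firefox workaround: serve the UI over plain HTTP
// from inside the Electron main process so a browser with working Oculus
// support can load it. "ui/" and the port are placeholders.
import express from "express";

const app = express();
app.use(express.static("ui")); // the built A-Frame/WebVR front end

app.listen(8080, () => {
  console.log("Open http://localhost:8080 in Firefox for VR output");
});
```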

I also had to write this code in a very short amount of time (~2ish weeks), so I had to settle for the official Ricoh API, which is buggy and has limited options. For instance, I can't stream in 4K at all, and my framerate for 1920x960 is limited to 8 FPS (bleh). I am also not a big fan of streaming using MJPEG, since my sketchy “player” doesn't handle dropped frames well and the format isn't optimized for limited-bandwidth conditions. Also, you will notice that for FT pt 2, the video is stuttery. That is because of a bug in the Ricoh API AFAIK, because you get the same weird stuttering in the official Ricoh preview app. But maybe their video player is just as bad as mine. So future work would involve writing a custom camera plugin that allows me to stream in 4K using H.265.
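To give a sense of what “handling dropped frames” means here: an MJPEG stream is basically back-to-back JPEGs, so a player can scan for the JPEG start/end markers and always render the newest complete frame, throwing away anything stale. A rough sketch (not the exact code from my repo; the marker scan is naive but fine for preview frames):

```typescript
// "Newest frame wins" MJPEG consumer: slice JPEG frames (FF D8 ... FF D9)
// out of the byte stream and display only the most recent complete one.
async function consumePreview(
  stream: ReadableStream<Uint8Array>,
  onFrame: (jpeg: Uint8Array) => void,
): Promise<void> {
  const reader = stream.getReader();
  let buf = new Uint8Array(0);
  for (;;) {
    const { value, done } = await reader.read();
    if (done) break;
    // Append the new chunk onto the working buffer.
    const grown = new Uint8Array(buf.length + value.length);
    grown.set(buf);
    grown.set(value, buf.length);
    buf = grown;
    // Pull every complete frame out of the buffer, keeping only the newest
    // one -- rendering stale frames just adds latency.
    let newest: Uint8Array | null = null;
    let start = indexOfPair(buf, 0xff, 0xd8, 0);
    while (start >= 0) {
      const end = indexOfPair(buf, 0xff, 0xd9, start + 2);
      if (end < 0) break; // frame still incomplete; wait for more data
      newest = buf.slice(start, end + 2);
      buf = buf.slice(end + 2);
      start = indexOfPair(buf, 0xff, 0xd8, 0);
    }
    if (newest) onFrame(newest); // render only the latest complete frame
  }
}

// Naive scan for a two-byte marker; fine for preview-sized frames.
function indexOfPair(b: Uint8Array, m1: number, m2: number, from: number): number {
  for (let i = from; i < b.length - 1; i++) {
    if (b[i] === m1 && b[i + 1] === m2) return i;
  }
  return -1;
}
```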

The UI is also less than stellar. When things occasionally fail, they fail silently; I currently just dump all my messages and warnings to the console log, so I need to display pop-up messages to let the user know what is going on. The video pumped into the headset is also just the raw video from the camera. Ideally I would present the user with all sorts of telemetry data that helps them fly effectively and stay aware of what they are doing (drone battery percentage, heading, orientation, speed, range, etc.). I use A-Frame for the VR functionality, and it is relatively simple to add these kinds of things to the video stream AFAIK. I would also like to “auto orient” the camera's view using the camera's accelerometer. Right now you have to manually specify the orientation of the video, or else the video's orientation will correspond to the orientation of the camera (which in our case is sideways lol).
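To illustrate how simple the A-Frame side is, here is a stripped-down sketch of the manual orientation fix and the telemetry overlay idea (the element IDs and telemetry values are made up, not from my repo):

```typescript
// Sketch of the A-Frame side. "#viewSphere" is a made-up id for the
// <a-videosphere> that displays the stream. The camera rides sideways in
// the gimbal, so the equirectangular video needs a constant 90-degree roll;
// an "auto orient" version would derive this from the camera's accelerometer.
const sphere = document.querySelector<HTMLElement>("#viewSphere");
sphere?.setAttribute("rotation", "0 0 90"); // degrees around x, y, z

// The telemetry overlay idea is the same trick: pin an <a-text> entity to
// the user's camera so it acts as a HUD. Values here are placeholders.
const hud = document.createElement("a-text");
hud.setAttribute("value", "BATT 87%  HDG 214  SPD 12 m/s");
hud.setAttribute("position", "0 -0.5 -1.5"); // just below center of view
document.querySelector("a-camera")?.appendChild(hud);
```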

Conclusion
Overall the project was a great success though! I can't tell you how cool it was to fly this thing using the VR headset. It really felt like the future. There are no systems that do what we do (at least none that have been publicly demonstrated). Some companies try to get a similar effect by using a gimbal synced up to a headset, but they aren't actually streaming and recording the entire 360 video, just a single FOV. This technology would be massively useful for search and rescue operations because multiple people can view the feed at the same time. It would also be great for surveying and surveillance. There were literally no other cameras that could've done this project; it was only possible because the Theta has a dev community and all this documentation. The only other camera on the market I could even find capable of streaming live video at all was a Garmin Virb, but that only covered a single hemisphere, and there was no proof-of-concept code or footage showing it streaming live video over a network.

Unfortunately I had to hand over all the hardware at the end of the semester, so I won't be able to write the new plugin for now. Maybe after I start working in October and have some cash I will pick up a Theta V. In the meantime I will probably work on including telemetry in the viewer.

I really think this is the future of drone flight, and I am really happy I found a cool community like this that had all the info and resources I needed to make this project a success :grinning:

Enjoy this video showing a recording we made in 4K. You can view the video in 360 on YouTube. This kind of video quality is the goal. With some more work I believe it is possible.

If you have any questions, hit me up! I would be more than happy to talk through any part of the setup!

5 Likes

This is fantastic! I put this out on the Facebook User Group to get more feedback on this wonderful project.

There are some comments and discussion here.

image

For this future project:

So future work would involve writing a custom camera plugin that allows me to stream in 4K using H.265.

I would like to put this idea up on this ideas database:

http://ideas.theta360.guide/

@Jake_Kenin, this is fantastic! Really fun to see the videos - especially the “2019-04-17 Flight Test: Recorded on board video” - and dig through all the details. I have some things that I’m curious about:

  1. Why did you choose to build this? You say it’s for your RIT Senior Design Project. Can you tell me a little bit more about that?

  2. Do you have more pictures of the rig? I’m interested in seeing how the THETA is housed in the gimbal. And also the base station antenna.

  3. Is it both recorded to the THETA and live streaming to the VR headset? Or is it one or the other at a time?

  4. You stated that it gets “less than 200ms of lag.” Is that mostly due to the Ubiquiti Omni antenna? What software components would you say are most important for reducing lag?

  5. Did you test with more than one person viewing at a time? Is it capable of that?

  6. In the flight test video, I noticed the guy in yellow (not controlling the drone, as far as I can tell) starts sitting/lying down at around 4:22. That’s not related to the test in any way, correct? Or is it?

Thanks so much for posting the details and results and all the videos. Your enthusiasm in the conclusion is infectious. :slight_smile:

@codetricity, my testing revealed that RTMP and RTSP could deliver smooth, high-resolution video (tested using the WebAPI plugin and VLC), but it was too laggy (greater than 1 second) to be used safely with a drone moving at high speed. Meanwhile, if I use MJPEG with the Ricoh API, I can prioritize the most recent frame to minimize the delay (~250 ms at 1920x960 @ 8fps at 0.25 miles, ~100 ms at 1024x512 @ 30fps at 0.25 miles). Sorry I don’t have any actual benchmarks on hand. If I were to write the H.265 plugin I would make sure there was a way to synchronize the clocks so I could properly timestamp each frame and accurately measure the delay. Also, thanks for promoting this on the ideas page :grin:
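For the curious, the clock synchronization I have in mind is just an NTP-style offset estimate that assumes a roughly symmetric network path. A rough sketch (the /clock endpoint is hypothetical; the custom plugin would need to expose something like it):

```typescript
// Rough sketch of NTP-style clock offset estimation, so frames could be
// timestamped at the camera and the one-way delay measured at the ground
// station. "/clock" is a hypothetical endpoint the custom plugin would
// expose, returning the camera's current time in milliseconds.
async function estimateOffsetMs(cameraUrl: string): Promise<number> {
  const t0 = Date.now();                  // ground station, send time
  const res = await fetch(`${cameraUrl}/clock`);
  const t1 = Number(await res.text());    // camera clock when it answered
  const t2 = Date.now();                  // ground station, receive time
  // Assume the request and response took equal time (symmetric path), so
  // the camera's clock was read roughly at the midpoint (t0 + t2) / 2.
  return t1 - (t0 + t2) / 2;              // cameraClock - groundClock
}

// One-way latency of a frame stamped with the camera's clock:
//   delayMs = Date.now() - (frameTimestamp - offset)
```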

@jcasman

  1. Why did you choose to build this? You say it’s for your RIT Senior Design Project. Can you tell me a little bit more about that?

For our senior year, engineering students participate in a senior design project. I participated in the multidisciplinary senior design course, where students from different engineering majors work together on a sponsored project. Projects are proposed by students, faculty members, or outside companies, and then the faculty assigns students to projects based on their co-op / internship experience. This project was proposed and sponsored by Lockheed Martin, where I am sure it will sit on a shelf while smaller and faster companies perfect it.

  2. Do you have more pictures of the rig? I’m interested in seeing how the THETA is housed in the gimbal. And also the base station antenna.

Here is a GIF of the gimbal with the camera adapter in it.

https://i.imgur.com/U2E7WJB.mp4

Here it is from another angle.

image

Here is the exploded view of the payload.

The base station setup is really simple. It consists solely of a NanoBeam 5AC connected to the base station via an Ethernet cable. We literally just had a team member aim the antenna at the drone. Since the entire setup was made to work offline, a simple DHCP server was run on the base station, and that DHCP server assigned an IP to the camera.

  3. Is it both recorded to the THETA and live streaming to the VR headset? Or is it one or the other at a time?

Unfortunately no, you can only do one or the other. If you are streaming video via the Ricoh API, you cannot also record video to the camera’s storage. But that is another limitation of using the Ricoh API; you should be able to do both simultaneously with a custom plugin. That said, the stream that is seen in the VR headset is automatically recorded to the base station PC.
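(For the curious: the base-station recording can be as simple as appending every received frame to a file. That's not exactly what my software does, but it conveys the idea: a raw concatenation of JPEGs is a valid MJPEG dump that ffmpeg can re-encode later. The filename here is a placeholder.)

```typescript
// Sketch of base-station recording: append each received JPEG frame to a
// raw MJPEG dump. ffmpeg can re-encode it afterwards, e.g.:
//   ffmpeg -f mjpeg -framerate 8 -i flight.mjpeg flight.mp4
import { createWriteStream } from "node:fs";

const out = createWriteStream("flight.mjpeg"); // placeholder filename

function recordFrame(jpeg: Uint8Array): void {
  out.write(jpeg); // frames are back-to-back JPEGs, so appending is enough
}
```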

  4. You stated that it gets “less than 200ms of lag.” Is that mostly due to the Ubiquiti Omni antenna? What software components would you say are most important for reducing lag?

From my VERY amateur research there are two things that go a long way to reduce lag:

  • Protocol and implementation: This goes a long way. Some implementations have a lot of overhead, and the usual tradeoff is video smoothness vs. resolution vs. framerate vs. latency. I can’t think of any specific software components that would speed things up, because my implementation is pretty basic: I receive video from the Ricoh API, treat it as a black box, grab each frame, and update an A-Frame videosphere. I haven’t done any specific testing regarding delays on the A-Frame side, since I was short on time and sort of stuck with the Ricoh API.
  • Transmission hardware: You want as much gain as possible regardless of the orientation of the drone with respect to the ground antenna (hence the omni antenna on the drone), and minimal network latency with as much bandwidth as possible. The latency across the network was only ~1 ms.
    Getting this kind of signal strength at a distance on a moving target is difficult, so in practice we usually only saw a throughput of ~40 Mbps, and the latency would vary (rough numbers after this list). Ubiquiti uses custom comms protocols to establish high throughput at range on stationary targets, which explains why the performance varied so much on our moving target.
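To put that ~40 Mbps in perspective (rough numbers, since again I don’t have benchmarks on hand): a 1920x960 JPEG preview frame is somewhere in the 100-200 KB range, so 8 fps only needs about 6-13 Mbps. In other words, raw throughput had headroom; it was the variation in latency and signal quality on a moving target that hurt.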

I didn’t test any other transmission hardware unfortunately, but I would imagine most other back-haul network hardware would perform comparably to the Ubiquiti gear. When I tested my streaming software using just the built-in Wi-Fi on the Theta, I got similar performance to the Ubiquiti hardware. That said, the Theta’s built-in Wi-Fi wouldn’t have been able to get the range we needed.

  5. Did you test with more than one person viewing at a time? Is it capable of that?

Nope. You should be able to, though due to VR system limitations you won’t be able to do it all from one PC. I haven’t thought through the details.

  6. In the flight test video, I noticed the guy in yellow (not controlling the drone, as far as I can tell) starts sitting/lying down at around 4:22. That’s not related to the test in any way, correct? Or is it?

Nope, I think he was going to pick up a camera being used for filming lol.

1 Like

This project was selected as “Project of the Month”

It now has a special web page and is featured on the front page of the theta360.guide site. :slight_smile:

In addition, I’m planning to do a writeup on Code Project. Will provide full attribution to Jake.

We’ve been trying to solve this problem for a while. Thought it would be good to get the word out widely.

2 Likes

HUZZAH! I am honored =)

I really have to say, it was only possible because the community was so open and the SDK was readily available and easy to use. I hope to inspire others to try this out. The cost of the payload and transmission system was ~$1k, so it isn’t cheap, but definitely within experimentation range.

image

2 Likes

Great job!

I’m using my Theta over USB because with Wi-Fi on, it’s not possible to maintain the battery level. But I lose the MJPEG live-stream preview. I think I will try a Y cable and Ethernet.

2 Likes

This is a really cool project!

Drone augmentation has been a secret passion of mine.

I know that the 5 GHz radio has a leaner signal than 2.4 GHz. Was this choice intentional?

There could be some range gained if you went with a directional antenna too. Sounds like you learned what I learned about omni antennas and making sure they are angled correctly!

The drone itself looks like a DJI Matrice. Is that your personal machine or does it belong to RIT?

2 Likes

@Geospatial We knew we were going to be doing all of our testing and proof of concept in an open field, so we didn’t really need the more robust signal penetration you get from 2.4 GHz (no obstacles to worry about). Also, our control radio uses 2.4 GHz, and if we opted to use an FPV camera as a backup for flying the drone, a lot of those use 2.4 GHz as well.

We actually ended up placing an omni-directional antenna on the drone itself and a directional antenna on the ground station. I thought about using two directional antennas, but since a drone’s orientation changes constantly, keeping the two directional antennas pointed at each other would be a difficult feat. I would have needed to set up at least a single-axis servo for rotating the antenna, plus a small microcontroller to control the servo based on the GPS locations of the ground station and the drone. We figured that setup would have been a pain to get working alongside all the other work we had to do.
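If anyone wants to attempt that tracker, the aiming math itself is simple: it’s just the initial great-circle bearing from the ground station’s GPS fix to the drone’s. A sketch (standard formula, nothing project-specific):

```typescript
// Initial great-circle bearing from the ground station to the drone, for
// driving a pan servo. Inputs in degrees; output in degrees clockwise from
// true north. This is the standard formula, not code from our project.
function bearingDeg(lat1: number, lon1: number,
                    lat2: number, lon2: number): number {
  const rad = Math.PI / 180;
  const dLon = (lon2 - lon1) * rad;
  const y = Math.sin(dLon) * Math.cos(lat2 * rad);
  const x = Math.cos(lat1 * rad) * Math.sin(lat2 * rad) -
            Math.sin(lat1 * rad) * Math.cos(lat2 * rad) * Math.cos(dLon);
  return ((Math.atan2(y, x) / rad) + 360) % 360;
}
```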

Getting a drone was sort of a PITA. First I had to explain to our guide that $4k wasn’t enough to buy a beefy base station laptop, a heavy-lift drone, transmission hardware, a camera, controls for flying the drone, a VR headset, and then have some spare for repairs and other incidental costs. I finally managed to bump things up to $5k, but not before DJI officially discontinued the S900 and the market for them dried up. Our school program’s policy was that you needed to use an American retailer, meaning we couldn’t use the uber-cheap drones from China. So I actually had to build this one up from a cheap heavy-lift frame (GAUI 950H) from a small American RC shop. Bought the frame, spent some time on GetFPV getting motors, props, ESCs, radios, a PX4 flight controller, the works. Once I had it assembled it worked great, and honestly I am glad I went with a PX4-based drone because it made things easier to configure and debug compared to a NAZA or some other controller. So I was pretty sad when I had to hand it over to RIT =(

Article on CodeProject just came out. Already has 1.4K views. Strong interest in Jake Kenin’s cool project. :slight_smile:

https://www.codeproject.com/Articles/4622389/Stream-360-Video-to-VR-Headset-from-Drones-with-RI

2 Likes

I gave it a 5 out of 5 stars!

If you get a chance, read @codetricity’s article, and give it your rating or a comment.

2 Likes

@jcasman Chiming in here from the RIT Multidisciplinary Senior Design program! Like Jake said, we have students from biomedical, computer, electrical, industrial, and mechanical engineering collaborating on 2-semester team projects that come from industry, community organizations, students, and faculty.

This particular project was proposed and funded by Lockheed Martin Rotary & Mission Systems in Owego, NY, and was a collaborative effort by 5 students from computer, electrical, and mechanical engineering who were assigned to work on it. This team, like many others, posted their work publicly, and you can see details on what Jake and the rest of his team did here: http://edge.rit.edu/edge/P19123/public/Home

The team also created this 2-minute summary of their project, which gives you an overview of the entire system.

Lockheed is still working on this project with us, so it’s definitely not sitting on a shelf. In fact, they’re the ones who came across this post, in an effort to collect some additional background information to provide to the next group of students to work on this drone project.

I’ll let the rest of Jake’s team know that he shared this - I’m sure they’ll be excited to hear that their project was selected as “Project of the Month”!

@codetricity Glad to hear it’s gotten so much interest at CodeProject - I will pass this info on to the rest of the team, too!

2 Likes

Make that 5.4K views currently! :slight_smile:

Dr. DeBartolo,

It’s extremely exciting to hear all these details and get a glimpse into some of the progress and efforts happening at RIT. Thanks so much for providing the details and links. I’m still reading through the information. If I come back with questions, I hope you don’t mind me asking about details. Can you tell me more about the RIT Multidisciplinary Senior Design program? Is it required for undergrads? Do most/many seniors participate?

Jesse

Now at 5.4K views!

1 Like

I added this project to the RICOH THETA List of Apps. It’s a comprehensive and quickly growing catalog of all RICOH apps and devices.

This project is an incredible example of working with a THETA and extending what’s possible.

Thanks to @bethdt for information about the RIT Multidisciplinary Senior Design program. Sounds like a fantastic program for students, companies, and faculty.

https://theta360.guide/app/lockheed-amelia-drone

1 Like

So awesome Jake! I’d really like to boost the Wi-Fi connection on the Theta V … it’s quite annoying!!!
Do you have a 3D-printable model you’d care to share for holding the Theta in the Osmo Mobile like that?
Do you figure it would work on the first-edition Osmo Mobile? And how are you able to use it without the gimbal looking for a phone “attached” (via that little sensor it has)?
Sincerely, Steve

1 Like

Last I heard from Jake, he was bicycling across the US. I asked another member of the Lockheed Amelia development team at RIT if they’re willing to share the model.

1 Like

After biking just shy of 4,000 miles in 3 months, I made it from Boston to LA! From there I went to Japan for a month, and now I am back home and working in Cambridge, MA. While I was in Japan I went to the teamLab Borderless exhibit and saw a person using a Theta V while walking through one of the exhibits.

@JapanaCana I did some digging and was able to find a CAD file for the attachment we placed the Theta V in before sliding it into the Osmo’s mount. Here it is: https://srv-file5.gofile.io/download/CETFir/hw007.stp

If you want a copy of all the CAD files, you can find them here: https://drive.google.com/open?id=1w9wbZtcBJzu6pyzssEfHRkdJKiCMo-ng

All credit for the CAD files goes to Noah. He was a real trooper cranking all these out and integrating our feedback into the design. And it did a great job protecting the important hardware when I crashed the drone lol.

3 Likes

This trip report is WAAAAAY more than I was expecting!

Cool that you saw a THETA in the wild! :3theta_s: