Tutorial: Live Ricoh Theta S Dual Fish Eye for SteamVR in Unity

I’ve been working on a system for robotic control using the HTC Vive for a few months now. Part of this system involved live streaming a 360 feed into a virtual reality headset, which was easier said than done. There are a lot of instructions/tutorials/guides out there, but nothing that completely does what I wanted to do, so I figured I would write this up for anybody who wants to do something similar using SteamVR or other VR APIs in Unity. I’ll update this post with the link when my solution is posted to GitHub!

This solution provided me with a basis to move forward, but left a lot of questions unanswered, as it did not actually implement the live feed. This solution gave me the basis for getting a video onto the scene, but as most of us have found out the hard way, the UVC Blender does not work in Unity. That means in Unity you can only use the dual-fisheye feed, so you need to combine parts of both of the above solutions. This approach also works for the feed you get from an HDMI-to-USB capture card, as it is also dual-fisheye. You could also adapt this solution to work with the Oculus.

Note: I’m using Unity version 5.5.0f3 and SteamVR assets version 1.2.0 (the latest as of this post). You should save frequently when working with Unity, as it can crash (a lot).

Download this Unity package from this blog post. Open a new project in Unity and import both the package and the SteamVR assets.

Delete the Main Camera.

Add the SteamVR and CameraRig prefabs to the scene. The CameraRig should be set to x: 0, y: 1, z: -10. Go ahead and deselect both hand controllers, as we won’t be using them for this project.

Now you can add the theta-sphere prefab from the assets of the package from the blog post.

The sphere prefab has most of what you need; however, I noticed that when streaming from USB, there are big gaps on the sphere where the stitching isn’t great. We will fix that later (as best we can).

You’ll now need to make a script for projecting the camera feed onto the two spheres. This script is based on the script from the second post I mentioned; however, rather than projecting an equirectangular image onto one sphere, you need to project each lens’s feed onto its respective sphere (because, as of this post, Unity does not recognize the UVC Blender as a webcam).

You will need two scripts: one to project the webcam feed and another to play it.

For the webcam projection I made WebcamDualEyed. Create a script and copy the code below.

using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class WebcamDualEyed : MonoBehaviour {

    public GameObject sphere1;
    //Second Camera Added by Meg
    public GameObject sphere2; 
    public int cameraNumber = 0;

    private WebCamTexture webCamTexture;
    private WebCamDevice webCamDevice;

    void Start() {
        // Check how many and which cameras are available on the device
        for (int cameraIndex = 0; cameraIndex < WebCamTexture.devices.Length; cameraIndex++) {
            Debug.Log("WebCamDevice " + cameraIndex + " name " + WebCamTexture.devices[cameraIndex].name);
        }

        if (WebCamTexture.devices.Length > cameraNumber) {
            webCamDevice = WebCamTexture.devices[cameraNumber];
            webCamTexture = new WebCamTexture(webCamDevice.name, 1280, 720);
            // Project the same dual-fisheye feed onto both spheres
            sphere1.GetComponent<Renderer>().material.mainTexture = webCamTexture;
            //Added by Meg
            sphere2.GetComponent<Renderer>().material.mainTexture = webCamTexture;
            // Start pulling frames from the camera
            webCamTexture.Play();
        } else {
            Debug.Log("no camera");
        }
    }
}

To play the feed I made a script called VideoPlay. Create your own and copy the code below.

using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class VideoPlay : MonoBehaviour {

    public GameObject Sphere1;
    public GameObject Sphere2;

    void Start() {
        // Assumes a MovieTexture (e.g. a recorded 360 video) has been
        // assigned as each sphere's main texture
        MovieTexture video360_1 = (MovieTexture)Sphere1.GetComponent<Renderer>().material.mainTexture;
        MovieTexture video360_2 = (MovieTexture)Sphere2.GetComponent<Renderer>().material.mainTexture;

        video360_1.loop = true;
        video360_2.loop = true;

        video360_1.Play();
        video360_2.Play();
    }
}

Now create an empty game object and attach both scripts to it.

Next, set the variables for sphere 1 and sphere 2: drag each sphere from the Hierarchy into its variable slot on both scripts.

Now plug in your THETA and start live streaming mode.
Set the camera number on the object to whichever number corresponds to the raw Ricoh Theta S camera feed. Mine happens to correspond to the number 1, which I’ve already set in the image above. When you play the scene, the console will show you which webcam corresponds to each number.
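If the device index changes between reboots or machines, you could pick the camera by name instead of by a hard-coded number. Here is a minimal sketch; the helper class and the idea of matching on the substring "THETA" are my own assumptions, so check against the device names the console log actually prints on your machine.

```csharp
using System;

// Hypothetical helper: select a webcam by a substring of its device name
// rather than a hard-coded index.
public static class CameraPicker
{
    // Returns the index of the first device whose name contains the
    // keyword (case-insensitive), or -1 if no device matches.
    public static int IndexOf(string[] deviceNames, string keyword)
    {
        for (int i = 0; i < deviceNames.Length; i++)
        {
            if (deviceNames[i].IndexOf(keyword, StringComparison.OrdinalIgnoreCase) >= 0)
                return i;
        }
        return -1; // fall back to manual selection
    }
}
```

In WebcamDualEyed you would build the name array from WebCamTexture.devices and fall back to the manual cameraNumber whenever IndexOf returns -1.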

Now change the render queue of both spheres’ materials to Transparent.

Now plug in your Vive, start up SteamVR, and press play.

When you change from “Scene” to “Game” the feed should display in your headset!

You should be able to see most of the field of view, though there are some big black chunks missing.

This can be a little disorienting for virtual reality!!

I haven’t yet had the chance to go into Maya to recreate the template that the shader uses, but I’ve made some manual adjustments to make it less awkward and close up the holes a little. (It’s not perfect, but it’s better than the holes! If anybody figures out a better way, please let me know.)

Go to the shader for each sphere and change the values to the following:

For one sphere:

Offset U: 0.013
Offset V: 0.007
Scale U: 0.983
Scale V: 1.149
Scale Center U: 0.0686
Scale Center V: 0.5

For the other sphere:

Offset U: -0.01
Offset V: -0.031
Scale U: 0.976
Scale V: 0.958
Scale Center U: 0.26
Scale Center V: 0.55

This should close up the holes pretty well, and now you can view your 360 live feed using VR!
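For intuition about what those six numbers do: judging purely from the parameter names (Offset, Scale, Scale Center), the shader appears to apply a per-axis affine transform to the UV coordinates — scale around the Scale Center, then shift by the Offset. This is an assumption about the shader from the blog-post package, not its actual source, but it explains why tweaking these values slides and stretches each lens’s image over the sphere:

```csharp
using System;

// Sketch of the UV mapping implied by the shader's parameter names
// (Offset, Scale, Scale Center). This is a guess at the shader's
// behavior, not its actual source code.
public static class FisheyeUv
{
    // Scale u around scaleCenter, then shift by offset.
    public static float Transform(float u, float offset, float scale, float scaleCenter)
    {
        return (u - scaleCenter) * scale + scaleCenter + offset;
    }
}
```

Under this reading, points at the Scale Center stay put (apart from the Offset), while points far from it move the most — which is why a Scale U slightly below 1 pulls the image edge inward to cover a stitching gap.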

Let me know if you have any questions! (Didn’t quite get to capture the inception happening with the camera :frowning: )


Here’s a bit more information from interviewing the developer, Megan Zimmerman.

Q: Is it possible to get the SteamVR assets in the Unity app to work with another headset?

A: For this solution, since I was working with the Vive, I included the prefab that works with the Vive. SteamVR works with Oculus as well. There is a little bit of external setup to enable unknown sources for the Oculus. Additionally, you could use the Oculus utilities for Unity (at this link https://developer3.oculus.com/downloads/) instead of SteamVR.

Q: Do you have advice for people streaming from HDMI? What’s your setup?

A: I have used both; currently I am using the HDMI output with a Feelworld HDMI-to-USB 3.0 capture card, which works great.

Q: Do you intend to put your project up on GitHub or other public repo?

A: I plan to publish it through GitHub. I’m in the process of filling out some paperwork that the government agency I work for requires. I’ll let the community know when it goes up.


We’re featuring this tutorial and the developer on our main site page now as the Developer of the Month.

Problems this tutorial solves:

  • Provides a path forward for developers that want to get the THETA live stream inside of Unity since UVC Blender and UVC FullHD Blender do not show up in Unity. (If someone gets it to show up, tell us how)
  • Provides a possible solution to developers that want to use the HDMI output (which is in dual-fisheye)
  • First open source example of using SteamVR components to stream to headsets, with an HTC Vive example and a path for Oculus
  • Extends functionality of hecomi and GOROman’s tutorial and shader pack.

Previous Developers of the Month

Photosphere was the first solid real-world use of the THETA USB API. Great source code, documentation, and parts list.

tlapser360 took the WiFi and USB APIs and added external sensors, like GPS and light sensors, to adjust the camera in addition to taking pictures. Full source code and documentation available.


Help Installing SteamVR

This is my first time using SteamVR. I’m on Windows 10, Unity 5.5.0f3, 64bit, personal. I do not have a headset. I want to test the tutorial on my desktop.

I get the error below:

VR: OpenVR Error! OpenVR failed initialization with error code VRInitError_Init_PathRegistryNotFound: "Installation path could not be located (110)"!

I may have imported the assets incorrectly.

From inside of Unity, I imported all the Plugins and rebooted my machine.

I read through these threads. My username is craig on my Win10 machine.

Anyone have any thoughts?

Possible Solution with Unity finding OpenVR (SteamVR) libs

Feeling pretty good now that I added the SteamVR Plugin at time of New Project creation. Unity is finding the libraries. :tada:

Blank project looks more promising than my previous attempt this morning. Will leave my previous post up as it may help other people. Going through rest of tutorial now.

Live streaming, navigation, and the manual blending of the spheres now working!

The section below was the critical insight for me. Perhaps other people were struggling with this in the previous blog post as well?

You’ll now need to make a script for projecting the camera feed onto the two spheres. This script is based on the script from the second post I mentioned; however, rather than projecting an equirectangular image onto one sphere, you need to project each lens’s feed onto its respective sphere (because, as of this post, Unity does not recognize the UVC Blender as a webcam).

I’m kind of wondering how I missed that from the older post by Hecomi.

Now here’s the question. Do I buy a Vive for $800?

That smile in the screenshot of when you got it working is like pure joy. :theta_s:

If anyone bites the bullet and buys the Vive, it looks pretty easy to develop basic applications for it. See this tutorial below that covers the controllers that were disabled in the tutorial above.

That guy has a bunch of tutorials, including this for interacting with objects.

Although it looks cool and fun, not sure if I have time to get my $800 worth of fun to justify the Vive especially since I don’t play PC games


The dual projection is implied in Hecomi’s post, since he adds the camera feed script to both spheres for his equirectangular solution. In the Tanyuan solution, you already have an equirectangular feed and are projecting it onto one sphere; because we are using two spheres, we project onto both.

Personally, when I started working on this solution I got caught up in Hecomi’s solution, because it’s harder to understand what is going on if you don’t work through it yourself step by step. This has me thinking that I should probably make my own shader from scratch, because while I get the idea of what the shader is doing, I want to understand how.

Thanks for adding the fix about SteamVR. I forgot to mention exactly how the setup works because my Unity sets itself up for VR automatically. Whoops!

Also, you should totally buy a Vive :slight_smile:

Hey, the link to the package is no longer working. Any chance you can share the package some other way?

Are you looking for these?

If not, let me know which packages you’re looking for.

Have a nice day.


The dropbox link is this one:

Thanks a lot! I tried importing that package but got the error message “couldn’t decompress package.” Any idea why that happened?

Never mind, got it working :slight_smile:


Great news! Hey, I’d be interested in learning what you’re doing with live streaming.

Also, do you know that you can get UVC FullHD Blender working in Unity as well with the hack below? People are doing some cool things with Unity and the THETA. Always exciting to learn about new projects.

Hi! I was just wondering what the oculus equivalent of the steamvr prefab is.

Are you using STEAM VR with Oculus Rift?

Or, are you using the Oculus SDK?

Do you have a THETA V or a THETA S?

Thank you, but I just figured it out. I was using the Oculus SDK with a Theta V.

However, now I have a new problem: the Theta V does not display anything. I know that the code works because I can use the computer’s built-in webcam; however, whenever I use the Theta V, I just see the default scene.

On another note, I made a script that prints the device names in the console, and the Theta V shows up three times as different webcams: (1) Theta V, (2) Theta V FullHD, and (3) Theta V 4K. Is this normal?

Is there an SDK that runs on Mac? There are some problems using Unity with THETA V live streaming.