Solved: Unity Can't Display THETA V Live Stream on Windows 10

You sir, are a legend. thank you.



Hello everyone.

I am using a THETA V for a project in which I want to get the live stream from the camera onto the screen of a phone, so that it can be viewed in real time through a VR headset (so the picture format has to be suitable for VR viewing). Could anyone please help me out?

Either wired or wireless communication works; the only requirement is that latency must be minimal. I am looking for existing apps or methods that could help me achieve the aims stated above.

Thank you.

You can test out the Lockheed Martin Amelia Drone app. View it in a browser and use A-Frame (included in the app) to set up the interface with the VR headset.

Otherwise, use the RTSP streaming method by @Hugues.

You can run A-Frame either in a browser or in a WebView (untested with the app) in your mobile project.

Thank you.

The Amelia Viewer app did work for getting the live stream, with very low latency. However, we did not understand how A-Frame is used. I would appreciate it if you could point us to a guide or documentation for setting up the interface with the VR headset/mobile phone.

I also wanted to ask how the VR headset/mobile phone is connected to the Amelia Viewer app. Is it a wired connection (through a USB cable) or is there some other way to connect them?

There is extensive information here:

Thank you. The documentation was quite useful for understanding how the interface with the VR headset is set up.

However, we do not have an Oculus/Vive headset. So we are trying to look for a method which could help us do the same thing with any VR headset.

I would like to describe two tests that we conducted for the same.

Test 1: We tried the application that you had created for the project.

  • We were able to connect and view the livestream from Theta V on the PC.
  • We used Trinus VR software to view the livestream from the PC on the mobile device.
  • However, the head tracking didn’t work.
  • As we moved the phone around, the cursor on the PC moved, but the image didn’t. Only when the cursor was clicked and dragged could we change the view of the image.
  • I wanted to ask if there is a workaround for that. We would like the view of the image to change as the phone is moved around.

Test 2: We tried to connect the phone using the PC’s local network as a server. The source repository can be found here.

  • The motion sensors could be accessed only over a secure connection, so we used an HTTPS proxy server.
(The steps that we followed are enclosed here.)
  1. Download the source repo from here.

  2. Open cmd.exe

  3. Run: cd Downloads/amelia_viewer-master (Assuming the folder is in the Downloads folder)

  4. Run: npm i

  5. Run: npm start

  6. ‘Ready for changes’ is displayed

  7. The app opens up

  8. Open another cmd.exe

  9. Run: npm i -g https-proxy

  10. Run: https-proxy --target=http://localhost:8181 --port=8182

  11. ’ Proxing to “http://localhost:8181” through “https://localhost:8182” ’ is displayed

  12. Open Firefox on the PC and go to

  13. Open the mobile browser

  14. Open https://a.b.c.d:8182 (Replace it with your IP address)

  • Head tracking worked perfectly here.
  • However, when the THETA V was connected to the PC, or when a local video was played on the PC, we could not view it on the mobile device.
  • We could only see the first still image (the example) on the phone.
  • We think some part of the code points to the example image and doesn’t update as the video does, but we cannot tell exactly what is causing this issue.

There was also an issue with local video playback, which I have attempted to fix in the attached source code. But the loading timeout is 3000 ms, so large videos could not be loaded; increasing the timeout value didn’t help either. This is not a concern for us at the moment, however. I mention it because we were not sure whether it is related to what is causing the issue with the livestream.

Could you please help us out here?

Thank you. Have a great day!

Hi everyone,
it’s 2020 and I feel like I’ve gone back in time, unable to get rid of the “Could not connect pins” error in Unity.

I use Windows 10 with Unity 2020.1. I have updated the RICOH THETA V to the latest firmware (3.50.1) and installed the current UVC Blender driver.

Now, I have successfully managed to livestream to YouTube via OBS, and Unity also recognizes the THETA.
However, using the script attached to the sphere (as above, and still up to date according to the Unity documentation):

    Renderer renderer = this.GetComponentInChildren<Renderer>();
    WebCamTexture mycam = new WebCamTexture("RICOH THETA V/Z1 4K");
    renderer.material.mainTexture = mycam;
    mycam.Play();  // required to start capture; the error is raised when the stream is opened

gives me the error this whole thread started with:
“Could not connect pins - RenderStream()”

Anyone recently experienced the same? Is there any updated workaround?
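One quick diagnostic (a sketch, not a fix) is to dump the device names Unity actually enumerates, since the string passed to the WebCamTexture constructor must match a listed name exactly, and the THETA registers both a 4K and a FullHD device:

```csharp
using UnityEngine;

// Minimal diagnostic: print every capture device Unity sees, so the exact
// THETA name can be copied verbatim into the WebCamTexture constructor.
public class ListWebCams : MonoBehaviour
{
    void Start()
    {
        foreach (WebCamDevice d in WebCamTexture.devices)
            Debug.Log("Capture device: " + d.name);
    }
}
```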

Are you using the updated driver? See the lower right-hand corner of the table; the link is below the table.


Hello @craig!

I have gone through the whole thread, but I’ve been experiencing the same problem as @Rob_Erta. I have installed the current live-streaming app for Windows (RICOH THETA UVC V/Z1, v2.0.0).
OBS live-streaming works fine. Can there be a workaround?


Hi, thanks for posting your question. It’s great to see Unity development still going strong. 🙂

I need to reinstall Windows 10 due to a hard disk problem and then I’ll run this again.

Can you post what version of the camera firmware, Unity, and Windows 10 you are using so that we have baseline information that might help trace the problem?

I am using Theta V with the latest firmware (3.50.1), Windows 10 Home edition and Unity Version 2019.4.6f1

Also @craig,
Would it be possible to give an answer to this comment of mine on Wireless streaming?

Sorry for asking so many things at once,

Hi @dimdallas,
I just tested a setup I was working on:
Windows 10 (20H2, build 19042.746) with WSL2 (Windows Subsystem for Linux) running Ubuntu 18.04 LTS.
I installed NGINX (with the RTMP module) on the Ubuntu running in WSL2, example config, and on that same Windows machine watched the stream with ffmpeg/ffplay:
ffplay -fflags nobuffer -i "rtmp://"
Here z1 is the stream key for the Z1.

The ffmpeg/ffplay approach also works for the “THETA RTSP Streaming Plug-In”.

Since this thread is about Unity, one could theoretically use FFmpeg as a plugin (DLL) in Unity and project/pipe the ffmpeg/ffplay output onto a texture.
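As an untested sketch of that idea, one could even skip the DLL plugin and spawn ffmpeg as a child process, asking it for raw RGB frames on stdout and uploading each frame into a `Texture2D`. The RTSP URL, resolution, and the assumption that `ffmpeg` is on the PATH are placeholders you would adapt:

```csharp
using System.Diagnostics;
using System.IO;
using UnityEngine;

// Hypothetical sketch: pipe raw RGB24 frames from an ffmpeg child process
// into a Unity texture. Not production code: the per-frame read blocks the
// main thread; a real implementation would read on a background thread.
public class FfmpegTexture : MonoBehaviour
{
    const int Width = 1920, Height = 960;  // must match the -s argument below
    Texture2D tex;
    Process ffmpeg;
    BinaryReader reader;
    byte[] frame;

    void Start()
    {
        tex = new Texture2D(Width, Height, TextureFormat.RGB24, false);
        GetComponent<Renderer>().material.mainTexture = tex;
        frame = new byte[Width * Height * 3];

        ffmpeg = new Process();
        ffmpeg.StartInfo.FileName = "ffmpeg";  // assumes ffmpeg is on PATH
        ffmpeg.StartInfo.Arguments =
            "-fflags nobuffer -i rtsp://<camera-ip>:8554/live " +  // placeholder URL
            "-f rawvideo -pix_fmt rgb24 -s " + Width + "x" + Height + " -";
        ffmpeg.StartInfo.UseShellExecute = false;
        ffmpeg.StartInfo.RedirectStandardOutput = true;
        ffmpeg.Start();
        reader = new BinaryReader(ffmpeg.StandardOutput.BaseStream);
    }

    void Update()
    {
        // Read exactly one frame's worth of bytes from ffmpeg's stdout.
        int read = 0;
        while (read < frame.Length)
        {
            int n = reader.Read(frame, read, frame.Length - read);
            if (n <= 0) return;  // stream ended or not ready
            read += n;
        }
        tex.LoadRawTextureData(frame);
        tex.Apply();
    }

    void OnDestroy()
    {
        if (ffmpeg != null && !ffmpeg.HasExited) ffmpeg.Kill();
    }
}
```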


Hi @pstawicki,
I just tried the THETA RTSP Streaming Plug-In and I successfully viewed the stream with ffplay -fflags nobuffer -i rtsp://192.168.XXX.XXX:8554/live?resolution=1920x960

But I am searching for a solution for FFmpeg projection in Unity. The final target is projecting either onto an inside-out sphere or, even better, onto a skybox. I can’t find anything for this problem, but I will keep searching. If you have anything in mind, it would be useful.
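For the skybox target specifically, a minimal untested sketch: assuming a material created from Unity’s built-in Skybox/Panoramic shader (with its Mapping set to Latitude Longitude Layout, and assuming `_MainTex` is its texture property), the THETA’s equirectangular webcam feed could be assigned to it directly:

```csharp
using UnityEngine;

// Hypothetical sketch: show the equirectangular THETA webcam feed as the
// scene skybox instead of texturing an inverted sphere.
public class ThetaSkybox : MonoBehaviour
{
    // Assign a material using the built-in Skybox/Panoramic shader in the Inspector.
    public Material panoramicSkybox;

    void Start()
    {
        // Device name as reported for the THETA V/Z1 UVC 4K driver.
        var cam = new WebCamTexture("RICOH THETA V/Z1 4K");
        cam.Play();
        panoramicSkybox.SetTexture("_MainTex", cam);
        RenderSettings.skybox = panoramicSkybox;
    }
}
```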


See this project by @Hugues

It’s possible that the Janus WebRTC server is what you’re looking for.

The solution by @pstawicki using the RTMP module for NGINX looks like a great option too.

Sorry for the huge delay. I was quite frustrated and put it off. So indeed, I used the link as described.
To be sure it’s not a different problem, I checked and confirmed that YouTube live streaming via OBS Studio works. So in principle my THETA V allows live streaming, but it remains stubborn when it comes to Unity, even after adding DevicePath to the registry.

This is what the registry looks like:

Also, I recorded a video of my workflow using the THETA in live video mode; maybe I am missing something …

I use the latest firmware on the THETA: 3.15.1
And Windows 10, 10.0.19042


Uninstall the driver you are currently using and install the older driver. See the link below. We tested it with a V as well.


That was it! Big thanks for the video. Changes and development never stop. I am super happy to start experimenting with the THETA V in VR now.


I am stuck with the live streaming setup in Unity. I installed RICOH THETA UVC Driver 3.0.0 and tried live streaming. It works in OBS but not in Unity; Unity says “Couldn’t find a supported mediatype for the selected Virtual Camera”. I need help!

Here is the script that I’m using.

using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class ThetaV : MonoBehaviour
{
    void Start()
    {
        WebCamDevice[] devices = WebCamTexture.devices;
        Debug.Log("number of web cams connected:" + devices.Length);

        for (int i = 0; i < devices.Length; i++)
        {
            string camName = devices[i].name;
            Debug.Log("webcam device " + i + " is " + camName);
        }

        Renderer rend = this.GetComponentInChildren<Renderer>();

        WebCamTexture mycam = new WebCamTexture();
        // Hard-coded index: verify against the device list logged above.
        string thetaName = devices[2].name;
        Debug.Log("The webcam name is " + thetaName);
        mycam.deviceName = thetaName;
        rend.material.mainTexture = mycam;
        mycam.Play();  // without this call, no frames are captured
    }

    // Update is called once per frame
    void Update()
    {
    }
}

Firmware 3.82.1
RICOH THETA UVC Driver 3.0.0
Unity 2022.3.0

Sorry for my weird English; I’m not a native speaker.

@S_IWAKI Have you tried posting in the Unity forum? Since this sounds like (possibly) a Unity problem, you may get help there.

I have posted here in the past, the AR/VR (XR) Discussion part of the Unity forum: