Solved: Unity Can't Display THETA V Live Stream on Windows 10

Hi all, I’ve gotten pretty far with this post (great work, everyone), but the stream has this weird segmentation.

Any advice?


There’s a newer discussion at the link below.

Also, improved technique from @RicoB

Thanks Craig, the skybox worked for me. Just what I wanted. Thank you


@Christopher_Paton, thanks for reporting back on the success. It helps others.

The original Unity examples in this topic are old and I should reorganize the examples to help people find the updated information.

Let us know if you create something cool. :slight_smile:

Trying to set up telepresence robot operation.

Just trying to learn Unity to get the Oculus Quest controllers mapped.

You may find these articles interesting.

The examples below are not using Unity. The developers for the two projects below are on this forum. You may be able to exchange ideas on implementation.

You, sir, are a legend. Thank you.



Hello everyone.

I am using a THETA V for a project in which I want to get the live stream from the camera onto the screen of a phone, so that it can be viewed in real time with a VR headset (the picture format therefore has to support that). Could anyone please help me out?

Wired or wireless, either type of communication works; the only requirement is that latency must be minimized. I am looking for existing apps or methods that could help me achieve the aims stated above.

Thank you.

You can test out the Lockheed Martin Amelia Drone app. View it in a browser and use A-Frame (included in the app) to set up the interface with the VR headset.

Otherwise, use the RTSP streaming method by @Hugues.

You can either run A-Frame in a browser or use a WebView (untested with the app) in your mobile project.

Thank you.

The Amelia Viewer app did work for getting the live stream, with very little latency. However, we did not understand how A-Frame is used. I would appreciate it if you could point us to a guide or documentation for setting up the interface with the VR headset/mobile phone.

I also wanted to ask how the VR headset/mobile phone is connected to the Amelia Viewer app. Is it a wired connection (through a USB cable) or is there some other way to connect them?

There is extensive information here:

Thank you. The documentation was quite useful to understand how the interface with the VR headset is set.

However, we do not have an Oculus/Vive headset, so we are looking for a method that would let us do the same thing with any VR headset.

I would like to describe two tests that we conducted toward this goal.

Test 1: We tried the application that you had created for the project.

  • We were able to connect and view the livestream from Theta V on the PC.
  • We used Trinus VR software to view the livestream from the PC on the mobile device.
  • However, the head tracking didn’t work.
  • As we moved the phone around, the cursor on the PC moved, but the image didn’t. Only when the cursor was clicked and dragged could we change the view of the image.
  • I wanted to ask if there is a workaround for that. We would like the view of the image to change as the phone is moved around.

Test 2: We tried to connect the phone by using the local network of the PC as a server. The source repository can be found here.

  • The motion sensors could be accessed only over a secure connection, so we used an HTTPS proxy server.
(The steps that we followed are listed here.)
  1. Download the source repo from here.

  2. Open cmd.exe

  3. Run: cd Downloads/amelia_viewer-master (Assuming the folder is in the Downloads folder)

  4. Run: npm i

  5. Run: npm start

  6. ‘Ready for changes’ is displayed

  7. The app opens up

  8. Open another cmd.exe

  9. Run: npm i -g https-proxy

  10. Run: https-proxy --target=http://localhost:8181 --port=8182

  11. ‘Proxying to “http://localhost:8181” through “https://localhost:8182”’ is displayed

  12. Open Firefox on the PC and go to

  13. Open the mobile browser

  14. Open https://a.b.c.d:8182 (replace a.b.c.d with your PC’s IP address)

  • Head tracking worked perfectly here.
  • However, when the THETA V is connected to the PC, or when a local video is played on the PC, we could not view it on the mobile device.
  • We could only see the first still image (the example) on the phone.
  • We think some part of the code points to the example image and does not change as the video is updated, but we cannot tell exactly what is causing this issue.

There was an issue with local video playback, which I also attempted to fix in the attached source code. But the loading timeout is 3000 ms, so large videos could not be loaded; increasing the timeout value did not help either. This is not a concern for us at the moment, though. I mention it because we were not sure whether it is related to whatever is causing the livestream issue.

Could you please help us out here?

Thank you. Have a great day!

Hi everyone,
it’s 2020 and I feel like I’m going back in time, still unable to get rid of the “Could not connect pins” error in Unity.

I use Windows 10 with Unity 2020.1. I have updated the RICOH THETA V to the latest firmware (3.50.1) and installed the current UVC Blender driver.

Now I have successfully managed to livestream to YouTube via OBS, and Unity also recognizes the THETA.
However, using the script attached to the sphere (as above, and still up to date according to the Unity documentation):

    Renderer renderer = this.GetComponentInChildren<Renderer>();
    WebCamTexture mycam = new WebCamTexture("RICOH THETA V/Z1 4K");
    renderer.material.mainTexture = mycam;
    mycam.Play(); // start capturing; without this call the texture never updates

gives me the error this whole thread started with:
“Could not connect pins - RenderStream()”

Anyone recently experienced the same? Is there any updated workaround?

Are you using the updated driver? See the lower right-hand corner of the table; the download link is below the table.


Hello @craig!

I have gone through the whole thread, but I’ve been experiencing the same problem as @Rob_Erta. I have installed the current live-streaming app for Windows (RICOH THETA UVC V/Z1, v2.0.0).
OBS live-streaming works fine. Is there a workaround?


Hi, thanks for posting your question. It’s great to see Unity development still going strong. :slight_smile:

I need to reinstall Windows 10 due to a hard disk problem and then I’ll run this again.

Can you post what version of the camera firmware, Unity, and Windows 10 you are using so that we have baseline information that might help trace the problem?

I am using a THETA V with the latest firmware (3.50.1), Windows 10 Home edition, and Unity version 2019.4.6f1.

Also, @craig,
would it be possible to answer this comment of mine on wireless streaming?

Sorry for asking so many things at once,

Hi @dimdallas,
I just tested a setup I was working on:
Windows 10 (20H2, build 19042.746) with WSL2 (Windows Subsystem for Linux) running Ubuntu 18.04 LTS.
I installed NGINX (with the RTMP module) on the Ubuntu instance in WSL2 (example config), and on that same Windows machine watched the stream with ffmpeg/ffplay:
ffplay -fflags nobuffer -i "rtmp://"
z1 -> stream key.
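For reference, a minimal RTMP section for nginx.conf might look like the sketch below (the port and the `live` application name are assumptions; adjust them to match your own setup and stream key):

```nginx
rtmp {
    server {
        listen 1935;          # default RTMP port
        application live {
            live on;          # accept incoming live streams
            record off;       # do not record streams to disk
        }
    }
}
```

With a config like this, the camera would publish to rtmp://<host>/live/<stream-key> and ffplay would read from the same URL.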

ffmpeg/ffplay also works for the “THETA RTSP Streaming Plug-In”.

Since this thread is about Unity, one could theoretically use FFMPEG as a plugin (DLL) in Unity and project/pipe the ffmpeg/ffplay output onto a texture.


Hi @pstawicki,
I just tried the THETA RTSP Streaming Plug-In and I successfully viewed the stream with ffplay -fflags nobuffer -i rtsp://192.168.XXX.XXX:8554/live?resolution=1920x960

But I am looking for a solution for FFMPEG projection in Unity. The final target is projecting onto an inside-out sphere, or even better, onto the skybox. I can’t find anything for this problem, but I will keep searching. If you have anything in mind, it would be useful.
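For the skybox case specifically, a minimal (untested) sketch using Unity’s built-in “Skybox/Panoramic” shader might look like this; it assumes the camera shows up as a regular webcam via the UVC driver, and the device name string is an assumption you should verify against `WebCamTexture.devices`:

```csharp
using UnityEngine;

// Sketch: project the THETA's equirectangular webcam feed onto the scene skybox.
public class ThetaSkybox : MonoBehaviour
{
    void Start()
    {
        // Device name as reported by the UVC driver (verify on your machine).
        WebCamTexture thetaTex = new WebCamTexture("RICOH THETA V/Z1 4K");
        thetaTex.Play();

        // Built-in panoramic skybox shader expects an equirectangular texture.
        Material skyboxMat = new Material(Shader.Find("Skybox/Panoramic"));
        skyboxMat.SetTexture("_MainTex", thetaTex);
        RenderSettings.skybox = skyboxMat;
    }
}
```

This avoids the inside-out sphere entirely; the skybox shader handles the equirectangular mapping for you.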
