You can test out the Lockheed Martin Amelia Drone app. View it in a browser and use A-Frame (included in the app) to set up the interface with the VR headset.
The Amelia Viewer app worked for getting the live stream, with very low latency. However, we did not understand how A-Frame is used. I would appreciate it if you could point us to a guide or documentation for setting up the interface with the VR headset/mobile phone.
I also wanted to ask how the VR headset/mobile phone is connected to the Amelia Viewer app. Is it a wired connection (through a USB cable) or is there some other way to connect them?
Thank you. The documentation was quite useful for understanding how the interface with the VR headset is set up.
However, we do not have an Oculus/Vive headset, so we are looking for a method that would let us do the same thing with any VR headset.
I would like to describe the two tests that we conducted.
Test 1: We tried the application that you had created for the project.
We were able to connect and view the livestream from Theta V on the PC.
We used Trinus VR software to view the livestream from the PC on the mobile device.
However, the head tracking didn’t work.
As we moved the phone around, the cursor on the PC moved, but the image didn’t. Only by clicking and dragging the cursor could we change the view of the image.
I wanted to ask if there is a workaround for that. We want the view of the image to change as the phone is moved around.
Test 2: We tried to connect the phone by using the local network of the PC as a server. The source repository can be found here.
The motion sensors could be accessed only over a secure connection, so we used an HTTPS proxy server.
However, when the Theta V is connected to the PC, or when a local video is played on the PC, we could not view it on the mobile device.
We could only see the first still image (the example) on the phone.
We think some part of the code points to the example image and does not update as the video changes, but we cannot tell exactly what is causing this.
There was also an issue with local video playback, which I have attempted to fix in the attached source code. However, the loading timeout is 3000 ms, so large videos could not be loaded, and increasing the timeout value did not help. This is not a concern for us at the moment; I mention it only because we are not sure whether it is related to the livestream issue.
Hi everyone,
it’s 2020, and I feel like I’m going back in time while being unable to get rid of the “Could not connect pins” error in Unity.
I use Windows 10 with Unity 2020.1. I have updated the RICOH THETA V to the latest firmware (3.50.1) and installed the current UVC Blender driver.
I have now successfully managed to livestream to YouTube via OBS, and Unity also recognizes the THETA.
However, using the script attached to the sphere (as above, and still up to date according to the Unity documentation), the “Could not connect pins” error persists.
I have gone through the whole thread, but I’ve been experiencing the same problem as @Rob_Erta. I have installed the current live-streaming app for Windows (RICOH THETA UVC V/Z1, v2.0.0).
OBS live streaming works fine. Is there a workaround?
Hi, thanks for posting your question. It’s great to see Unity development still going strong.
I need to reinstall Windows 10 due to a hard disk problem and then I’ll run this again.
Can you post what version of the camera firmware, Unity, and Windows 10 you are using so that we have baseline information that might help trace the problem?
Hi @dimdallas,
I just tested some setup I was working on:
Windows 10 (20H2, build 19042.746) with WSL2 (Windows Subsystem for Linux) running Ubuntu 18.04 LTS.
I installed NGINX (with the RTMP module) on Ubuntu (running in WSL2) using an example config, and on that same Windows machine watched the stream with ffmpeg/ffplay: ffplay -fflags nobuffer -i "rtmp://192.168.2.118/live/z1"
(z1 is the stream key.)
ffmpeg/ffplay also works for the “THETA RTSP Streaming Plug-In”.
Since this thread is about Unity, one could theoretically use FFmpeg as a plugin (DLL) in Unity and pipe the ffmpeg/ffplay output onto a texture.
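For illustration, here is a minimal sketch of that idea, run as an external process rather than a DLL: it pipes raw RGB24 frames from ffmpeg’s stdout into a Texture2D. The RTSP URL, the 1920x960 resolution, and ffmpeg being on PATH are assumptions, and the blocking read is only for brevity, so treat this as an untested sketch:
using System.Diagnostics;
using System.IO;
using UnityEngine;

// Sketch: pipe raw RGB24 frames from an external ffmpeg process into a Unity texture.
public class FfmpegTexture : MonoBehaviour
{
    const int W = 1920, H = 960;        // must match the stream resolution
    Process ffmpeg;
    BinaryReader reader;
    Texture2D tex;
    byte[] frame = new byte[W * H * 3]; // one RGB24 frame

    void Start()
    {
        tex = new Texture2D(W, H, TextureFormat.RGB24, false);
        GetComponent<Renderer>().material.mainTexture = tex;

        // Hypothetical address; same form as the ffplay commands above.
        ffmpeg = new Process();
        ffmpeg.StartInfo.FileName = "ffmpeg"; // assumes ffmpeg is on PATH
        ffmpeg.StartInfo.Arguments =
            "-fflags nobuffer -i \"rtsp://192.168.1.100:8554/live?resolution=1920x960\" " +
            "-f rawvideo -pix_fmt rgb24 -";   // raw frames to stdout
        ffmpeg.StartInfo.UseShellExecute = false;
        ffmpeg.StartInfo.RedirectStandardOutput = true;
        ffmpeg.Start();
        reader = new BinaryReader(ffmpeg.StandardOutput.BaseStream);
    }

    void Update()
    {
        // Blocking read of exactly one frame; a real implementation would
        // read on a background thread and hand finished frames to Update.
        int got = 0;
        while (got < frame.Length)
        {
            int n = reader.Read(frame, got, frame.Length - got);
            if (n <= 0) return; // stream ended
            got += n;
        }
        tex.LoadRawTextureData(frame);
        tex.Apply();
    }

    void OnDestroy()
    {
        if (ffmpeg != null && !ffmpeg.HasExited) ffmpeg.Kill();
    }
}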
Piotr
Hi @pstawicki,
I just tried the THETA RTSP Streaming Plug-In and I successfully viewed the stream with ffplay -fflags nobuffer -i rtsp://192.168.XXX.XXX:8554/live?resolution=1920x960
But I am searching for a solution for FFmpeg projection in Unity. The final target is either projecting onto an inside-out sphere or, even better, onto the skybox. I can’t find anything for this problem, but I will keep searching. If you have anything in mind, it would be useful.
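For reference, a minimal sketch of the skybox idea: Unity’s built-in Skybox/Panoramic shader accepts an equirectangular texture, so a WebCamTexture can be assigned to it directly (the shader and _MainTex property are standard Unity; the device selection is simplified and untested):
using UnityEngine;

// Sketch: project an equirectangular 360 feed onto the skybox.
public class ThetaSkybox : MonoBehaviour
{
    void Start()
    {
        // Simplified: uses the default camera; select the THETA by name in practice.
        WebCamTexture cam = new WebCamTexture();
        cam.Play();

        // Skybox/Panoramic maps an equirectangular texture onto the skybox.
        Material sky = new Material(Shader.Find("Skybox/Panoramic"));
        sky.SetTexture("_MainTex", cam);
        RenderSettings.skybox = sky;
    }
}
The inside-out sphere alternative is the same idea with the texture on a sphere whose normals are flipped (or a front-face-culling shader).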
Hi,
sorry for the huge delay. I was quite frustrated and put off. So indeed I used the link as described.
To be sure it’s not a different problem, I checked and confirmed that the YouTube livestream via OBS Studio works. So in principle my THETA V allows live streaming, but it remains stubborn when it comes to Unity, even after adding DevicePath to the registry.
Hello,
I’m stuck on the live streaming setup in Unity. I installed the RICOH THETA UVC Driver 3.0.0 and tried live streaming. It works in OBS but not in Unity; Unity says “Couldn’t find a supported mediatype for the selected Virtual Camera”. I need help!
Here is the script I’m using.
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class ThetaV : MonoBehaviour
{
    void Start()
    {
        // List every webcam Unity can see, so the THETA's index shows up in the log.
        WebCamDevice[] devices = WebCamTexture.devices;
        Debug.Log("number of web cams connected: " + devices.Length);
        for (int i = 0; i < devices.Length; i++)
        {
            Debug.Log("webcam device " + i + " is " + devices[i].name);
        }

        // The THETA shows up as device 2 on this machine; guard against a shorter list.
        if (devices.Length <= 2)
        {
            Debug.LogError("Expected the THETA at index 2, but found only " + devices.Length + " device(s).");
            return;
        }
        string thetaName = devices[2].name;
        Debug.Log("The webcam name is " + thetaName);

        // Render the camera feed on this object's material.
        Renderer rend = this.GetComponentInChildren<Renderer>();
        WebCamTexture mycam = new WebCamTexture();
        mycam.deviceName = thetaName;
        rend.material.mainTexture = mycam;
        mycam.Play();
    }
}
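For anyone hitting the same media-type error, a variant worth trying: select the THETA by name instead of a fixed index, and request an explicit resolution and frame rate through the WebCamTexture constructor. The device name substring and the 3840x1920 mode are assumptions about what the UVC driver exposes, so treat this as an untested sketch:
using UnityEngine;

// Sketch: select the THETA by name and request an explicit capture mode.
public class ThetaByName : MonoBehaviour
{
    void Start()
    {
        string thetaName = null;
        foreach (WebCamDevice d in WebCamTexture.devices)
        {
            // Name substring is an assumption; check the Debug.Log output above.
            if (d.name.Contains("THETA")) { thetaName = d.name; break; }
        }
        if (thetaName == null)
        {
            Debug.LogError("No THETA found among webcam devices.");
            return;
        }

        // Requesting an explicit resolution/frame rate may help Unity pick a
        // supported media type (3840x1920 at 30 fps is an assumption, untested).
        WebCamTexture cam = new WebCamTexture(thetaName, 3840, 1920, 30);
        GetComponentInChildren<Renderer>().material.mainTexture = cam;
        cam.Play();
    }
}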