Hi all, I’ve gotten pretty far with this post, great work all, but streaming has this weird segmentation.
Any advice?
Thanks
Chris
Thanks Craig, the skybox worked for me. Just what I wanted. Thank you
@Christopher_Paton, thanks for reporting back on the success. It helps others.
The original Unity examples in this topic are old and I should reorganize the examples to help people find the updated information.
Let us know if you create something cool.
Trying to set up telepresence robot operation. Just trying to learn Unity to get the Oculus Quest controllers mapped.
You may find these articles interesting.
The examples below are not using Unity. The developers for the two projects below are on this forum. You may be able to exchange ideas on implementation.
You, sir, are a legend. Thank you.
Chris
Hello everyone.
I am using a THETA V for a project where I want to get the live stream from the camera onto the screen of a phone, so that it can be viewed in real time with a VR headset (so the picture format has to suit that). Could anyone please help me out?
Wired or wireless, either type of communication works; the only requirement is that lag must be minimized. I am looking for existing apps or methods that could help me achieve the above aims.
Thank you.
You can test out the Lockheed Martin Amelia Drone app. View it in a browser and use A-Frame (included in the app) to set up the interface with the VR headset.
Otherwise, use the RTSP streaming method by @Hugues.
You can run A-Frame either in a browser or in a WebView (untested with the app) in your mobile project.
Thank you.
The Amelia Viewer app did work for getting the live stream, with very low latency. However, we did not understand how A-Frame is used. I would appreciate it if you could point us to a guide or documentation for setting up the interface with the VR headset/mobile phone.
I also wanted to ask how the VR headset/mobile phone is connected to the Amelia Viewer app. Is it a wired connection (through a USB cable) or is there some other way to connect them?
Thank you. The documentation was quite useful for understanding how the interface with the VR headset is set up.
However, we do not have an Oculus/Vive headset, so we are looking for a method that would let us do the same thing with any VR headset.
I would like to describe two tests that we conducted.
Test 1: We tried the application that you had created for the project.
Test 2: We tried to connect the phone by using the local network of the PC as a server. The source repository can be found here.
1. Download the source repo from here.
2. Open `cmd.exe`.
3. Run `cd Downloads/amelia_viewer-master` (assuming the folder is in your Downloads folder).
4. Run `npm i`.
5. Run `npm start`. "Ready for changes" is displayed and the app opens up.
6. Open another `cmd.exe`.
7. Run `npm i -g https-proxy`.
8. Run `https-proxy --target=http://localhost:8181 --port=8182`. 'Proxying to "http://localhost:8181" through "https://localhost:8182"' is displayed.
9. Open Firefox on the PC and go to http://127.0.0.1:8181/.
10. Open the mobile browser and go to https://a.b.c.d:8182 (replace a.b.c.d with your IP address).
There was an issue with local video playback, which I have also attempted to fix in the attached source code. However, the loading timeout is 3000 ms, so large videos could not be loaded, and increasing the timeout value did not help. This is not a concern for us at the moment; I mention it only because we were not sure whether it is related to whatever is causing the issue with the live stream.
Could you please help us out here?
Thank you. Have a great day!
Hi everyone,
It's 2020 and I feel like I'm going back in time, unable to get rid of the "Could not connect pins" error in Unity.
I use Windows 10 with Unity 2020.1. I have updated the RICOH THETA V to the latest firmware (3.50.1) and installed the current UVC Blender driver.
I now successfully manage to live stream to YouTube via OBS, and Unity also recognizes the THETA.
However, using the script attached to the sphere (as above, and still up to date according to the Unity documentation):
```csharp
// Attached to the sphere; the device name must match the driver name exactly
Renderer renderer = this.GetComponentInChildren<Renderer>();
WebCamTexture mycam = new WebCamTexture("RICOH THETA V/Z1 4K");
renderer.material.mainTexture = mycam;
mycam.Play();
```
gives me the error this whole thread started with:
“Could not connect pins - RenderStream()”
Anyone recently experienced the same? Is there any updated workaround?
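For reference, one quick diagnostic (a minimal sketch, not from the original post; the class name is my own) is to list the devices Unity actually sees, to confirm the exact name to pass to `WebCamTexture`:

```csharp
using UnityEngine;

// Diagnostic sketch: log every webcam device Unity can see, so the
// exact THETA driver name can be copied into the WebCamTexture call.
public class ListWebCamDevices : MonoBehaviour
{
    void Start()
    {
        foreach (WebCamDevice device in WebCamTexture.devices)
        {
            // The THETA UVC driver should appear here,
            // e.g. "RICOH THETA V/Z1 4K"
            Debug.Log("Found device: " + device.name);
        }
    }
}
```

If the logged name differs from `"RICOH THETA V/Z1 4K"`, that mismatch alone can make the texture fail to start.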
Are you using the updated driver? See the lower right-hand corner of the table; it is just below the table.
Hello @craig!
I have gone through the whole thread, but I've been experiencing the same problem as @Rob_Erta. I have installed the current live-streaming app for Windows (RICOH THETA UVC V/Z1, v2.0.0).
OBS live streaming works fine. Is there a workaround?
Hi, thanks for posting your question. It’s great to see Unity development still going strong.
I need to reinstall Windows 10 due to a hard disk problem and then I’ll run this again.
Can you post what version of the camera firmware, Unity, and Windows 10 you are using so that we have baseline information that might help trace the problem?
I am using a THETA V with the latest firmware (3.50.1), Windows 10 Home edition, and Unity version 2019.4.6f1.
Also @craig,
Would it be possible to give an answer to this comment of mine on Wireless streaming?
Sorry for asking so many things at once,
Dimitris
Hi @dimdallas,
I just tested some setup I was working on:
Windows 10 (20H2, build 19042.746) with WSL2 (Windows Subsystem for Linux) running Ubuntu 18.04 LTS.
I installed NGINX (with the RTMP module) on the Ubuntu instance running in WSL2 (example config), and on that same Windows machine watched the stream with ffmpeg/ffplay:
ffplay -fflags nobuffer -i "rtmp://192.168.2.118/live/z1"
(`z1` is the stream key.)
The ffmpeg/ffplay approach also works for the "THETA RTSP Streaming Plug-In".
Since this thread is about Unity, one could theoretically use FFmpeg as a plugin (DLL) in Unity and pipe the ffmpeg/ffplay output onto a texture.
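A rough sketch of that idea (untested; the class name, resolution, and RTSP URL are my assumptions, and a real implementation would read frames on a background thread rather than blocking in `Update`): launch ffmpeg as an external process, have it decode the stream to raw RGB24 frames on stdout, and copy each frame into a `Texture2D`:

```csharp
using System.Diagnostics;
using UnityEngine;

// Hypothetical sketch: pipe raw RGB24 frames from an external ffmpeg
// process into a Unity Texture2D. Resolution and URL are assumptions.
public class FfmpegPipeTexture : MonoBehaviour
{
    const int Width = 1920, Height = 960;        // assumed live resolution
    Texture2D tex;
    Process ffmpeg;
    byte[] frame = new byte[Width * Height * 3]; // one RGB24 frame

    void Start()
    {
        tex = new Texture2D(Width, Height, TextureFormat.RGB24, false);
        GetComponent<Renderer>().material.mainTexture = tex;

        ffmpeg = new Process();
        ffmpeg.StartInfo.FileName = "ffmpeg";
        ffmpeg.StartInfo.Arguments =
            "-fflags nobuffer -i rtsp://192.168.XXX.XXX:8554/live " +
            "-f rawvideo -pix_fmt rgb24 -";      // raw frames to stdout
        ffmpeg.StartInfo.UseShellExecute = false;
        ffmpeg.StartInfo.RedirectStandardOutput = true;
        ffmpeg.Start();
    }

    void Update()
    {
        // Read exactly one frame per Update (blocking, for simplicity).
        var stdout = ffmpeg.StandardOutput.BaseStream;
        int read = 0;
        while (read < frame.Length)
        {
            int n = stdout.Read(frame, read, frame.Length - read);
            if (n <= 0) return;                  // stream ended or stalled
            read += n;
        }
        tex.LoadRawTextureData(frame);
        tex.Apply();
    }
}
```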
Piotr
Hi @pstawicki,
I just tried the THETA RTSP Streaming Plug-In and successfully viewed the stream with `ffplay -fflags nobuffer -i rtsp://192.168.XXX.XXX:8554/live?resolution=1920x960`.
But I am searching for a solution for FFmpeg projection in Unity. The final target is projecting onto an inside-out sphere, or even better, onto the skybox. I can't find anything for this problem, but I will keep searching. If you have anything in mind, it would be useful.
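For the skybox half of that goal, a minimal sketch (my own assumption of a setup, using a material based on Unity's built-in "Skybox/Panoramic" shader with Mapping set to Latitude Longitude Layout) might look like:

```csharp
using UnityEngine;

// Sketch: feed the THETA's WebCamTexture into a panoramic skybox
// material, so the equirectangular live feed surrounds the camera.
public class ThetaSkybox : MonoBehaviour
{
    public Material skyboxMaterial; // assign a Skybox/Panoramic material

    void Start()
    {
        WebCamTexture cam = new WebCamTexture("RICOH THETA V/Z1 4K");
        // _MainTex is the spherical texture slot of Skybox/Panoramic
        skyboxMaterial.SetTexture("_MainTex", cam);
        RenderSettings.skybox = skyboxMaterial;
        cam.Play();
    }
}
```

The inside-out sphere variant would instead assign the texture to the sphere's material, as in the earlier snippets in this thread.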