Can RICOH THETA Z1 be used in Unity as a webcam?

I've spent some time lately on the backend server to cover live streaming to Oculus/Meta Rift, Quest, Quest 2/3, and HTC Vive HMDs. It works, but it's still a work in progress.

It would be great to have a RICOH THETA Z2 capable of 8K live streaming with in-camera stitching. :slight_smile: With the Z1 it's also a great experience.

I'm searching for the best infrastructure to cover the globe, testing various CDNs and AWS network/transfer services. I'll have a server up and running in the US soon, too.

Btw. do you or Greg own a Meta Quest 2 or 3? :wink:

Wow, this seems like a big upgrade.

We do not have a Meta Quest 2 or 3 right now. It seems like the Meta Quest 2 is under $300, which is within reach.

Do you foresee use of the Meta Quest 2 or 3 in entertainment, experiential sales (like house tours), inspection, telepresence from a drone?

Yes, absolutely! The extra-low-latency requirement can only be met on a local network for now; controlling a drone may work with ~200 ms latency, but not when ~2 seconds of latency is in place. 4K resolution is still the limit in this case, both for 360 cameras and for delivery to viewers. For better image quality, ~4 seconds of latency helps, and some interactivity with the presenter can still be achieved.

There are some limitations in how 360 video is rendered inside these VR devices, so some tweaking will be required there too, like I did with the Z1. I'll hopefully get there this year.


Hi, I'm currently in the process of putting my camera into dev mode. I'd appreciate it if you could share the APK directly with me; if you can't, that's fine, as the update will be out in a few days anyway.


Here are two documents detailing the goals of our project. Essentially, we aim to set up a RICOH THETA Z1 camera to wirelessly stream to a VR headset (Meta Quest 2) with as little latency as possible. The camera will watch over a robotic arm controlled by a motion sensor that tracks the motion of the user's hand and sends it to the robotic arm so that it mimics the user's hand.
Remote_Robot_Control_with_Low-cost_Robotic_Arms_and_Human_Motions (5).pdf (798.8 KB)
Demo_I_Am_a_Robot_First-person_Robotic_Arm_Control_with_Hand_Motions_in_Virtual_Reality (4).pdf (6.6 MB)


Are you streaming the camera into the headset without a computer between the camera and the headset?

@biviel , how are you getting the RTSP stream into the Quest 2 headset? Are you using LibVLCSharp for Unity?

Maybe it will just work…

It will be weeks before the plugin gets to the plugin store, and it may take me more time to settle all those options; I'm not planning to expose all functionality at once. I have control over that: my website generates a JSON configuration data feed, and via the web UI not all options are exposed to the public. I'm going step by step. There are a lot of parameters to adjust and no graphical UI for each one yet, so it will take more time.

Sure, I'm glad to share it with you directly in the first place if it helps your project.

On the Meta Quest I'm testing my streams by using the built-in browser and watching my stream directly on my website. So not directly: on the Meta Quest I'm not aware of any app that could directly ingest and play an incoming RTSP or SRT stream. I was thinking of creating one, but I decided to try support via the browser instead.

@HunterGeitz ,
how are you going to watch the stream on the Meta Quest 2? What is your plan for OBS Studio and connecting it to the Quest 2?

Is it difficult to support WebRTC from your plugin?

WebRTC | WebRTC | 2.4.0-exp.11

Once the video is in the browser inside the headset, is it possible to use pre-built JavaScript libraries for the navigation?

Remember the Amelia drone project?

Successful Theta V stream from drone to VR headset 0.25 miles away - #27 by Jake_Kenin

The project used motionjpeg and the browser inside the VR headset.

I would recommend you write your own Theta V plugin that establishes a WebRTC connection with the base station. WebRTC is a P2P media streaming standard that was designed for real time communication, and uses H.264 (which the Snapdragon 625 has hardware encoding for) for video encoding. It is well documented and is your best bet for low latency video streaming at high resolution and framerate. It is sort of complex and will take some time, but it would be well worth it. Here are some examples of WebRTC.
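Codec choice in WebRTC is negotiated via SDP, so forcing H.264 usually means either `RTCRtpTransceiver.setCodecPreferences` (in newer browsers) or reordering payload types in the offer before it is applied. A minimal TypeScript sketch of the latter (a hypothetical helper, not code from the drone project):

```typescript
// Hypothetical sketch: move H.264 payload types to the front of the SDP
// video m-line so the negotiated codec is H.264 (which the Snapdragon 625
// can hardware-encode). Payload type numbers vary per browser.
function preferH264(sdp: string): string {
  const lines = sdp.split("\r\n");

  // Collect the payload types whose rtpmap attribute names H264.
  const h264: string[] = [];
  for (const line of lines) {
    const m = /^a=rtpmap:(\d+) H264\//.exec(line);
    if (m) h264.push(m[1]);
  }

  // Rewrite the video m-line with H.264 payload types first.
  return lines
    .map((line) => {
      const m = /^(m=video \d+ [A-Z/]+) (.+)$/.exec(line);
      if (!m) return line;
      const pts = m[2].split(" ");
      const rest = pts.filter((p) => !h264.includes(p));
      return `${m[1]} ${[...h264, ...rest].join(" ")}`;
    })
    .join("\r\n");
}
```

In a real app you would run the offer or answer SDP through a function like this before calling `setLocalDescription()` / `setRemoteDescription()`.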

Does this live preview work with the Z1 too? GitHub - Oppkey/thetax_live_preview: RICOH THETA X Live Preview Demo

It works with the Z1.


There's a better example here:

GitHub - ricohapi/theta-client: A library to control RICOH THETA

However, it won’t run on the desktop.

That version only shows the equirectangular view. However, I know how to modify it for 360 navigation.
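At its core, going from a flat equirectangular view to 360 navigation means projecting the frame onto a sphere: each (u, v) texture coordinate corresponds to a direction on the unit sphere. A small sketch of that mapping (a hypothetical helper; axis conventions vary between engines):

```typescript
// Hypothetical helper: map an equirectangular texture coordinate
// (u, v in [0, 1]) to a unit direction vector. u sweeps yaw across the
// full 360°, v sweeps pitch from the top pole to the bottom pole.
function equirectToDirection(u: number, v: number): [number, number, number] {
  const lon = (u - 0.5) * 2 * Math.PI; // yaw: -π .. π
  const lat = (0.5 - v) * Math.PI;     // pitch: -π/2 .. π/2
  return [
    Math.cos(lat) * Math.sin(lon), // x: right
    Math.sin(lat),                 // y: up
    Math.cos(lat) * Math.cos(lon), // z: forward
  ];
}
```

A 360 viewer inverts this per screen pixel: it takes the camera's view direction, recovers (lon, lat), and samples the equirectangular frame at the corresponding (u, v).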

@biviel , I added an example of the 360 view with a toggle in the top appbar. I tested with the Z1.


It was difficult to keep high quality when I last looked into it several years back. I tried using native C++ libraries and functions for best performance, but that may not be the only way.

I remember, but not in detail at all.

It used motion JPEG from the live preview of the V. Resolution was limited to 1920x1080 JPEGs @ 15 fps. However, the specs of the camera only go up to 1920x1080 at 8 fps.

They used a Ubiquiti NanoBeam 5AC on the ground.

Viewing was done through a JavaScript viewer. It could be viewed in an HTC Vive.

Thanks for sharing, @craig ! Sending local WebRTC from the Z1 would work, and previewing it in the browser of the Meta Quest and other devices, in theory in 4K and H.264 only, without audio, because no proper audio encoder is available in THETA devices for WebRTC. But I'm busy with other things now, so I'm not sure when I'll be able to get there. For the quality I'm aiming for, it would take more effort, and I'm not sure about it at all.

It would seem better to focus on RTSP to vlc-unity, not WebRTC, at the moment.


With my new plugin I'm already able to stream to desktop VLC directly, but instead of RTSP I used SRT. I'm not aware of the differences between vlc-unity and VLC, but I assume it should work in Unity too, with H.265 encoding, wirelessly.

Did you have to add any type of module to VLC to get it to ingest an SRT stream? I don't see it listed in my Input / Codecs → Access modules.

Do I need to install some type of plugin?


Hi @craig ,
I will prepare a video showing how to use it soon. I have VLC 3.0.18 here, but I checked version 3.0.20 too; both of them are bundled with the SRT module:

I can connect to my module directly and pull and watch the stream in H.265, H.264, HDR mode, etc., with extra low latency at full 4K quality. However, I wasn't able to make VLC play this in 360 mode; I may need a custom build or extension, and then we'd be done here.

Now in VLC Unity, I assume that if the srt_input access module is there, it should work the same way.

When my HDR Wireless Live Streaming plugin is turned on and running on the Z1, connected via Wi-Fi to the same network as the device running VLC, I start the stream in VLC with a URL of the form srt://&lt;Z1-IP&gt;:&lt;port&gt;, where the IP is the Z1's IP address and the port is the port set on the Z1 for SRT. A stream ID or a passphrase may be added there too if needed.
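A small sketch of how such an SRT address could be assembled (a hypothetical helper, not part of the plugin; the `streamid` and `passphrase` parameter names follow common libsrt conventions, so verify them against your VLC build):

```typescript
// Hypothetical helper: build the SRT URL to paste into VLC's
// "Open Network Stream" dialog. Query parameter names (streamid,
// passphrase) follow common libsrt conventions and are an assumption.
function buildSrtUrl(
  host: string,
  port: number,
  opts: { streamId?: string; passphrase?: string } = {}
): string {
  const params = new URLSearchParams();
  if (opts.streamId) params.set("streamid", opts.streamId);
  if (opts.passphrase) params.set("passphrase", opts.passphrase);
  const query = params.toString();
  return `srt://${host}:${port}${query ? "?" + query : ""}`;
}
```

For example, a camera at 192.168.1.10 serving SRT on port 9000 would give `srt://192.168.1.10:9000`, with `?streamid=...` appended only when a stream ID is set.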

This is the preferred way of streaming, as I was able to get maximum quality and the lowest latency at the same time, and I can fine-tune the latency both in VLC and in the camera.

This feature isn't in the current plugin version in the store yet, but I can provide you with an APK to test if you'd like. I finished this recently, am doing some more tests, and will send it to RICOH for the update. Btw., I couldn't find the web form for submitting a plugin update, could you help please?