Stream video data wirelessly from Ricoh Theta Z1 to Unity?

Hi, how may I stream video data wirelessly from Ricoh Theta Z1 to Unity? There is a plugin to stream to YouTube, but how may I just stream it to Unity?

Most people are using a USB cable to the computer running the Unity app. See the topics below.

This may work.

Thanks, Craig! I could stream the video data from the Ricoh Theta Z1 to Unity and then to the Vive Cosmos VR headset (SteamVR plugin). However, loading the image onto the sphere's texture lags a lot. Do you know how I may speed this up?

Does it speed up with a Skybox instead of a texture?

I don’t have a Vive Cosmos. What plugin are you using? The SteamVR plug-in?

I have tried using a Skybox instead. The lag is still the same: every time I rotate the VR headset, the view is not rendered fast enough, so I see black for 1-2 seconds before seeing the image. Any suggestions?

I’m not sure how SteamVR works with the Vive Cosmos. Are you running another program on a PC or does it run only on the Vive Cosmos?

If it is running only on the Vive Cosmos, how do you connect the camera to the Vive Cosmos?

If you are connecting the camera to the Vive Cosmos directly over Wi-Fi, what protocol are you using? Are you streaming RTSP or something similar, or are you using motionJPEG?
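For reference, motionJPEG over HTTP is just a multipart stream of complete JPEG images. A minimal sketch of splitting such a stream into frames (in Python for brevity; the same marker-scanning logic applies in a C# Unity script):

```python
# Sketch: split a motionJPEG byte stream into individual JPEG frames by
# scanning for the JPEG start-of-image (FF D8) and end-of-image (FF D9)
# markers. Simplification: a real parser should honor the multipart
# boundary headers, since FF D9 can in principle occur inside JPEG data.

def split_mjpeg_frames(buffer: bytes) -> list[bytes]:
    frames = []
    start = 0
    while True:
        soi = buffer.find(b"\xff\xd8", start)    # start of image
        if soi == -1:
            break
        eoi = buffer.find(b"\xff\xd9", soi + 2)  # end of image
        if eoi == -1:
            break                                # incomplete trailing frame
        frames.append(buffer[soi:eoi + 2])
        start = eoi + 2
    return frames
```

In practice you would feed each extracted frame to a JPEG decoder (e.g. Unity's `Texture2D.LoadImage`) rather than collect them in a list.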

I’m streaming the video data from the Ricoh Theta Z1 to Unity which then streams to the VIVE Cosmos VR through SteamVR plugin (a package in the Unity project). Is there another/better way to stream data wirelessly from Ricoh camera to the VR headset? We want to also record the data for future reference.

Do you have something like a Windows PC that is connected to the RICOH THETA Z1 with a USB cable, so that the RICOH THETA Z1 appears to the Windows PC as a USB webcam?

Does it run on the Windows PC smoothly if you don’t transmit it to the Vive Cosmos headset using Wi-Fi?

Does your view look like this on the PC?

I guess the one shown in the video was streamed through a USB cable from the camera to the Windows PC? I streamed the camera's data wirelessly using HTTPWebRequest in Unity. The view in Unity looks like that, but lags a lot (~1 sec), and when it is sent to the wired VIVE Cosmos headset, the Skybox material update also lags, especially when the headset is rotated.
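A latency like that often comes from decoding every received frame on Unity's main thread and letting unread frames queue up. One common pattern is to read and parse the stream on a background thread and hand the renderer only the newest frame, dropping stale ones. A language-agnostic sketch of that "latest frame only" holder (shown in Python; the same pattern works with a C# lock in Unity):

```python
import threading

class LatestFrame:
    """Holds only the most recent frame. The network thread overwrites,
    the render thread takes; stale frames are dropped instead of queued,
    which keeps display latency bounded."""

    def __init__(self):
        self._lock = threading.Lock()
        self._frame = None

    def put(self, frame: bytes) -> None:
        with self._lock:
            self._frame = frame  # overwrite: never accumulate old frames

    def take(self):
        with self._lock:
            frame, self._frame = self._frame, None
            return frame  # None if no new frame arrived since last take
```

In Unity terms: poll `take()` once per `Update()`, and only call `Texture2D.LoadImage` when it returns a fresh frame.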

Why are you connecting the camera to the Windows PC with HTTPWebRequest?

I do not understand what "connect to Unity" means with regard to the physical camera. What is the connection between the physical camera and the physical computer, not the software connection?

If you’re using Wi-Fi to connect the physical THETA and the physical computer, can you use a USB cable instead?

We will put the camera on a robot, so we cannot connect the THETA to the PC with a cable; we need to connect the camera to the PC wirelessly.

Is there any way I may reduce the resolution of the Ricoh Theta stream, which may help reduce the lag?
I guess there’s a way to set that with camera.getLivePreview in a C# script in Unity?

For the livePreview, which is motionJPEG, you can set the resolution with:
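The original snippet is missing above. As a sketch (assuming the THETA Web API v2.1, with the camera in access-point mode at its default 192.168.1.1 address), the livePreview resolution is set through the `previewFormat` option of `camera.setOptions`:

```python
import json
import urllib.request

# Assumption: camera in access-point mode at its default address.
THETA_COMMAND_URL = "http://192.168.1.1/osc/commands/execute"

def build_preview_format_payload(width: int, height: int, framerate: int) -> dict:
    """camera.setOptions payload that sets the livePreview resolution."""
    return {
        "name": "camera.setOptions",
        "parameters": {
            "options": {
                "previewFormat": {
                    "width": width,
                    "height": height,
                    "framerate": framerate,
                }
            }
        },
    }

def set_preview_format(width: int, height: int, framerate: int):
    """POST the setOptions command to the camera (network call; will
    raise if no camera is reachable at THETA_COMMAND_URL)."""
    payload = build_preview_format_payload(width, height, framerate)
    req = urllib.request.Request(
        THETA_COMMAND_URL,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    return urllib.request.urlopen(req)
```

A lower setting such as 1024x512 at 30 fps should be lighter than the larger preview sizes; query `camera.getOptions` for `previewFormatSupport` to see what your firmware actually offers.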


livePreview is not designed for streaming. It is designed to supply a preview of the image scene before a picture is taken.

If you use it for streaming, you will have lower resolution or lower framerate.

Most people put the THETA Z1 on a robot with a USB cable attached to a Jetson Nano or a Raspberry Pi and then use the small computer to transmit the stream.
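As a sketch of that relay setup (assumptions: on Linux the Z1's USB live stream needs the community libuvc-theta / gstthetauvc driver, which provides the `thetauvcsrc` GStreamer element; `192.168.1.50` is the illustrative address of the PC running Unity; your receiving pipeline will differ):

```shell
# On the Jetson Nano / Raspberry Pi attached to the THETA Z1 over USB.
# The Z1 delivers H.264 over UVC, so no re-encode is needed: just parse,
# packetize as RTP, and send it over Wi-Fi to the PC running Unity.
gst-launch-1.0 thetauvcsrc \
    ! h264parse \
    ! rtph264pay \
    ! udpsink host=192.168.1.50 port=5000
```

This keeps the camera tethered only to the onboard computer, so the robot stays mobile while the PC receives a normal RTP/H.264 stream that can also be recorded.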

The Amelia drone project did use motionJPEG livePreview.

It streams into a headset.