Connecting RICOH THETA Z1 with Unity

I am trying to stream video from a THETA Z1 over USB by using the camera texture as a skybox material.

I followed this tutorial, but I am getting errors.

When I define camName = "RICOH THETA Z1 4K", I get this error: Couldn't find a supported mediatype for the selected Virtual Camera

When I define camName = "RICOH THETA Z1", I get this error:
Could not connect pins - RenderStream()

Here is my code:

using UnityEngine;

public class Get360Camera : MonoBehaviour
{
    static WebCamTexture Camera;
    string camName = "RICOH THETA Z1 FullHD"; // Name of your camera
    public Material camMaterial;              // Skybox material

    void Start()
    {
        // Print all video sources for debugging purposes.
        WebCamDevice[] devices = WebCamTexture.devices;
        Debug.Log("Number of web cams connected: " + devices.Length);
        for (int i = 0; i < devices.Length; i++)
        {
            Debug.Log(i + " " + devices[i].name);
        }

        if (Camera == null)
            Camera = new WebCamTexture(camName, 3840, 1920); // Resolution you want

        if (!Camera.isPlaying)
            Camera.Play();

        if (camMaterial != null)
            camMaterial.mainTexture = Camera;
    }
}

That tutorial is 6 years old. The new driver will work directly with Unity. Use the driver here:

Live video is not displayed on a Computer [RICOH THETA Z1] | RICOH THETA

Hi there,

I’ve been attempting to connect my Z1 camera with Unity, but unfortunately, I’ve hit a snag.

I’m currently using Unity version 2022.3.17f1 along with the 3.0.0 UVC driver.

Interestingly, I’ve had no issues connecting the camera to OBS. However, whenever I attempt to do the same in Unity, the program crashes without displaying any error messages. I’ve tested my code with various other cameras, and it seems to function properly.

I’ll provide the code below for reference:

using UnityEngine;

public class getRicohStream : MonoBehaviour
{
    void Start()
    {
        WebCamDevice[] devices = WebCamTexture.devices;

        // Printing out all video sources for debugging purposes.
        Debug.Log("Number of web cams connected: " + devices.Length);

        for (int i = 0; i < devices.Length; i++)
        {
            Debug.Log(i + " " + devices[i].name);
        }

        Renderer rend = this.GetComponentInChildren<Renderer>();

        // Note: this assumes the THETA is the first device in the list.
        WebCamTexture mycam = new();
        string camName = devices[0].name;
        mycam.deviceName = camName;

        Debug.Log("The camera name is: " + camName);
        rend.material.mainTexture = mycam;

        mycam.Play();
    }
}
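
In case it's relevant: devices[0] is not guaranteed to be the Z1 if other (virtual) cameras are attached. A variant I could try, searching the device list by name instead; a minimal, untested sketch, where the substring "RICOH THETA" is an assumption about how the driver reports the device (check the Debug.Log output for the exact name):

using UnityEngine;

public class GetThetaByName : MonoBehaviour
{
    void Start()
    {
        // Search for the THETA by name instead of assuming it is devices[0].
        // "RICOH THETA" is an assumed substring; check your own device list.
        foreach (WebCamDevice device in WebCamTexture.devices)
        {
            if (device.name.Contains("RICOH THETA"))
            {
                WebCamTexture mycam = new WebCamTexture(device.name);
                GetComponentInChildren<Renderer>().material.mainTexture = mycam;
                mycam.Play();
                return;
            }
        }
        Debug.LogWarning("No RICOH THETA device found.");
    }
}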

Thank you for your assistance.

Have you read this post?


First and foremost, thank you for your prompt response!

As I was perusing the forum earlier today in search of a solution, I did indeed come across the topic you mentioned. Utilizing OBS as a virtual camera allowed me to integrate the Z1 with Unity.

However, having just completed my testing, I’m uncertain whether the lower image quality and darker appearance stem from settings within the OBS virtual camera or within Unity itself. Notably, the image quality I obtain directly from OBS is significantly better.

Nevertheless, I intend to continue exploring this solution further. In the meantime, I’d like to inquire whether anyone has discovered an alternative method to bypass the OBS virtual camera or has insights into why Unity crashes when directly connected.

I’m not that familiar with Unity and don’t know if there’s a way to increase the brightness inside of Unity. I do not believe the brightness of the Z1 can be adjusted for the livestream within the camera settings.
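
One untested idea, though: if the stream ends up on a material, its tint can be pushed above white from script. A minimal sketch, assuming a URP Unlit material (whose tint property is named _BaseColor; the built-in pipeline's Unlit/Texture shader has no tint property at all):

using UnityEngine;

public class BrightnessTweak : MonoBehaviour
{
    [Range(0.5f, 3f)]
    public float brightness = 1.5f; // multiplier applied to the video texture
    public Material camMaterial;    // the URP Unlit material showing the stream

    void Update()
    {
        // "_BaseColor" is the tint property of URP's Unlit shader (assumption:
        // the project uses URP). Values above 1 brighten the texture.
        camMaterial.SetColor("_BaseColor", new Color(brightness, brightness, brightness, 1f));
    }
}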

@biviel has a plugin that can output RTSP. It may improve the Unity connection, as he's actively developing it and may be able to adjust the stream characteristics from the camera. You may need to use VideoLAN / vlc-unity · GitLab (free if you build from source) to get the RTSP stream into Unity (untested).

A member in the community tested the THETA with C#. We have an unconfirmed theory that using AForge may work. I personally have not tried it. Also, to be honest, I don’t know Unity well enough to assess if the technology would help.

Google Code Archive - Long-term storage for Google Code Project Hosting.

If you run it through Google Translate, you can read this article:

https://whoopsidaisies.hatenablog.com/entry/2013/11/13/014226


Thank you for your assistance!

I will delve into it further and explore potential solutions.
Should I make any progress, I’ll be sure to post it.

Keeping my fingers crossed for a positive outcome!


Have you tested the source filters in OBS?

Greetings,

No, I haven’t tested the OBS filters.

However, I discovered that the issue was with Unity. After some testing, I realized that I was using the wrong type of material, Lit, instead of Unlit. Additionally, I encountered a peculiar situation where I had to choose “Universal Render Pipeline/Unlit” rather than the standard Unlit option to render the inside of the sphere correctly, despite flipping the normals.

Here are some screenshots comparing the Before (Lit Material) with the After (Unlit Material):

I also want to share another observation. Initially, when I projected the 360º image onto the sphere, I noticed some distorted lines, especially around the plinth. I experimented with changing the shape of the sphere, making it more oval or even flattening some sides, but these adjustments didn’t yield the desired results. What proved to be more effective in reducing distortion was adjusting the size of the sphere. Ultimately, I settled on a diameter of 1f. While I haven’t yet determined a direct correlation between sphere size and lens type, through trial and error, I found that this size minimized distortion the most.

In the images I used to showcase the difference between Lit and Unlit materials, one can also observe the disparity in distortion. In the first image with the Lit material, the distortion in the lines is much clearer, whereas in the second image with the Unlit material, the distortion was significantly reduced.
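
For anyone hitting the same issue, here is a minimal sketch of how that material can be created and assigned from script (assumptions: the project uses URP, where "Universal Render Pipeline/Unlit" exposes its texture slot as _BaseMap, and the device name is whatever your camera list reports):

using UnityEngine;

public class AssignUnlitStream : MonoBehaviour
{
    void Start()
    {
        // URP's Unlit shader; on the built-in pipeline, "Unlit/Texture" is the equivalent.
        Material mat = new Material(Shader.Find("Universal Render Pipeline/Unlit"));

        WebCamTexture cam = new WebCamTexture("RICOH THETA Z1 4K"); // assumed device name
        mat.SetTexture("_BaseMap", cam); // _BaseMap is URP Unlit's texture slot

        GetComponent<Renderer>().material = mat;
        cam.Play();
    }
}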


You did this by connecting the Z1 to OBS via USB as a webcam? And then, from OBS, which format (RTMP, WebRTC, etc.?), encoding, and resolution did you use to stream to the Quest 2? In the Quest, you were running a Unity app?

Wow! Thanks for sharing these tips. I’m sure these will help other people. As I mentioned, I’m not that experienced with Unity and didn’t know about the unlit versus lit material.

Additionally, I encountered a peculiar situation where I had to choose “Universal Render Pipeline/Unlit” rather than the standard Unlit option to render the inside of the sphere correctly, despite flipping the normals.

Are you using the skybox as suggested by RicoB?

We were originally using a sphere with flip normals. However, most people started using the skybox technique to avoid the vertices.
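
For reference, a minimal sketch of the skybox technique, using Unity's built-in "Skybox/Panoramic" shader, which expects an equirectangular texture like the Z1's (the device name here is an assumption):

using UnityEngine;

public class WebcamSkybox : MonoBehaviour
{
    void Start()
    {
        // "Skybox/Panoramic" maps a latitude/longitude (equirectangular) texture onto the sky.
        Material sky = new Material(Shader.Find("Skybox/Panoramic"));

        WebCamTexture cam = new WebCamTexture("RICOH THETA Z1 4K"); // assumed device name
        sky.SetTexture("_MainTex", cam);

        RenderSettings.skybox = sky; // replaces the scene's skybox material
        cam.Play();
    }
}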

The picture in your example looks fantastic.


This is really cool! Thanks for posting. I don't use Unity with THETA much, but I did a VR demo at DeveloperWeek several years ago, and I remember us having, I believe, the exact same issue. Thanks for the information!


Here is how I achieved live feed visualization from the Theta Z1 in Unity, using two different methods:

First Method - USB:

  1. Installed the UVC Driver.
  2. Connected the camera to the PC via USB.
  3. Set up the camera source in OBS as Video Capture Device.
  4. To pass the live feed to Unity, I utilized the virtual camera feature in OBS, as described in my initial post.
  5. Finally, the image is projected onto a sphere with inverted normals (I used an Unlit texture for the appropriate coloration and brightness).

Second Method - RTSP Connection (implemented this afternoon):

  1. Installed the Theta RTSP Streaming plug-in on the Z1.
  2. Connected the Z1 via CL (client) mode (currently tested on 2.4 GHz).
  3. Identified the camera’s IP.
  4. In OBS, selected Media Source as the source and input the following URL: rtsp://user:pass@camIP:8554/live?resolution=1920x960.
    (We can use different resolutions, as stated in: THETA RTSP Streaming)
  5. To stream to Unity, I'm still using the virtual camera.

In terms of virtual reality, I’m using the HTC Vive Pro 2 kit in conjunction with Steam VR.

Currently, due to a considerable delay (I haven't tested to determine the exact value yet), I'm attempting to use the 5 GHz connection, but I'm not succeeding because the camera is not able to connect to the router. Does anyone have any ideas regarding this?


No, I’ve been using a sphere, but I’m going to give it a try with the skybox.


You can set 5GHz in the API or the mobile app. The Z1 will not automatically switch from 2.4GHz to 5GHz. This is a one-time setup.

With the API:

theta-api-specs/theta-web-api-v2.1/options/_wlan_frequency.md at main · ricohapi/theta-api-specs · GitHub

With the mobile app:

https://support.theta360.com/en/manual/z1/content/settings/settings_02.html
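
For completeness, a rough sketch of that API call from C# (assumptions: the camera is connected in AP mode at its default address 192.168.1.1, and Unity 2022.2+ for this UnityWebRequest.Post overload; the option name and values come from the spec linked above):

using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

public class SetWlanFrequency : MonoBehaviour
{
    IEnumerator Start()
    {
        // camera.setOptions with _wlanFrequency = 5 (valid values: 2.4 or 5).
        string url = "http://192.168.1.1/osc/commands/execute"; // default AP-mode address
        string json = "{\"name\":\"camera.setOptions\",\"parameters\":{\"options\":{\"_wlanFrequency\":5}}}";

        using (UnityWebRequest req = UnityWebRequest.Post(url, json, "application/json"))
        {
            yield return req.SendWebRequest();
            Debug.Log(req.result + " " + req.downloadHandler.text);
        }
    }
}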

Wow, good memories from the past. June 2018.

Did you show the moth in the demo?

I remember that I was interested in Mothra from the Godzilla movies and I was trying to get an asset similar to Mothra for the demo.


I see. I wonder how much better it would look in HDR mode with H.265 encoding. 1920x960 seems a bit low for an HMD device.

Are you familiar with VLC-Unity? I'm aiming to stream directly into headsets, without the need for a PC and OBS Studio, using the SRT stream from the plugin I developed. So in theory it could stream to Unity on a Meta Quest 2/3, but I assume other Unity-compatible devices should work too. Grabbing the HDR preview of the Z1 in-camera makes a huge difference in image quality, as does adjusting brightness, etc., in the camera. Of course, the plugin also works for streaming to OBS as a capture device wirelessly.

So Unity is running on a high-end PC, and the HTC Vive Pro 2 is used as the display, connected with wires?

In the old demo from June 2018, we had the HTC Vive connected to a Windows PC with wires. Back then (5 or 6 years ago), we needed a gaming laptop with a discrete mid-range GPU (higher end for laptops) for demos. In 2024, I think a normal laptop can handle it.

I have not tried VLC-Unity yet. It’s a library, so we would need some sample code to get the stream into Unity.
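
To sketch what that sample code might look like: the VLC-Unity package ships with demo scenes, and the core is only a few lines. The following is a rough, untested sketch paraphrased from the plugin's minimal-playback example; the namespace, the Media constructor, and especially the GetTexture signature vary between LibVLCSharp/VLC-Unity versions, so treat it as a starting point, not a working implementation:

using System;
using UnityEngine;
using LibVLCSharp; // older builds use LibVLCSharp.Shared

public class RtspToTexture : MonoBehaviour
{
    LibVLC _libVLC;
    MediaPlayer _mediaPlayer;
    Texture2D _tex;

    void Start()
    {
        Core.Initialize(Application.dataPath); // loads the native libvlc plugin
        _libVLC = new LibVLC(enableDebugLogs: true);
        _mediaPlayer = new MediaPlayer(_libVLC);

        // URL from the RTSP post above; srt:// would be used for biviel's plugin.
        _mediaPlayer.Media = new Media(new Uri("rtsp://user:pass@camIP:8554/live"));
        _mediaPlayer.Play();
    }

    void Update()
    {
        // GetTexture hands back a native texture pointer once frames are decoded.
        // Signature paraphrased from the plugin's sample; may differ per version.
        IntPtr texPtr = _mediaPlayer.GetTexture(1920, 960, out bool updated);
        if (!updated || texPtr == IntPtr.Zero)
            return;

        if (_tex == null)
        {
            _tex = Texture2D.CreateExternalTexture(1920, 960, TextureFormat.RGBA32, false, true, texPtr);
            GetComponent<Renderer>().material.mainTexture = _tex;
        }
        else
        {
            _tex.UpdateExternalTexture(texPtr);
        }
    }
}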

Note that RicoB had the comment below regarding the inverted sphere.

if you decide to use it [inverted sphere] in room-scale VR - the sphere won’t move with you and it’ll just be weird (unless you make the sphere huge).

I’m not sure if that consideration is relevant to your application.
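
If the sphere route is kept, one simple workaround (an untested sketch) is to re-center the inverted sphere on the HMD camera every frame, so walking around in room-scale doesn't move the viewer off-center:

using UnityEngine;

public class FollowHmd : MonoBehaviour
{
    public Transform hmdCamera; // the tracked camera, e.g. Camera.main.transform

    void LateUpdate()
    {
        // Keep the inverted sphere centered on the viewer. Only position follows;
        // rotation stays fixed so the video doesn't spin with the user's head.
        transform.position = hmdCamera.position;
    }
}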

I've heard of it, but I haven't worked with it yet.

I still have little experience working with cameras, but I imagine HDR will help a lot.

As @craig said,

Personally, I'm using a laptop with 32 GB of RAM, a Ryzen 9 7945HX, and an RTX 4060, and I've had no problems.
As for the HTC Vive (I hope I'm not talking rubbish), from what I know, I always need an auxiliary computer to run the application. However, the PC-to-headset communication is wireless.
