Connecting RICOH THETA Z1 with Unity

This is great information.

Is there any difference if you stream 2K instead of 4K? I was just wondering as the stitching time for each frame is consuming a significant portion of the time needed to get output from the camera over USB.

Can a Mac run Unity? If you use a Mac for a test, you may be able to avoid the OBS virtual camera. I have not tried it.

@craig When using OBS on the same laptop as the hotspot, the CPU load shown in OBS varies from less than 1% to around 6%, so yes, it’s reasonable. At the moment we’re having another issue with using the camera in Unity. We can only enter Play Mode successfully when the RICOH THETA Z1 is wired to the laptop. When we use the wireless connection, or try to enter Play Mode with no camera connected at all, Unity crashes. We (@Caleb_Amadoro) believe this comes from the code we are currently using to attach the RICOH THETA live video feed to a material on the surface of a sphere. Could you let us know what code you’re using for the sphere material, or any other code that could potentially fix this issue?

Hi @HunterGeitz,

This is what I have done in my code: in Start() you check whether any webcams are connected, and if not, you end the program.

using UnityEngine;
using UnityEngine.UI;

public class WebCamSetup : MonoBehaviour
{
    // Fields the snippet relies on; the two images are assumed to be UI RawImages
    // assigned in the Inspector.
    [SerializeField] private RawImage pcImage;
    [SerializeField] private RawImage phoneImage;

    private Texture defaultBackground;
    private WebCamTexture pcCam;
    private WebCamTexture phoneCam;
    private bool pcCamAvailable;
    private bool phoneCamAvailable;

    // Gets the list of devices and prints them to the console.
    private void Start()
    {
        WebCamDevice[] devices = WebCamTexture.devices;
        for (int i = 0; i < devices.Length; i++)
            Debug.Log(devices[i].name);

        defaultBackground = pcImage.texture;

        if (devices.Length == 0)
        {
            Debug.Log("No devices connected!");

            pcCamAvailable = false;
            phoneCamAvailable = false;

            return;
        }

        pcCam = new WebCamTexture(devices[0].name, Screen.width, Screen.height);
        pcCam.Play();
        pcImage.texture = pcCam;

        pcCamAvailable = true;

        if (devices.Length > 1)
        {
            phoneCam = new WebCamTexture(devices[1].name, Screen.width, Screen.height);
            phoneCam.Play();
            phoneImage.texture = phoneCam;

            phoneCamAvailable = true;
        }
    }
}

P.S.: If you need to specify the camera, you can compare the camera name to a string. If the name doesn’t match, you can end the program, or you can wait for a connection (be careful with infinite loops…).

if (devices[i].name != "My webcam name")
{
    camAvailable = false;
    return;
}
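
If you would rather wait for the camera to show up than end the program, a minimal sketch of that idea could look like the following (with a timeout so it cannot loop forever; the device name is a placeholder you would replace with whatever the device loop above prints):

using System.Collections;
using UnityEngine;
using UnityEngine.UI;

public class WaitForNamedCamera : MonoBehaviour
{
    // Placeholder name: replace with the name printed by the device loop above.
    [SerializeField] private string targetName = "RICOH THETA Z1";
    [SerializeField] private RawImage targetImage;
    [SerializeField] private float timeoutSeconds = 30f;

    // Start can run as a coroutine, so we can poll without blocking the main thread.
    private IEnumerator Start()
    {
        float elapsed = 0f;

        while (elapsed < timeoutSeconds)
        {
            foreach (WebCamDevice device in WebCamTexture.devices)
            {
                if (device.name == targetName)
                {
                    WebCamTexture cam = new WebCamTexture(device.name);
                    cam.Play();
                    targetImage.texture = cam;
                    yield break;
                }
            }

            elapsed += 1f;
            yield return new WaitForSeconds(1f);
        }

        Debug.Log("Camera not found within timeout.");
    }
}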
1 Like

Regarding the 2K or 4K resolution, I’m uncertain. How can I verify if the camera is set to 2K or 4K?

All I did was adjust the capture resolution in OBS, but I believe this doesn’t affect the camera settings.

As for the Mac, I’m unsure. While Unity allows you to compile apps for almost every platform, I’m uncertain if you can run the Unity Editor on a Mac. Furthermore, I lack a Mac to conduct any tests :face_with_diagonal_mouth: (I’m a Google Fan :sweat_smile:).

This document has some information. It is not confidential as it is 2.5 years old.

private_beta_test_guide_2021_12_16.pdf - Google Drive

Note that although the document shows Unity running directly with a USB cable, something has changed in either Windows or Unity. When I tested Unity last year, I couldn’t get the stream to work unless I used the OBS virtual cam.

I seem to recall that the color format might impact Unity. Back when the driver was first written, I got some information from the developer of the driver regarding Unity.

With YUY2 color format output, you can use the driver with Unity and JavaScript Media API.

In addition to NV12 (native color format of the decoder), YUY2 color format is supported for Unity or JavaScript Media API.

A single driver provides all available image sizes and color formats. For compatibility with the non-standard Unity webcam interface, the default output is set to 3840x1920/YUY2. The default output is not the driver’s preferred format.
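
As a side note on verifying whether Unity is actually getting 2K or 4K from the driver: one rough check (a sketch, untested here with the Z1) is to log the WebCamTexture dimensions once frames start arriving, since the texture only reports its real size after the first frame:

using UnityEngine;

public class LogWebCamResolution : MonoBehaviour
{
    private WebCamTexture cam;

    private void Start()
    {
        // Uses the first device for simplicity; pick the THETA by name if you have several cameras.
        cam = new WebCamTexture(WebCamTexture.devices[0].name);
        cam.Play();
    }

    private void Update()
    {
        // WebCamTexture reports a small placeholder size (often 16x16) until the first real frame.
        if (cam != null && cam.width > 16)
        {
            Debug.Log($"Receiving {cam.width}x{cam.height}"); // e.g. 3840x1920 (4K) or 1920x960 (2K)
            enabled = false; // log once, then stop updating
        }
    }
}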

2 Likes

I have some great news!

Following @craig post:

I recently conducted tests using the older Unity version (2020.3.5f1) to determine whether the issue lay with Unity or Windows 11.

It turns out the problem stemmed from Unity itself. Upon further investigation, I stumbled upon this bug report: CRASH ON BASEVIDEOTEXTURE::INITVIDEOMEMORY WHEN CAPTURING THETA V’S OUTPUT TEXTURE

The good news? Unity has already developed a solution for it.

I proceeded to test it with the beta version (as I was unaware of other options at the time! :sweat_smile: ):
Unity 6000.0.0 Beta 11


Now we no longer require OBS; we can use the UVC driver directly within Unity.

Here’s an update on my latency testing:

The screenshot indicates a notable enhancement, with a reduction in latency of 163 ms, equating to a 38% improvement.

I came across an interesting observation when comparing the 2020 version with the Beta release: the time Unity takes to display the camera feed. I’ve termed this the “Average Unity Display Time.” Since I don’t have a direct means to measure this time, I estimated it by subtracting the average Z1-to-OBS latency from the average latency of the current method.

My assumption was that OBS instantly displays the feed, so the measured time was the duration it took for the Z1 to capture the image (265 ms).

With this in mind, here are my findings:

  • In the 2020 version, Unity introduced 48 ms of latency.
  • In the Beta release, we get -3 ms, which is within measurement noise, suggesting that Unity didn’t introduce any additional latency compared to OBS.

P.S. @jcasman, as I was browsing through Unity’s Forum, I stumbled upon one of your posts (Solution: Connecting RICOH THETA Z1 with Unity). I also left a comment there, sharing this update.

2 Likes

Wow, that’s fantastic. Nice digging and finding the error, the bug report, and the Unity solution! :tada: :medal_military:

I see your post in the Unity forum, too. That’s a great way to let Unity developers know about the solution.

Really impressive!

1 Like

I’m planning to delve deeper into the issue of resolution next, as it seems that the capturing time is the primary contributor to latency (around 260 ms).

I’m curious if there’s a method to set the output resolution of the Z1 to either 1920 or 3840 directly. While it was suggested that this could be achieved through OBS, I’m exploring alternative approaches since I’m no longer utilizing OBS.

Any insights or suggestions on this matter?

2 Likes

@Manuel_Couto I believe that you cannot set the output resolution of the Z1 for live streaming through the API.

However, my experience here is limited. @biviel, @craig or others may have some more insight.

From the RICOH API documentation:

1 Like

This is not that relevant since the example is with Linux.

The camera supports 2K streaming. It’s defined in the driver.

This thread provides some hints:

https://forum.unity.com/threads/is-there-any-possible-to-change-camera-quality-used-by-webcamtexture.461106/

mCamera = new WebCamTexture(1920, 1080, 30);

I have not tried it yet.

Maybe it would work?

It does seem that the resolutions are for the texture.
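
For the Z1, a hedged sketch of that idea would be something like the snippet below. The device name is an assumption (use whatever the device-listing code earlier in this thread prints), and the width/height are only a request; the driver may still hand back its 3840x1920 default, so check the texture size once frames arrive.

// Request 1920x960 at 30 fps from the THETA by name (the name here is an assumption --
// use the exact string printed by WebCamTexture.devices on your machine).
// The numbers are a request, not a guarantee: check mCamera.width / mCamera.height
// after frames start arriving to see what the driver actually delivered.
WebCamTexture mCamera = new WebCamTexture("RICOH THETA Z1", 1920, 960, 30);
mCamera.Play();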

1 Like

Is there any way you could share your Unity file containing the scripts for putting the output of the Z1 directly onto the sphere in Unity without OBS? I’m assuming you still use the RTSP plug-in to connect the Z1 wirelessly, but I don’t know how you would connect it directly to Unity without OBS.

I just ran a test with Unity and a THETA X without using OBS. This should also work with the Z1.
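
Not the project file itself, but a minimal sketch of the kind of script involved. It assumes the camera shows up as a regular webcam device whose name contains "THETA", and that the sphere’s material is visible from the inside (inverted normals or a front-culling shader):

using UnityEngine;

// Attach to a sphere whose inside faces the viewer (inverted normals or a Cull Front shader).
public class ThetaSphereFeed : MonoBehaviour
{
    [SerializeField] private string deviceKeyword = "THETA"; // part of the UVC device name

    private WebCamTexture thetaCam;

    private void Start()
    {
        foreach (WebCamDevice device in WebCamTexture.devices)
        {
            if (device.name.Contains(deviceKeyword))
            {
                thetaCam = new WebCamTexture(device.name, 3840, 1920, 30);
                thetaCam.Play();
                GetComponent<Renderer>().material.mainTexture = thetaCam;
                return;
            }
        }

        Debug.Log("No THETA found in WebCamTexture.devices");
    }

    private void OnDestroy()
    {
        if (thetaCam != null)
            thetaCam.Stop();
    }
}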

2 Likes

Greetings,

Because I’m aiming to achieve the lowest possible latency, I have opted out of the RTSP approach. Instead, I am now using the Z1 connected to the server PC and utilizing WebRTC to livestream to Unity.

The objective is to utilize a small PC, such as a Raspberry Pi or Jetson, to act as a server to capture the stream from the camera and transmit it via WebRTC.

As for RTSP, Unity does not directly support it; you would need a plugin or OBS for that. There is also another method for wireless transmission; however, this approach will limit you to approximately 10 FPS. (I can’t find the original post for this :face_with_diagonal_mouth:)

It’s possible to use motion JPEG directly to the headset over WiFi if you use browser technology.

1 Like

Since I’ve been talking to ChatGPT, I asked it if I can use RTSP inside of Unity.

I got this response from ChatGPT, but did not try it.


Answer below from ChatGPT. Human did not test

Viewing an RTSP stream within Unity requires a bit more effort since Unity doesn’t natively support RTSP streams. You typically need to use a third-party plugin or library that can handle the RTSP protocol and video decoding, then display the video within your Unity scene. One of the more common approaches is to use the VLC Media Player library, which is capable of playing a wide variety of video formats, including RTSP streams. The LibVLCSharp wrapper allows you to use VLC functionalities within Unity projects.

Here’s a high-level overview of how you can achieve RTSP streaming within Unity using LibVLCSharp:

Step 1: Install LibVLCSharp in Unity

  1. Add the LibVLCSharp Unity Package: You will likely need to import the LibVLCSharp package for Unity. This can usually be done via the NuGet package manager or by manually adding the necessary DLLs to your Unity project. Check the LibVLCSharp GitHub repository for the latest instructions on how to do this in Unity.

Step 2: Set Up the Video Player

  1. Create a Script to Play the RTSP Stream: You will write a script that uses LibVLCSharp to play the RTSP stream. This script will be responsible for initializing VLC, setting up the media player, and playing the stream.

Example Script

Here is a simplified example of what this script might look like:


using System;
using UnityEngine;
using LibVLCSharp.Shared;

public class RTSPStreamPlayer : MonoBehaviour
{
    [SerializeField]
    private string rtspUrl = "your_rtsp_stream_url_here";

    private LibVLC _libVLC;
    private MediaPlayer _mediaPlayer;

    void Start()
    {
        // Initialize LibVLC
        Core.Initialize();
        _libVLC = new LibVLC();

        // Create a new MediaPlayer
        _mediaPlayer = new MediaPlayer(_libVLC);

        // Create a new media for the RTSP URL
        using (var media = new Media(_libVLC, rtspUrl, FromType.FromLocation))
        {
            _mediaPlayer.Play(media);
        }
    }

    void OnDestroy()
    {
        _mediaPlayer.Stop();
        _mediaPlayer.Dispose();
        _libVLC.Dispose();
    }
}

Step 3: Displaying the Stream

To display the video in your scene, you can use a VideoView object, which is part of LibVLCSharp. Attach the VideoView component to a GameObject in your scene, and reference it in your script to render the video output. You may need to adapt and extend the script to properly integrate with your Unity project, handling aspect ratios, and ensuring performance is optimized.

Additional Considerations

  • Performance: Streaming video, especially over RTSP, can be resource-intensive. Test performance and adjust your stream’s resolution and bitrate as needed.
  • Compatibility: Ensure that your target platforms (Windows, macOS, Android, iOS, etc.) support the version of LibVLCSharp you are using.
  • Security: If your RTSP stream is over the internet, consider the security implications and ensure your stream is properly secured.

This approach requires integrating external libraries and possibly dealing with platform-specific quirks, so be prepared for a bit of trial and error to get everything working smoothly.

1 Like

That’s the one I was looking for (post).

I tried it out, but I was unable to get more than 10 fps :frowning:

1 Like

I believe that your solution with a server is much better. The motion JPEG from the live preview is going to have a lower frame rate and lower resolution.

How do you get Unity to consume the WebRTC stream? Are you using a separate server for signaling or STUN/TURN?

There were a number of projects in the past using NTT SkyWay.

Many years ago, RICOH had a web conferencing service that used WebRTC.

Currently, I’m utilizing the Bidirectional sample from Unity Render Streaming.

While I could provide guidance on setting it up, Unity does an excellent job of that already. However, if anyone requires assistance, I’m available to offer support.

Looking ahead, my goal is to enhance the sample. As I’m focusing on teleoperation, one of the WebRTC peers will be a robot, necessitating a headless setup without human intervention. This setup could also prove useful when traveling, since you can simply connect your THETA to a Raspberry Pi and let it handle the rest. Additionally, I aim to enable streaming from multiple cameras.

For everyone else just trying to play with WebRTC and Unity, the sample is a great tool.

(There’s one line in the code that needs to be added so you can retrieve the texture and output it to a sphere or skybox texture. However, since I don’t have access to my work PC at the moment, I can’t say for sure which line it is, but I will update this topic on Monday.)
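
The general idea is roughly this (a sketch, not the exact line from my project): once the Render Streaming receiver hands you the decoded video Texture, assign it to the sphere’s material or to a panoramic skybox material.

using UnityEngine;

public class ReceivedFeedToSphere : MonoBehaviour
{
    [SerializeField] private Renderer sphereRenderer;   // inside-out sphere
    [SerializeField] private Material panoramicSkybox;  // optional: a Skybox/Panoramic material

    // Call this from wherever the Render Streaming receiver exposes the decoded
    // video Texture (in the Bidirectional sample it arrives via the video
    // receiver's texture-update callback).
    public void OnReceivedVideoTexture(Texture receivedTexture)
    {
        sphereRenderer.material.mainTexture = receivedTexture;

        // Alternative: drive a panoramic skybox instead of a sphere.
        // panoramicSkybox.SetTexture("_MainTex", receivedTexture);
        // RenderSettings.skybox = panoramicSkybox;
    }
}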

Regarding the basic workflow and PC requirements for this:

  • Unity provides a web app server for signaling, which runs on one of the peers. You use the browser to select your camera, resolution, microphone, encoding method, and to see the other person’s live feedback. The second person is connected via the Unity sample.
  • The PC requirements for this setup aren’t too demanding. As I mentioned, I’m pretty sure a simple Raspberry Pi can handle the browser side (not tested yet), while a regular PC can run the Unity app. The advantage of having a better PC is for encoding and decoding, as it can help reduce latency.
1 Like

Would you be willing to make a step-by-step guide for this setup? I’m also aiming for the lowest possible latency and have no experience with WebRTC, so I would greatly appreciate it.

Sure, I can put something together, although I will only be able to do it next week.