Connecting RICOH THETA Z1 with Unity

For now I’m using my laptop’s hotspot, which is only a few feet away from the Z1 at most. My laptop’s hotspot is the only source of WiFi I can use while at my college (Rowan University). The reason I can’t use my college’s WiFi is that I can’t connect the Z1 to it through client mode, likely due to security restrictions on a college network. If you have any ideas on how I could connect the Z1 to my college’s WiFi, it’d be appreciated.

Yes, same room. In the past, about three firmware upgrades back, before RICOH optimized WiFi usage because of overheating, this latency was lower for me.

Now I’m back to test WebRTC again and to try to play the full 4K stream in the browser at the best quality possible. Latency is about 300 ms.

Here you can see my desktop. One browser window is running a counter, and in another window in front of it, the live 4K preview of my Z1 is streaming in via WebRTC:

This is not supported yet in my plugin because of the use cases I was looking to cover in the past and the lack of H.265 encoding when using WebRTC.

When you use OBS on the same laptop that you’re using as the hotspot, what is the CPU load shown in OBS? Is it something reasonable, like under 10%?

I didn’t measure the latency between the Z1 and Unity when using the RTSP plugin, but 2 seconds seems high. Using just the RTSP plugin at 4K, I’m getting under 500 ms inside the same room, without going to Unity, just Z1 to computer.

I believe @Manuel_Couto has it working from the RTSP plugin to OBS and then to Unity, and seems to have it under 2 seconds of latency, right?

Hello again,

I promised an update yesterday about the wireless connection, but I’m still working on it.

For now, using WiFi on a 2.4GHz connection, I measured approximately 1.01 seconds of latency at 1920x960 resolution and about 1.42 seconds at 3840x1920. (This is from the Z1 to Unity.)

→ 1920x960: 18.60 - 17.59 = 1.01s

→ 3840x1920: 17.72 - 16.30 = 1.42s

Since I intend to use this for VR teleoperation and require very low latency, I have been studying other protocols and implementing a 5GHz connection.

Today, I tried to implement 5GHz by connecting the Z1 with an Ethernet cable to the router and then using 5GHz WiFi to connect to the PC, but I’m still encountering some issues.

I will keep you informed when I get it working.

PS: @craig, regarding the problem I was having with transitioning from the 2.4GHz to 5GHz band connection, it turned out to be an issue with the router. Today, I tried it out with another router, and it worked fine with 5GHz.

Also, please note that the tests above were done yesterday with the old router that was having problems, so they might not be the most trustworthy source of information. As soon as I get new results, I will post them.

You used the RTSP plugin, right? H.264 encoding, 2K and 4K, and I assume equirectangular projection.

I’m wondering, is equirectangular needed at all? With my plugin I’m able to stream double fisheye directly, and it generates less heat. Using my plugin with fisheye at 24 FPS, it would work for hours in a 26-28°C environment, using SRT. Regarding latency, H.265 is important because it requires less bandwidth, which also helps a bit with latency. Honestly, I don’t know what that RTSP plugin is capable of.

Does he have access to your plugin?

I believe he’s using the stream in a headset. He’ll need to get the frames onto a sphere at some point.

There is a long thread about using Unity to stitch the stream. However, the stitch is not as good as the stitch from the camera.

@biviel do you know the latency improvement if the camera doesn’t have to stitch to equirectangular?

I’m new to the world of cameras and protocol communication, so I’m still learning as I go.

Currently, I’m using the THETA RTSP Streaming plugin. I’ve tested it with both 2K and 4K resolutions. I’m not certain which encoding it’s using, nor am I sure how to check or change it.

As for the projection type, based on @craig’s comment, I believe I’m using the standard equirectangular projection.

Yes, I capture the live feed with OBS, convert it into a Virtual Camera, and then project it onto a Unity Sphere, which I can view with the HTC Vive headset.

When using OBS on the same laptop as the hotspot, the CPU load shown in OBS varies from less than 1% to around 6%, so yes, it’s reasonable. At the moment we’re having another issue with using the camera in Unity. We can only enter play mode successfully when the RICOH THETA Z1 is wired to the laptop; when we try to use the wireless connection, or enter play mode without the camera connected at all, Unity crashes. We (@Caleb_Amadoro and I) believe this comes from the code we’re currently using to attach the RICOH THETA live video feed to a material on the surface of a sphere. Could you let us know what code you’re using for the sphere material, or any other code that could potentially fix this issue?

@craig @jcasman @biviel I’m back with some updates!

Over the past few weeks, I’ve been focused on live streaming from my Z1 to a Unity app while aiming for minimum latency. Currently, I’ve achieved an average latency of 544 ms.

Here’s the setup:

  • A computer (PC1) connected via cable to the Z1. On this PC, OBS converts the video feed into a virtual camera. I then access the virtual camera through a Unity app that uses WebRTC to transmit to a second PC (PC2); see the sketch after this list.
  • PC2 runs the client-side version of the ChatApp.
  • Both PCs are connected via Ethernet to 5GHz routers (PC1 is connected to R1, and PC2 is connected to R2).
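
For reference, here’s a minimal sketch of what the sending side of that Unity app can look like, assuming Unity’s com.unity.webrtc package; the device index, the resolution, and the signaling step (exchanging the offer/answer and ICE candidates with PC2) are assumptions or omitted:

using System.Collections;
using Unity.WebRTC;
using UnityEngine;

public class WebRtcSender : MonoBehaviour
{
    private WebCamTexture camTexture;
    private RTCPeerConnection pc;
    private VideoStreamTrack videoTrack;

    IEnumerator Start()
    {
        // Capture the OBS virtual camera like any other webcam device.
        // Device index 0 is an assumption; enumerate WebCamTexture.devices.
        camTexture = new WebCamTexture(WebCamTexture.devices[0].name, 3840, 1920, 30);
        camTexture.Play();

        // width/height report a 16x16 placeholder until the first frame arrives.
        yield return new WaitUntil(() => camTexture.width > 16);

        // Wrap the texture in a WebRTC video track on a peer connection.
        // Signaling (offer/answer and ICE exchange with PC2) is not shown.
        videoTrack = new VideoStreamTrack(camTexture);
        pc = new RTCPeerConnection();
        pc.AddTrack(videoTrack);

        // com.unity.webrtc needs this coroutine running to push frames.
        StartCoroutine(WebRTC.Update());
    }
}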

For testing purposes, I compare the stopwatch time at four different points:

  • Stopwatch: Real-time value
  • OBS: Capturing latency
  • PC1: Latency between OBS and Unity communication
  • PC2: WebRTC latency

Below is an image of the setup used:

Additionally, here’s a screenshot with some test results and average values:

Upon analyzing the Excel sheet, it’s evident that the major latency occurs during the capturing process (Z1 to OBS and OBS to Unity). Surprisingly, the transmission latency is only 60 ms.

Moving forward, I plan to focus on reducing the capturing latency as much as possible.

I’d love to hear your thoughts on this!

This is great information.

Is there any difference if you stream 2K instead of 4K? I’m wondering because the stitching time for each frame consumes a significant portion of the time needed to get output from the camera over USB.

Can a Mac run Unity? You may be able to avoid the OBS virtual camera if you use a Mac for a test. I have not tried it.

Hi @HunterGeitz,

This is what I have done in my code: in Start() you check whether any webcams are connected; if not, you end the program.

using UnityEngine;
using UnityEngine.UI;

public class CamFeed : MonoBehaviour
{
    // RawImages that display the feeds; assigned in the Inspector.
    public RawImage pcImage;
    public RawImage phoneImage;

    private WebCamTexture pcCam;
    private WebCamTexture phoneCam;
    private Texture defaultBackground;
    private bool pcCamAvailable;
    private bool phoneCamAvailable;

    // Gets the list of devices and prints them to the console.
    private void Start()
    {
        WebCamDevice[] devices = WebCamTexture.devices;
        for (int i = 0; i < devices.Length; i++)
            Debug.Log(devices[i].name);

        defaultBackground = pcImage.texture;

        if (devices.Length == 0)
        {
            Debug.Log("No devices connected!");

            pcCamAvailable = false;
            phoneCamAvailable = false;

            return;
        }

        pcCam = new WebCamTexture(devices[0].name, Screen.width, Screen.height);
        pcCam.Play();
        pcImage.texture = pcCam;

        pcCamAvailable = true;

        if (devices.Length > 1)
        {
            phoneCam = new WebCamTexture(devices[1].name, Screen.width, Screen.height);
            phoneCam.Play();
            phoneImage.texture = phoneCam;

            phoneCamAvailable = true;
        }
    }
}

PS: If you need to specify the camera, you can compare the camera name to a string. If it doesn’t match, you end the program, or you can wait for a connection (be careful with infinite loops…).

if (devices[i].name != "My webcam name")
{
    camAvailable = false;
    return;
}
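
As for putting the feed on the sphere itself, here’s a minimal sketch of the idea, assuming the feed is equirectangular and the sphere has inward-facing normals (or a shader that renders back faces); the device name is a placeholder, check the console listing from Start() above for the real one:

using UnityEngine;

// Minimal sketch: attach a webcam feed (e.g. the OBS virtual camera)
// to the material of a sphere that the viewer sits inside.
public class SphereFeed : MonoBehaviour
{
    // Placeholder name; check WebCamTexture.devices for the real one.
    public string deviceName = "OBS Virtual Camera";

    private WebCamTexture camTexture;

    void Start()
    {
        camTexture = new WebCamTexture(deviceName, 3840, 1920, 30);
        camTexture.Play();

        // The equirectangular frame wraps around the inside of the sphere.
        GetComponent<Renderer>().material.mainTexture = camTexture;
    }
}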

Regarding the 2K or 4K resolution, I’m uncertain. How can I verify if the camera is set to 2K or 4K?

All I did was adjust the capture resolution in OBS, but I believe this doesn’t affect the camera settings.

As for the Mac, I’m unsure. While Unity allows you to compile apps for almost every platform, I’m uncertain if you can run the Unity Editor on a Mac. Furthermore, I lack a Mac to conduct any tests :face_with_diagonal_mouth: (I’m a Google Fan :sweat_smile:).

This document has some information. It is not confidential as it is 2.5 years old.

private_beta_test_guide_2021_12_16.pdf - Google Drive

Note that although the document shows Unity running directly with a USB cable, something has changed in either Windows or Unity. When I tested Unity last year, I couldn’t get the stream to work unless I used the OBS virtual cam.

I seem to recall that the color format might impact Unity. Back when the driver was first written, I got some information from the developer of the driver regarding Unity.

With YUY2 color format output, you can use the driver with Unity and JavaScript Media API.

In addition to NV12 (native color format of the decoder), YUY2 color format is supported for Unity or JavaScript Media API.

A single driver provides all available image sizes/color formats. For compatibility with the non-standard Unity webcam interface, the default output is set to 3840x1920/YUY2. The default output is not the driver’s preferred format.

I have some great news!

Following @craig’s post:

I recently conducted tests using the older Unity version (2020.3.5f1) to determine whether the issue lay with Unity or Windows 11.

It turns out the problem stemmed from Unity itself. Upon further investigation, I stumbled upon this bug report: “Crash on BaseVideoTexture::InitVideoMemory when capturing THETA V’s output texture”

The good news? Unity has already developed a solution for it.

I proceeded to test it with the beta version (as I was unaware of other options at the time! :sweat_smile: ):
Unity 6000.0.0 Beta 11


Now we no longer require OBS; we can directly utilize the UVC driver within Unity.
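
For anyone reproducing this, a sketch of what the OBS-free device selection can look like (matching on the substring "THETA" is an assumption; check the console listing for the exact name the UVC driver registers):

using System.Linq;
using UnityEngine;

public class DirectUvcFeed : MonoBehaviour
{
    private WebCamTexture thetaCam;

    void Start()
    {
        // Pick the THETA's UVC device directly instead of a virtual camera.
        var device = WebCamTexture.devices
            .FirstOrDefault(d => d.name.Contains("THETA"));

        // WebCamDevice is a struct, so a failed match leaves name empty.
        if (string.IsNullOrEmpty(device.name))
        {
            Debug.LogWarning("No THETA device found.");
            return;
        }

        thetaCam = new WebCamTexture(device.name);
        thetaCam.Play();
        GetComponent<Renderer>().material.mainTexture = thetaCam;
    }
}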

Here’s an update on my latency testing:

The screenshot indicates a notable enhancement, with a reduction in latency of 163 ms, equating to a 38% improvement.

I came across an interesting observation when comparing the 2020 version with the Beta release: the time Unity takes to display the camera feed. I’ve termed this the “Average Unity Display Time.” Since I don’t have a direct means to measure this time, I estimated it by subtracting the average latency of the current method from the average latency I obtained from the Z1 to OBS.

My assumption was that OBS instantly displays the feed, so the measured time was the duration it took for the Z1 to capture the image (265 ms).

With this in mind, here are my findings:

  • In the 2020 version, Unity introduced 48 ms of latency.
  • In the Beta release, we get -3 ms, which can be put down to measurement noise, suggesting that Unity didn’t introduce any additional latency compared to OBS.

P.S. @jcasman, as I was browsing through Unity’s Forum, I stumbled upon one of your posts (Solution: Connecting RICOH THETA Z1 with Unity). I also left a comment there, sharing this update.

Wow, that’s fantastic. Nice digging and finding the error, the bug report, and the Unity solution! :tada: :medal_military:

I see your post in the Unity forum, too. That’s great to let Unity developers know about the solution.

Really impressive!

I’m planning to delve deeper into the issue of resolution next, as it seems that the capturing time is the primary contributor to latency (around 260 ms).

I’m curious if there’s a method to set the output resolution of the Z1 to either 1920 or 3840 directly. While it was suggested that this could be achieved through OBS, I’m exploring alternative approaches since I’m no longer utilizing OBS.

Any insights or suggestions on this matter?

@Manuel_Couto I believe that you cannot set the output resolution of the Z1 for live streaming through the API.

However, my experience here is limited. @biviel, @craig or others may have some more insight.

From the RICOH API documentation:

This is not that relevant since the example is with Linux.

The camera supports 2K streaming. It’s defined in the driver.

This provides some hints:

https://forum.unity.com/threads/is-there-any-possible-to-change-camera-quality-used-by-webcamtexture.461106/

mCamera = new WebCamTexture(1920, 1080, 30);

I have not tried it yet.

Maybe it would work?

It does seem that the resolutions are for the texture.
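
One way to check what the driver actually delivers (a sketch; the requested size is only a hint to the driver, so read the real values back once the first frame arrives):

using System.Collections;
using UnityEngine;

public class ResolutionProbe : MonoBehaviour
{
    private WebCamTexture cam;

    IEnumerator Start()
    {
        // Request 2K; the driver may ignore this and use its default.
        cam = new WebCamTexture(1920, 960, 30);
        cam.Play();

        // width/height report a 16x16 placeholder until the first frame
        // arrives, so wait before reading them.
        yield return new WaitUntil(() => cam.width > 16);
        Debug.Log($"Delivered resolution: {cam.width}x{cam.height}");
    }
}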

Is there any way you could share your Unity file containing the scripts for putting the output of the Z1 directly onto the sphere in Unity without OBS? I’m assuming you still use the RTSP plugin to connect the Z1 wirelessly, but I don’t know how you would connect it directly to Unity without OBS.