Connecting RICOH THETA Z1 with Unity

@Manuel_Couto I believe that you cannot set the output resolution of the Z1 for live streaming through the API.

However, my experience here is limited. @biviel, @craig or others may have some more insight.

From the RICOH API documentation:


This is not that relevant since the example is for Linux.

The camera supports 2K streaming. It’s defined in the driver.

This thread provides some hints:

https://forum.unity.com/threads/is-there-any-possible-to-change-camera-quality-used-by-webcamtexture.461106/

mCamera = new WebCamTexture(1920, 1080, 30);

I have not tried it yet.

Maybe it would work?

It does seem that those resolution parameters apply to the requested texture.
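For what it’s worth, a minimal sketch of that idea might look like the script below (untested; it assumes the Z1 shows up as a standard webcam device and that the driver honors the requested mode):

using UnityEngine;

public class ThetaWebCamStream : MonoBehaviour
{
    private WebCamTexture mCamera;

    void Start()
    {
        // Request 1920x1080 at 30 fps. Unity treats these values as hints;
        // the driver may fall back to another supported mode.
        mCamera = new WebCamTexture(1920, 1080, 30);

        // Render the feed on this GameObject's material,
        // e.g. an inward-facing sphere for 360 viewing.
        GetComponent<Renderer>().material.mainTexture = mCamera;
        mCamera.Play();
    }

    void OnDestroy()
    {
        if (mCamera != null && mCamera.isPlaying)
        {
            mCamera.Stop();
        }
    }
}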


Is there any way you could share your Unity file containing the scripts for putting the output of the Z1 directly onto the sphere in Unity without OBS? I’m assuming you still use the RTSP plugin to connect the Z1 wirelessly, but I don’t know how you would connect it directly to Unity without OBS.

I just ran a test with Unity and a THETA X without using OBS. This should also work with the Z1.


Greetings,

Because I’m aiming to achieve the lowest possible latency, I have opted out of the RTSP approach. Instead, I am now using the Z1 connected to the server PC and utilizing WebRTC to livestream to Unity.

The objective is to utilize a small PC, such as a Raspberry Pi or Jetson, to act as a server to capture the stream from the camera and transmit it via WebRTC.

As for RTSP, Unity does not directly support it; you would need a plugin or OBS for that. There is also another method for wireless transmission; however, this approach will limit you to approximately 10 FPS. (I can’t find the original post for this :face_with_diagonal_mouth:)

It’s possible to send motion JPEG directly to the headset over Wi-Fi if you use browser technology.
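For reference, that live preview comes from the THETA Web API’s camera.getLivePreview command, which returns a motion JPEG stream over the camera’s Wi-Fi access point. A rough sketch of opening that stream from C# (untested; 192.168.1.1 is the camera’s default address in access-point mode):

using System.IO;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

public static class ThetaLivePreview
{
    // Default endpoint when connected to the camera's own access point.
    private const string Endpoint = "http://192.168.1.1/osc/commands/execute";

    public static async Task<Stream> OpenAsync()
    {
        var client = new HttpClient();
        var body = new StringContent(
            "{\"name\": \"camera.getLivePreview\"}",
            Encoding.UTF8, "application/json");

        var request = new HttpRequestMessage(HttpMethod.Post, Endpoint)
        {
            Content = body
        };

        // ResponseHeadersRead stops HttpClient from trying to buffer the
        // endless multipart stream before returning.
        var response = await client.SendAsync(
            request, HttpCompletionOption.ResponseHeadersRead);
        response.EnsureSuccessStatusCode();

        // The body is multipart/x-mixed-replace: a sequence of JPEG frames,
        // each delimited by the 0xFFD8 start and 0xFFD9 end markers.
        return await response.Content.ReadAsStreamAsync();
    }
}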


Since I’ve been talking to ChatGPT, I asked it whether RTSP can be used inside of Unity.

I got this response from ChatGPT, but did not try it.


Answer below is from ChatGPT. Human did not test it.

Viewing an RTSP stream within Unity requires a bit more effort since Unity doesn’t natively support RTSP streams. You typically need to use a third-party plugin or library that can handle the RTSP protocol and video decoding, then display the video within your Unity scene. One of the more common approaches is to use the VLC Media Player library, which is capable of playing a wide variety of video formats, including RTSP streams. The LibVLCSharp wrapper allows you to use VLC functionalities within Unity projects.

Here’s a high-level overview of how you can achieve RTSP streaming within Unity using LibVLCSharp:

Step 1: Install LibVLCSharp in Unity

  1. Add the LibVLCSharp Unity Package: You will likely need to import the LibVLCSharp package for Unity. This can usually be done via the NuGet package manager or by manually adding the necessary DLLs to your Unity project. Check the LibVLCSharp GitHub repository for the latest instructions on how to do this in Unity.

Step 2: Set Up the Video Player

  1. Create a Script to Play the RTSP Stream: You will write a script that uses LibVLCSharp to play the RTSP stream. This script will be responsible for initializing VLC, setting up the media player, and playing the stream.

Example Script

Here is a simplified example of what this script might look like:


using System;
using UnityEngine;
using LibVLCSharp.Shared;

public class RTSPStreamPlayer : MonoBehaviour
{
    [SerializeField]
    private string rtspUrl = "your_rtsp_stream_url_here";

    private LibVLC _libVLC;
    private MediaPlayer _mediaPlayer;

    void Start()
    {
        // Initialize LibVLC
        Core.Initialize();
        _libVLC = new LibVLC();

        // Create a new MediaPlayer
        _mediaPlayer = new MediaPlayer(_libVLC);

        // Create a new media for the RTSP URL
        using (var media = new Media(_libVLC, rtspUrl, FromType.FromLocation))
        {
            _mediaPlayer.Play(media);
        }
    }

    void OnDestroy()
    {
        _mediaPlayer.Stop();
        _mediaPlayer.Dispose();
        _libVLC.Dispose();
    }
}

Step 3: Displaying the Stream

To display the video in your scene, you can use a VideoView object, which is part of LibVLCSharp. Attach the VideoView component to a GameObject in your scene, and reference it in your script to render the video output. You may need to adapt and extend the script to properly integrate with your Unity project, handling aspect ratios, and ensuring performance is optimized.

Additional Considerations

  • Performance: Streaming video, especially over RTSP, can be resource-intensive. Test performance and adjust your stream’s resolution and bitrate as needed.
  • Compatibility: Ensure that your target platforms (Windows, macOS, Android, iOS, etc.) support the version of LibVLCSharp you are using.
  • Security: If your RTSP stream is over the internet, consider the security implications and ensure your stream is properly secured.

This approach requires integrating external libraries and possibly dealing with platform-specific quirks, so be prepared for a bit of trial and error to get everything working smoothly.


That’s the post I was looking for.

I tried it out, but I was unable to get more than 10 fps :frowning:


I believe that your solution with a server is much better. The motion JPEG from live preview is going to have a lower frame rate and lower resolution.

How do you get Unity to consume the WebRTC stream? Are you using a separate server for signaling or STUN/TURN?

There were a number of projects in the past using NTT SkyWay.

Many years ago, RICOH had a web conferencing service that used WebRTC.

Currently, I’m utilizing the Bidirectional sample from Unity Render Streaming.

While I could provide guidance on setting it up, Unity does an excellent job of that already. However, if anyone requires assistance, I’m available to offer support.

Looking ahead, my goal is to enhance the sample. As I’m focusing on teleoperation, one of the WebRTC peers will be a robot, necessitating a headless setup without human intervention. This setup could also prove useful, for instance, when traveling: you can simply connect your THETA to a Raspberry Pi and let it handle the rest. Additionally, I aim to enable streaming from multiple cameras.

For everyone else just trying to play with WebRTC and Unity, the sample is a great tool.

(There’s one line in the code that needs to be added so you can retrieve the texture and output it to a sphere or skybox texture. However, since I don’t have access to my work PC at the moment, I can’t say for sure which line it is, but I will update this topic on Monday.)
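In the meantime, here is my best guess at what that addition looks like, based on the VideoStreamReceiver API from Unity Render Streaming (names may differ between package versions, so treat this as a sketch until I can confirm the exact line):

using UnityEngine;
using Unity.RenderStreaming;

public class SphereVideoOutput : MonoBehaviour
{
    // The receiver component from the Bidirectional sample scene.
    [SerializeField] private VideoStreamReceiver receiveVideoViewer;

    // Material on the inward-facing sphere (or assigned to the skybox).
    [SerializeField] private Material sphereMaterial;

    void Start()
    {
        // Route each decoded remote frame onto the sphere's material
        // instead of the sample's RawImage.
        receiveVideoViewer.OnUpdateReceiveTexture +=
            texture => sphereMaterial.mainTexture = texture;
    }
}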

Regarding the basic workflow and PC requirements for this:

  • Unity provides a web app server for signaling, which runs on one of the peers. You use the browser to select your camera, resolution, microphone, and encoding method, and to see the other peer’s live feed. The second peer connects via the Unity sample.
  • The PC requirements for this setup aren’t too demanding. As I mentioned, I’m pretty sure a simple Raspberry Pi can handle the browser side (not tested yet), while a regular PC can run the Unity app. The advantage of having a better PC is for encoding and decoding, as it can help reduce latency.

Would you be willing to make a step-by-step guide for this setup? I’m also aiming for the lowest possible latency and have no experience with WebRTC, so I would greatly appreciate it.

Sure, I can put something together, although I will only be able to do it next week.

That would be very helpful. Please reply when you do.