When I define camName = "RICOH THETA Z1 4K"; I get this error:
Couldn't find a supported mediatype for the selected Virtual Camera
When I define camName = "RICOH THETA Z1"; I get this error:
Could not connect pins - RenderStream()
Here is my code:
using UnityEngine;

public class Get360Camera : MonoBehaviour
{
    static WebCamTexture Camera;
    string camName = "RICOH THETA Z1 FullHD"; // Name of your camera.
    public Material camMaterial;              // Skybox material.

    void Start()
    {
        // Printing out all video sources for debugging purposes.
        WebCamDevice[] devices = WebCamTexture.devices;
        Debug.Log("Number of web cams connected: " + devices.Length);
        for (int i = 0; i < devices.Length; i++)
        {
            Debug.Log(i + " " + devices[i].name);
        }

        if (Camera == null)
            Camera = new WebCamTexture(camName, 3840, 1920); // Resolution you want.

        if (!Camera.isPlaying)
            Camera.Play();

        if (camMaterial != null)
            camMaterial.mainTexture = Camera;
    }
}
I’ve been attempting to connect my Z1 camera with Unity, but unfortunately, I’ve hit a snag.
I’m currently using Unity version 2022.3.17f1 along with the 3.0.0 UVC driver.
Interestingly, I’ve had no issues connecting the camera to OBS. However, whenever I attempt to do the same in Unity, the program crashes without displaying any error messages. I’ve tested my code with various other cameras, and it seems to function properly.
I’ll provide the code below for reference:
using UnityEngine;

public class getRicohStream : MonoBehaviour
{
    void Start()
    {
        // Printing out all video sources for debugging purposes.
        WebCamDevice[] devices = WebCamTexture.devices;
        Debug.Log("Number of web cams connected: " + devices.Length);
        for (int i = 0; i < devices.Length; i++)
        {
            Debug.Log(i + " " + devices[i].name);
        }

        Renderer rend = this.GetComponentInChildren<Renderer>();

        WebCamTexture mycam = new();
        string camName = devices[0].name; // Use the first device in the list.
        mycam.deviceName = camName;
        Debug.Log("The camera name is: " + camName);

        rend.material.mainTexture = mycam;
        mycam.Play();
    }
}
First and foremost, thank you for your prompt response!
As I was perusing the forum earlier today in search of a solution, I did indeed come across the topic you mentioned. Utilizing OBS as a virtual camera allowed me to integrate the Z1 with Unity.
However, having just completed my testing, I’m uncertain whether the lower image quality and darker appearance stem from settings within the OBS virtual camera or within Unity itself. Notably, the image quality I obtain directly from OBS is significantly better.
Nevertheless, I intend to continue exploring this solution further. In the meantime, I’d like to inquire whether anyone has discovered an alternative method to bypass the OBS virtual camera or has insights into why Unity crashes when directly connected.
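In case it helps anyone following the same route, here is a minimal sketch of how the OBS virtual camera can be picked up from Unity. The class name GetObsVirtualCam is my own, and matching on "OBS" is an assumption based on how the device is named on my machine; adjust the match string if yours registers differently.

using UnityEngine;

public class GetObsVirtualCam : MonoBehaviour
{
    public Material camMaterial; // Skybox material, as in my earlier script.

    void Start()
    {
        // Look for the OBS virtual camera among the available devices.
        foreach (WebCamDevice device in WebCamTexture.devices)
        {
            if (device.name.Contains("OBS"))
            {
                WebCamTexture cam = new WebCamTexture(device.name);
                cam.Play();
                if (camMaterial != null)
                    camMaterial.mainTexture = cam;
                return;
            }
        }
        Debug.LogWarning("OBS Virtual Camera not found - is it started in OBS?");
    }
}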
I’m not that familiar with Unity and don’t know if there’s a way to increase the brightness inside of Unity. I do not believe the brightness of the Z1 can be adjusted for the livestream within the camera settings.
@biviel has a plugin that can output RTSP. It may improve the Unity connection, as he's actively developing it and may be able to adjust the stream characteristics from the camera. You may need to use VideoLAN / vlc-unity · GitLab (free if you build from source) to get the RTSP stream into Unity. (untested)
A member in the community tested the THETA with C#. We have an unconfirmed theory that using AForge may work. I personally have not tried it. Also, to be honest, I don’t know Unity well enough to assess if the technology would help.
However, I discovered that the issue was with Unity. After some testing, I realized that I was using the wrong type of material, Lit, instead of Unlit. Additionally, I encountered a peculiar situation where I had to choose “Universal Render Pipeline/Unlit” rather than the standard Unlit option to render the inside of the sphere correctly, despite flipping the normals.
Here are some screenshots comparing the Before (Lit Material) with the After (Unlit Material):
I also want to share another observation. Initially, when I projected the 360º image onto the sphere, I noticed some distorted lines, especially around the plinth. I experimented with changing the shape of the sphere, making it more oval or even flattening some sides, but these adjustments didn’t yield the desired results. What proved to be more effective in reducing distortion was adjusting the size of the sphere. Ultimately, I settled on a diameter of 1f. While I haven’t yet determined a direct correlation between sphere size and lens type, through trial and error, I found that this size minimized distortion the most.
In the images I used to showcase the difference between Lit and Unlit materials, one can also observe the disparity in distortion. In the first image with the Lit material, the distortion in the lines is much clearer, whereas in the second image with the Unlit material, the distortion was significantly reduced.
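For completeness, this is roughly the flip-normals step I mentioned, as a minimal sketch: the mesh is modified once at startup, and the script is attached to the sphere.

using UnityEngine;

// Inverts a sphere's normals and triangle winding so that a texture
// applied to it is visible from the inside.
public class FlipNormals : MonoBehaviour
{
    void Start()
    {
        Mesh mesh = GetComponent<MeshFilter>().mesh;

        // Point every normal inward.
        Vector3[] normals = mesh.normals;
        for (int i = 0; i < normals.Length; i++)
            normals[i] = -normals[i];
        mesh.normals = normals;

        // Reverse the triangle winding so the inside faces are drawn.
        int[] triangles = mesh.triangles;
        for (int i = 0; i < triangles.Length; i += 3)
        {
            int temp = triangles[i];
            triangles[i] = triangles[i + 1];
            triangles[i + 1] = temp;
        }
        mesh.triangles = triangles;
    }
}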
You did this by connecting the Z1 to OBS via USB as a webcam, and then from OBS, which format (RTMP, WebRTC, etc.?), encoding, and resolution did you use to stream to the Quest 2? In the Quest, you were running a Unity app?
Wow! Thanks for sharing these tips. I’m sure these will help other people. As I mentioned, I’m not that experienced with Unity and didn’t know about the unlit versus lit material.
Additionally, I encountered a peculiar situation where I had to choose “Universal Render Pipeline/Unlit” rather than the standard Unlit option to render the inside of the sphere correctly, despite flipping the normals.
Are you using the skybox as suggested by RicoB?
We were originally using a sphere with flipped normals. However, most people started using the skybox technique to avoid artifacts from the sphere's vertices.
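As a rough sketch of the skybox technique, assuming the built-in Skybox/Panoramic shader and the Z1's FullHD device name from earlier in the thread (property names can differ in other render pipelines):

using UnityEngine;

// Feeds the camera stream into a panoramic skybox instead of a sphere.
public class WebcamSkybox : MonoBehaviour
{
    void Start()
    {
        WebCamTexture cam = new WebCamTexture("RICOH THETA Z1 FullHD");
        cam.Play();

        // Skybox/Panoramic samples an equirectangular texture directly,
        // so no sphere geometry (and no flipped normals) is needed.
        Material skybox = new Material(Shader.Find("Skybox/Panoramic"));
        skybox.SetTexture("_MainTex", cam);
        RenderSettings.skybox = skybox;
    }
}

The panoramic shader's latitude-longitude mapping should match the Z1's equirectangular output, which is why the sphere can be skipped entirely.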
This is really cool! Thanks for posting. I don't use Unity with the THETA much, but I did a VR demo at DeveloperWeek several years ago, and I remember us hitting what I believe was the exact same issue. Thanks for the information!
How I achieved live-feed visualization from the Theta Z1 in Unity using two different methods:
First Method - USB:
Installed the UVC Driver.
Connected the camera to the PC via USB.
Set up the camera source in OBS as Video Capture Device.
To pass the live feed to Unity, I utilized the virtual camera feature in OBS, as described in my initial post.
Finally, the image is projected onto a sphere with inverted normals (I used an Unlit texture for the appropriate coloration and brightness).
Second Method - RTSP Connection (implemented this afternoon):
Installed the Theta RTSP Streaming plug-in on the Z1.
Connected the Z1 in CL (client) mode (currently tested on 2.4 GHz).
Identified the camera’s IP.
In OBS, selected Media Source as the source and input the following URL: rtsp://user:pass@camIP:8554/live?resolution=1920x960.
(We can use different resolutions, as stated in: THETA RTSP Streaming)
To stream to Unity, I'm still using the virtual camera.
In terms of virtual reality, I'm using the HTC Vive Pro 2 kit in conjunction with SteamVR.
Currently, due to a considerable delay (I haven't measured the exact value yet), I'm attempting to use the 5 GHz connection, but I'm not succeeding because the camera is not able to connect to the router. Does anyone have any ideas regarding this?
I see. I wonder how much better it would look in HDR mode with H.265 encoding. 1920x960 seems a bit low for an HMD.
Are you familiar with VLC-Unity? I'm aiming to stream directly into headsets, without the need for a PC and OBS Studio, using the SRT stream from the plugin I developed. So in theory it could stream to Unity on the Meta Quest 2/3, and I assume other Unity-compatible devices should work as well. Grabbing the Z1's HDR preview in-camera makes a huge difference in image quality, as does adjusting brightness, etc. in the camera. Of course, the plugin also works wirelessly as a capture device for streaming to OBS.
So Unity is running on a high-end PC, and the HTC Vive Pro 2 is used as a display, connected with wires?
In the old demo from June 2018, we had the HTC Vive connected to a Windows PC with wires. Back then (5 or 6 years ago), we needed a gaming laptop with a discrete mid-range GPU (higher-end for laptops) for demos. In 2024, I think a normal laptop can handle things.
I have not tried VLC-Unity yet. It’s a library, so we would need some sample code to get the stream into Unity.
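That said, the vlc-unity repository does ship a minimal playback sample. Untested here, but a sketch along those lines, using the RTSP URL from earlier in the thread (user, pass, and camIP are placeholders), would look roughly like this; the exact LibVLCSharp namespace and constructors vary between versions, so treat the details as assumptions.

using System;
using UnityEngine;
using LibVLCSharp;

public class RtspPlayback : MonoBehaviour
{
    LibVLC _libVLC;
    MediaPlayer _mediaPlayer;
    Texture2D _texture;

    // URL format from earlier in the thread; credentials and IP are placeholders.
    const string Url = "rtsp://user:pass@camIP:8554/live?resolution=1920x960";

    void Awake()
    {
        Core.Initialize(Application.dataPath); // Load the native libvlc libraries.
        _libVLC = new LibVLC(enableDebugLogs: true);
        _mediaPlayer = new MediaPlayer(_libVLC);
        _mediaPlayer.Media = new Media(new Uri(Url));
        _mediaPlayer.Play();
    }

    void Update()
    {
        if (_texture == null)
        {
            // Wait until the stream reports its size, then wrap the decoded
            // frames in an external Unity texture.
            uint width = 0, height = 0;
            _mediaPlayer.Size(0, ref width, ref height);
            IntPtr texPtr = _mediaPlayer.GetTexture(width, height, out bool updated);
            if (width != 0 && height != 0 && updated && texPtr != IntPtr.Zero)
            {
                _texture = Texture2D.CreateExternalTexture((int)width, (int)height,
                    TextureFormat.RGBA32, false, true, texPtr);
                GetComponent<Renderer>().material.mainTexture = _texture;
            }
        }
        else
        {
            IntPtr texPtr = _mediaPlayer.GetTexture((uint)_texture.width,
                (uint)_texture.height, out bool updated);
            if (updated)
                _texture.UpdateExternalTexture(texPtr);
        }
    }

    void OnDestroy()
    {
        _mediaPlayer?.Stop();
        _mediaPlayer?.Dispose();
        _libVLC?.Dispose();
    }
}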
Personally, I'm using a laptop with 32 GB of RAM, a Ryzen 9 7945HX, and an RTX 4060, and I've had no problems.
In terms of the HTC Vive (I hope I'm not talking rubbish), from what I know, I always need an auxiliary computer to run the application. However, the PC-to-headset communication is wireless.