Now that I've been able to run more tests, I can confirm that several factors are involved. Latency depends on bandwidth/bitrate too: when the video bitrate is lower, latency is lower as well. During the tests we did earlier, bandwidth and quality were lower. When I stream at full 4K resolution with good quality, end-to-end latency is closer to 2 seconds.
When I measured this latency on a local WiFi 5 (5 GHz) network, I got 800-900 ms at 4K resolution with H.265 encoding, streaming to OBS Studio as a Media Source.
Is this when the Z1 and the computer running OBS Studio are in the same room? I wonder if the OBS virtual camera adds additional latency before the feed reaches Unity. When I did the test with Unity, I did not check for latency.
I don't necessarily have a maximum acceptable latency, but less than a second would be preferred.
For now I'm using my laptop's hotspot, which is only a few feet away from the Z1 at most. My laptop's hotspot is the only source of WiFi I can use while at my college (Rowan University). The reason I can't use my college's WiFi is that I can't connect the Z1 to it in client mode, likely due to security restrictions on a college network. If you have any ideas on how I could connect the Z1 to my college WiFi, it'd be appreciated.
Yes, same room. In the past, roughly three firmware upgrades back, before Ricoh optimized WiFi usage to address overheating, this latency was lower for me.
Now I'm back to testing WebRTC again, trying to play full 4K in the browser at as good a quality as possible. Latency is about 300 ms.
Here you can see my desktop: one browser window running a counter, and in front of it another browser window showing the live 4K preview of my Z1 streaming into it via WebRTC:
When you use OBS on the same laptop that you're using as the hotspot, what is the CPU load shown in OBS? Is it showing something reasonable, like under 10%?
I didn't measure the latency between the Z1 and Unity when using the RTSP plugin, but 2 seconds seems high. Using just the RTSP plugin at 4K, going from the Z1 to the computer only (not on to Unity), I'm getting under 500 ms in the same room.
I believe @Manuel_Couto has it working from the RTSP plugin to OBS and then to Unity, and seems to have it under 2 seconds of latency, right?
I promised an update yesterday about the wireless connection, but I'm still working on it.
For now, using WiFi on a 2.4 GHz connection, I measured approximately 1.01 seconds of latency at 1920x960 resolution and about 1.42 seconds at 3840x1920. (This is from Z1 to Unity.)
Since I intend to use this for VR teleoperation and require very low latency, I have been looking into other protocols and into implementing 5G.
Today, I tried to implement 5G by connecting the Z1 with an Ethernet cable to the router and then using 5G to connect to the PC, but I'm still encountering some issues.
I will keep you informed when I get it working.
PS: @craig, regarding the problem I was having with transitioning from the 2.4GHz to 5GHz band connection, it turned out to be an issue with the router. Today, I tried it out with another router, and it worked fine with 5GHz.
Also, please note that the tests above were done yesterday with the old router that was having problems, so they might not be the most trustworthy source of information. As soon as I get new results, I will post them.
You used the RTSP plugin, right? H.264 encoding, 2K and 4K, and I assume equirectangular projection.
I'm wondering, is equirectangular needed at all? With my plugin I'm able to stream double fisheye directly, which generates less heat. Using my plugin with fisheye at 24 FPS, it would run for hours in a 26-28°C environment over SRT. Regarding latency, H.265 matters because it requires less bandwidth, which also helps a bit with latency. Honestly, I don't know what that RTSP plugin is capable of.
I'm new to the world of cameras and protocol communication, so I'm still learning as I go.
Currently, I'm using the THETA RTSP Streaming plugin. I've tested it at both 2K and 4K resolutions. I'm not certain which encoding it's using, nor am I sure how to check or change it.
As for the projection type, based on @craig's comment, I believe I'm using the standard equirectangular projection.
Yes, I capture the live feed with OBS, convert it into a Virtual Camera, and then project it onto a Unity Sphere, which I can view with the HTC Vive headset.
When using OBS on the same laptop as the hotspot, the CPU load shown in OBS varies from less than 1% to around 6%, so yes, it's reasonable. At the moment we're having another issue with using the camera in Unity: we can only enter play mode successfully when the Ricoh Theta Z1 is wired to the laptop. When trying to use the wireless connection, or when entering play mode without the camera connected at all, Unity crashes. We (@Caleb_Amadoro) believe this comes from the code we're using to attach the Ricoh Theta live video feed to a material on the surface of a sphere. Could you let us know what code you're using for the sphere material, or any other code that could potentially fix this issue?
Over the past few weeks, I've been focused on live streaming from my Z1 to a Unity app while aiming for minimum latency. Currently, I've achieved an average latency of 544 ms.
Here's the setup:
A computer (PC1) connected via cable to the Z1. On this PC, OBS converts the video feed into a virtual camera. A Unity app then accesses the virtual camera and uses WebRTC to transmit to a second PC (PC2).
PC2 runs the client-side version of the ChatApp.
Both PCs are connected via Ethernet to 5G routers (PC1 is connected to R1, and PC2 is connected to R2).
For testing purposes, I compare the stopwatch time at four different points:
Upon analyzing the Excel sheet, it's evident that most of the latency occurs during the capturing process (Z1 to OBS and OBS to Unity). Surprisingly, the transmission latency is only 60 ms.
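To make the breakdown concrete, here is a rough Unity C# sketch of how the four stopwatch readings turn into per-hop latencies. Only the ~265 ms capture time, ~60 ms transmission, and 544 ms total come from my measurements; the intermediate OBS-to-Unity timestamp is an illustrative assumption.

```csharp
// Rough sketch of the four-point stopwatch comparison (values in ms).
// Only t1 (capture), t3 - t2 (transmission), and t3 (total) are measured;
// t2 is an illustrative assumption.
float t0 = 0f;    // stopwatch shown to the Z1
float t1 = 265f;  // frame visible in OBS (Z1 -> OBS capture)
float t2 = 484f;  // frame visible in the sending Unity app (assumed)
float t3 = 544f;  // frame visible on PC2 after WebRTC

Debug.Log($"Z1 -> OBS:    {t1 - t0} ms");  // capturing
Debug.Log($"OBS -> Unity: {t2 - t1} ms");  // capturing
Debug.Log($"WebRTC:       {t3 - t2} ms");  // ~60 ms transmission
Debug.Log($"End to end:   {t3 - t0} ms");  // ~544 ms average
```

Summing the hops back to the 544 ms total is a quick sanity check that no stage was double-counted.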
Moving forward, I plan to focus on reducing the capturing latency as much as possible.
Is there any difference if you stream 2K instead of 4K? I ask because the stitching time for each frame consumes a significant portion of the time needed to get output from the camera over USB.
Can a Mac run Unity? You may be able to avoid the virtual camera with OBS if you use a Mac for a test. I have not tried.
This is what I have done in my code: in Start(), check whether any webcams are connected, and if none are, bail out instead of continuing.

// Gets the list of devices and prints them to the console.
private void Start()
{
    WebCamDevice[] devices = WebCamTexture.devices;
    for (int i = 0; i < devices.Length; i++)
        Debug.Log(devices[i].name);

    defaultBackground = pcImage.texture;

    // No cameras at all: flag both feeds as unavailable and stop here,
    // so the rest of the scene doesn't touch a missing device.
    if (devices.Length == 0)
    {
        Debug.Log("No devices connected!");
        pcCamAvailable = false;
        phoneCamAvailable = false;
        return;
    }

    // First device drives the main (PC) image.
    pcCam = new WebCamTexture(devices[0].name, Screen.width, Screen.height);
    pcCam.Play();
    pcImage.texture = pcCam;
    pcCamAvailable = true;

    // Optional second device (e.g. a phone camera).
    if (devices.Length > 1)
    {
        phoneCam = new WebCamTexture(devices[1].name, Screen.width, Screen.height);
        phoneCam.Play();
        phoneImage.texture = phoneCam;
        phoneCamAvailable = true;
    }
}
PS: If you need a specific camera, you can compare the camera name to a string. If it doesn't match, end the program, or wait for a connection (be careful with infinite loops…).
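As a variation on that PS, here is a minimal sketch of selecting the camera by a partial name match instead of by index, and skipping video setup cleanly when it is absent. The "THETA" substring is an assumption; log the device names first to see what your camera or virtual camera is actually called.

```csharp
// Sketch: pick the camera by partial name match and degrade gracefully.
// "THETA" is an assumed substring; confirm the real name via Debug.Log first.
bool found = false;
foreach (WebCamDevice device in WebCamTexture.devices)
{
    if (device.name.Contains("THETA"))
    {
        pcCam = new WebCamTexture(device.name, Screen.width, Screen.height);
        pcCam.Play();
        pcImage.texture = pcCam;
        found = true;
        break;
    }
}
if (!found)
    Debug.LogWarning("THETA camera not found - skipping video setup.");
```

Guarding like this (rather than indexing `devices[0]` unconditionally) is one way to avoid the play-mode crash described above when the camera is disconnected.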
Regarding the 2K or 4K resolution, I'm uncertain. How can I verify whether the camera is set to 2K or 4K?
All I did was adjust the capture resolution in OBS, but I believe this doesn't affect the camera settings.
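One way to check what Unity is actually receiving (not necessarily what the camera itself is set to) is to compare the requested and delivered WebCamTexture sizes; a minimal sketch:

```csharp
// After Play(), width/height report what the device actually delivers;
// the values are only reliable once the first frame has arrived.
WebCamTexture cam = new WebCamTexture(WebCamTexture.devices[0].name, 3840, 1920);
cam.Play();
// Check a frame or two later (e.g. in Update()):
Debug.Log($"Requested: {cam.requestedWidth}x{cam.requestedHeight}");
Debug.Log($"Delivered: {cam.width}x{cam.height}");
```

If the delivered size stays at the driver default regardless of the requested size, the resolution is being decided upstream (camera/driver/OBS), not by Unity.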
As for the Mac, I'm unsure. While Unity lets you compile apps for almost every platform, I'm not certain whether you can run the Unity Editor on a Mac. Furthermore, I don't have a Mac to run any tests (I'm a Google fan).
Note that although the document shows Unity running directly with a USB cable, something has changed in either Windows or Unity. When I tested Unity last year, I couldn't get the stream to work unless I used the OBS virtual camera.
I seem to recall that the color format might impact Unity. Back when the driver was first written, I got some information from the developer of the driver regarding Unity.
With YUY2 color format output, you can use the driver with Unity and JavaScript Media API.
In addition to NV12 (native color format of the decoder), YUY2 color format is supported for Unity or JavaScript Media API.
A single driver provides all available image sizes/color formats. For compatibility with the non-standard Unity webcam interface, the default output is set to 3840x1920/YUY2. The default output is not the driver's preferred format.
The screenshot indicates a notable enhancement, with a reduction in latency of 163 ms, equating to a 38% improvement.
I came across an interesting observation when comparing the 2020 version with the Beta release: the time Unity takes to display the camera feed. I've termed this the "Average Unity Display Time." Since I don't have a direct means to measure it, I estimated it by subtracting the average latency I obtained from the Z1 to OBS from the average latency of the current method.
My assumption was that OBS displays the feed instantly, so the measured time was the duration it took the Z1 to capture the image (265 ms).
With this in mind, here are my findings:
In the 2020 version, Unity introduced 48 ms of latency.
In the Beta release, we get -3 ms. This can be considered measurement noise, suggesting that Unity didn't introduce any additional latency compared to OBS.
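The estimation method above can be written out explicitly. The 265 ms capture figure and the 48 ms / -3 ms results are from my measurements; the end-to-end totals are back-calculated purely for illustration.

```csharp
// Average Unity Display Time = end-to-end latency - Z1 -> OBS capture time,
// under the assumption that OBS displays the feed instantly.
float captureMs   = 265f;  // measured Z1 -> OBS
float total2020Ms = 313f;  // illustrative end-to-end, Unity 2020
float totalBetaMs = 262f;  // illustrative end-to-end, Beta release
Debug.Log($"Unity 2020 added: {total2020Ms - captureMs} ms"); // 48 ms
Debug.Log($"Beta added:       {totalBetaMs - captureMs} ms"); // -3 ms
```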
I'm planning to delve deeper into the issue of resolution next, as the capture time appears to be the primary contributor to latency (around 260 ms).
I'm curious whether there's a way to set the output resolution of the Z1 to either 1920 or 3840 directly. While it was suggested that this could be done through OBS, I'm exploring alternative approaches since I'm no longer using OBS.