THETA X: reducing latency for livestreaming

It’s possible that the video pipeline used for normal video or the Web API still allocates time for stitching even when the output is not stitched. Using the Web API, I don’t think the time between still images varies when stitching is turned off for the image.

See this test for still images.

The time is reduced slightly with the CameraAPI inside a plug-in.


If you want to try a different network setup, you can also run TCP/IP over Ethernet. It may work with Laszlo’s plug-in.


I believe most people are using the USB cable with the UVC Linux driver, not Ethernet.


Just a note, my plugin does not support Theta X yet… I was just curious about the requirements, but planning to support Theta X too, once I get there.

OK. The Ethernet will only work on the X.

thanks for telling us about the ethernet interface of theta X.

do we need to switch on wifi and set the device to client mode before connecting the device to the router? I could not get the LAN icon to show up on my device, and I’m not sure if it’s because I’m chaining too many low-quality adapters. I don’t have a powered USB-C Ethernet adapter on hand.

In my tests, I did not set the THETA X in client mode. I just plugged the Ethernet adapter into the THETA X and it worked with no configuration.

I am using the Ethernet adapter below. I plan to test it more. For the input power to the adapter and passthrough to the camera, I am using a MacBook Air USB-C power supply. Most power supplies I have don’t work.

Ricoh Theta over Ethernet

If I understand correctly, using the USB-C Ethernet adapter would enable creating an Ethernet network over USB, and I can then connect to the camera via the Web API over HTTP with the default IP address?

or does it only work with client mode?

The THETA X is assigned an IP address from your DHCP server (like your office router). There is no wifi in use. We have not tested the THETA X as the DHCP server itself. That would likely require a plug-in.

There are no tests with the THETA X assigning an IP address to something like a Raspberry Pi; in our tests, the THETA X only received an IP address.

In fw 1.20.0, there was a limitation of having to use digest authentication for API commands. In fw 1.30.0, you can disable digest authentication so that you can send commands directly.

In fw 1.30.0, there is a limitation of not being able to set a static IP address. We hope this limitation will be resolved in future versions.

Unrelated to this, fw 1.30.0 also has the limitation of having to specify startPosition when grabbing the detail from camera.listFiles. We also hope this will be addressed in the next fw upgrade.
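For reference, a minimal sketch of what such a camera.listFiles request looks like. The IP address here is an assumption for illustration; the X receives its actual address from your DHCP server over Ethernet.

```python
# Hedged sketch: building a camera.listFiles request body for the THETA Web API.
# On fw 1.30.0, startPosition must be included when grabbing file details.
import json

THETA_URL = "http://192.168.1.100/osc/commands/execute"  # assumed DHCP-assigned IP

payload = {
    "name": "camera.listFiles",
    "parameters": {
        "fileType": "all",
        "entryCount": 10,
        "startPosition": 0,   # required on fw 1.30.0
        "maxThumbSize": 0,
    },
}

body = json.dumps(payload)
print(body)

# Actually sending it requires the camera on your network (and digest auth,
# unless you have disabled it on fw 1.30.0):
#   import urllib.request
#   req = urllib.request.Request(THETA_URL, data=body.encode(),
#                                headers={"Content-Type": "application/json"})
#   print(urllib.request.urlopen(req).read())
```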

I wanted to update here that I got a plug-in running, which just triggers the preview callback

    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        // do something here
        mCamera.addCallbackBuffer(data);
    }

these are the settings I use

Camera.Parameters p = mCamera.getParameters();
p.set(RIC_PROC_STITCHING,         "RicStaticStitching");
p.set(RIC_PROC_ZENITH_CORRECTION, "RicZenithCorrectionOff");
p.set(RIC_EXPOSURE_MODE,          "RicAutoExposureP");
int fps = 15;
p.setPreviewFrameRate(fps);
p.setPreviewSize(1920, 960);
mCamera.setParameters(p);

I also turn off the lcd.

In this setting I observe a ~120ms latency from sensor to screen. I am hopeful that I can achieve a 150ms latency if I can transmit the raw content to another compute unit.

unfortunately, I am unable to run the plugin for more than half an hour before it shuts down as the device gets too hot, whereas via USB streaming 2K@15fps I have gotten the THETA X to stream a good 3 hours.

is this expected? why is this so? I tested this on both 1.20 and 1.30 firmware


Have you considered using the Z1 instead? For now I see far worse heat issues on the X. I’m able to live stream much longer via the Z1 and I’m going to test latency. How did you measure? Thanks

hello, I’m going to try my best to make the device I have on hand work, since the Z1 is pretty expensive.

do you live stream via wifi or usb? I’d be grateful if you can share some metrics here comparing the 2 models

measuring latency: I point the camera at a stopwatch on a monitor connected to my computer. As I stream back to the computer, I display the image streamed by the device and take a screenshot that captures both the on-screen stopwatch and what the device ‘sees’ (which also contains the stopwatch). The screenshot therefore shows two stopwatch readings, and the time delta between them is the latency.
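The measurement above boils down to subtracting the two stopwatch readings visible in one screenshot. A minimal sketch (the parsing helper and sample readings are illustrative, not from a real test):

```python
# Compute latency as the delta between the two stopwatch values
# captured in a single screenshot.

def parse_stopwatch(reading: str) -> float:
    """Convert a 'MM:SS.mmm' stopwatch reading to seconds."""
    minutes, seconds = reading.split(":")
    return int(minutes) * 60 + float(seconds)

on_monitor = "01:23.450"   # stopwatch shown directly on the monitor
in_stream  = "01:23.330"   # same stopwatch as seen through the camera stream

latency_ms = (parse_stopwatch(on_monitor) - parse_stopwatch(in_stream)) * 1000
print(f"latency: {latency_ms:.0f} ms")   # -> latency: 120 ms
```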


Hi,
my plugin is using WiFi on the Z1. In theory, if I’m able to make it compatible with the THETA X, it could use the network through Ethernet/USB, like @craig suggested… but it’s not there yet.

In my plugin, I’m using the RTMP and RTSP protocols to push live streams to such servers, but I also implemented the SRT protocol. I have not measured latency on a local network yet, as I’m pushing to the global internet. SRT is a low-latency protocol where quality also matters; latency is important to me as well, though it seems not as much as for you.

I will make the plugin work in SRT mode too. WebRTC is also on my radar, which could be a great option for you, but I’m not sure about the timeline for when this would become useful for you on the X. I’m also waiting for some feedback from Ricoh, as I’m having issues making my plugin compatible with and run on the THETA X.

I will come back to you once I’m able to do some latency tests on a local network. It’s still not clear to me on which device you are watching the live preview, and which software tools and hardware you are using.

@biviel , can the HDR Wireless Live Streaming plug-in use a bluetooth microphone like in the video below?

@craig I’ve tried quite a few things the past couple of weeks and I summarize them below

Our use case is in robotics, where we would like to use a 360° camera as the ‘eyes’ of the robot. There will be a compute unit connected to the THETA X, issuing controls to the robot. Here is a video that illustrates what we do, in case of interest. I am happy to explain in more detail if it helps with your understanding.

requirements:

  • resolution at 2K
  • framerate 15
  • latency as low as possible
  • operating time 4hrs (half a workday)

ideas tried/considered so far:
A) 2K streaming via wireless is not possible with the THETA X. I have streamed 1K and find the latencies quite unstable.
B) 2K streaming with a plugin causes the device to heat up too quickly.
C) 2K streaming via LAN is not possible, as the accompanying compute unit will be mobile. I have been playing with adb shell, but have not been able to assign an IP to the Ethernet port for a connection to the compute unit.
D) 2K streaming via USB is the most promising by far. By removing gstreamer and calling the decoding APIs directly at each UVC callback, I can get latencies of about ~220ms (I estimate decoding latency at ~25ms). I was able to run a 4hr soak test in my home office, but I’m not confident the device will not overheat under the sun. 220ms is quite high compared to ~60ms from a RealSense camera. While I believe it is reasonable to pay compute for the stitching and FOV, I believe there is room for improvement.

note about latency:
For smooth realtime control of a robot, we will need each iteration, from sensor input to the robot receiving controls, to be on the order of human reaction time. This will consist of:

  • camera latency
  • camera → compute unit data transfer
  • data decoding
  • further processing to get controls (I am greatly over simplifying this)
  • send controls to robot

The reason we are very motivated to bring down camera latency is that it allows us to adopt more advanced algorithms for robot control. 100ms is a typical inference time for deep models highly optimized for embedded systems.

since I have measured ~120ms sensor-to-screen latency in a plugin, I believe a good ~75ms is spent on encoding and data transfer. Considering the bandwidth of USB connections, there is actually no need for data compression: a 2K I420 frame is 22 Mbit, and USB 2.0 bandwidth can accommodate up to 20 such frames per second. USB 3.0 bandwidth would allow plenty more.


Therefore my request to RICOH devs is to extend USB streaming for the THETA X to support a 2K RAW format to bring down latency, with an fps of 15 to manage heat.
Please let me know if more information is needed.


Reference:
USB 2.0 bandwidth 480 Mbit/s
USB 3.0 bandwidth 5 Gbit/s

single 2K I420 memory footprint (1920*960*3/2*8) = 22.118Mbits
single 2K RGB memory footprint (1920*960*3*8) = 44.236Mbits
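As a quick sanity check on the figures above (theoretical bus rates; real USB throughput is lower due to protocol overhead):

```python
# Frame sizes for 2K (1920x960) raw frames and how many fit per second
# into the theoretical USB 2.0 / 3.0 bandwidth.
WIDTH, HEIGHT = 1920, 960

i420_bits = WIDTH * HEIGHT * 3 // 2 * 8     # I420: 12 bits per pixel
rgb_bits  = WIDTH * HEIGHT * 3 * 8          # RGB:  24 bits per pixel

usb2_bps = 480_000_000    # USB 2.0: 480 Mbit/s
usb3_bps = 5_000_000_000  # USB 3.0: 5 Gbit/s

print(f"I420 frame: {i420_bits / 1e6:.3f} Mbit")   # -> 22.118 Mbit
print(f"RGB frame:  {rgb_bits / 1e6:.3f} Mbit")    # -> 44.237 Mbit
print(f"USB 2.0 budget: {usb2_bps // i420_bits} raw I420 frames/s")
print(f"USB 3.0 budget: {usb3_bps // i420_bits} raw I420 frames/s")
```

This confirms USB 2.0 can carry about 21 raw I420 frames per second at 2K, comfortably above the requested 15 fps.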


Thank you for the detailed test and information.

I’ll work with @jcasman to get this into the weekly and monthly reports we send to RICOH.

In addition to the compression, there is also the time for stitching. Do you need the stream in equirectangular format?

Realistically, there is unlikely to be a short-term fix for the latency issue. We can pass on the information for consideration in future camera updates.

Is this requirement for continuous streaming with power to the camera supplied from the robot using a USB cable?

We are only interested in streaming in equirectangular format if it serves the goal of lowering latency; i.e., the stitching on the device can’t still be costing time (which doesn’t seem to be the case in my tests). I am open to implementing the stitching on a separate compute unit.
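Since the stitching could move to a compute unit, here is a minimal sketch of the projection math involved: mapping an equirectangular output pixel back into one fisheye image under an ideal equidistant lens model. The image sizes, fisheye center, and focal length are assumptions for illustration, not THETA X calibration data; real lenses need calibrated intrinsics and blending between the two lenses.

```python
# Hedged sketch of off-device stitching: map an equirectangular output pixel
# to a direction on the unit sphere, then project into one fisheye lens
# using an ideal equidistant model (r = f * theta).
import math

EQ_W, EQ_H = 1920, 960          # equirectangular output size
FE_CX, FE_CY = 512.0, 512.0     # assumed fisheye image center
FE_F = 325.0                    # assumed focal length, pixels per radian

def equirect_to_fisheye(x: float, y: float):
    """Map an equirect pixel to ideal-equidistant fisheye coordinates
    for a lens whose optical axis points along +z (lon=0, lat=0)."""
    lon = (x / EQ_W) * 2 * math.pi - math.pi
    lat = math.pi / 2 - (y / EQ_H) * math.pi
    # unit direction on the sphere
    dx = math.cos(lat) * math.sin(lon)
    dy = math.sin(lat)
    dz = math.cos(lat) * math.cos(lon)
    theta = math.acos(max(-1.0, min(1.0, dz)))  # angle off the optical axis
    phi = math.atan2(dy, dx)
    r = FE_F * theta                            # equidistant projection
    return FE_CX + r * math.cos(phi), FE_CY - r * math.sin(phi)

# The equirect center (looking straight down the optical axis) should land
# on the fisheye center:
print(equirect_to_fisheye(EQ_W / 2, EQ_H / 2))  # -> (512.0, 512.0)
```

A full stitcher would precompute this mapping once into a lookup table and remap each incoming frame, which is cheap compared to decoding.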

Is this requirement for continuous streaming with power to the camera supplied from the robot using a USB cable?

yes that is right, you can assume that we always start on a full charge

Thank you for your information.

You’re doing cutting edge work. The latency may be a problem that can’t be addressed in the short-term.

Previous projects

Lockheed Martin Amelia Drone - RIT student research project

  • project home

  • discussion topic on this forum

  • 1920x960 8fps using motionJPEG

  • (~250 ms at 1920x960 @ 8fps at 0.25 miles, ~100 ms at 1024x512 @ 30fps at 0.25 miles). NOTE: I did not verify the ~100ms latency at 1024x512 described in this post, and I’m not sure if it was measured.

TwinCam Go

FOX Sewer Rover


available motionjpeg livePreview formats

previewFormat

Is there any update here? I am doing something similar: remotely streaming 360° video to a Meta Quest headset to control a robot. The THETA X is the only camera with in-device stitching at around $1k. We don’t want to try the Insta360 Pro, which is too expensive, or build our own rig by stitching multiple cameras. I haven’t had a chance to try the RTSP plugin yet while waiting for my developer registration to be approved.
What I am doing now is live streaming over a USB cable to a Linux desktop, then using ffmpeg to an RTSP server and WebRTC to reach the remote side. This way we can do 2K video with about 2s latency, which is too high for runtime operation.

You can likely get lower than 2s latency. However, I don’t think you’ll be able to adequately control a flying drone unless the drone has some level of autonomous flying capability.

The example below uses the Janus WebRTC Server

Thanks for your reply.
I have figured out some clues on the THETA X. When I connect it to a charger and wirelessly stream to my RTMP server, streaming can last 9 minutes with as low as 200ms latency. Then it overheats, the plugin crashes and needs to be restarted, and latency goes up to 1–2 seconds.

If I put it in direct sunlight, I don’t know whether it will work for even 3 minutes.
The THETA X is my only THETA right now. I am wondering: would the V or Z1 be better than the X for my case? Can either stream 360 video for more than 1 hour without crashing?

The Z1 has a higher thermal shutdown threshold. The V is discontinued, but may also work if you can find a used one.

Do you have the most up-to-date firmware on the THETA X? There were some additional thermal management features added. The newest firmware is 2.21.0.