THETA X reducing latency for livestreaming

I am trying to use the RICOH THETA X for streaming in robotics applications,
and I would like to explore the best way to reduce streaming latency.

I have tried the following options:
(A) Wi-Fi streaming via getLivePreview @ 1024x512 resolution + JPEG decoding on the CPU
(B) USB streaming @ 2K + GPU H.264 decoding

I have measured ~400 ms latency for (A) and ~250 ms for (B).

I have been trying to figure out what exactly contributes to these latencies.
By measuring latency on the THETA X's own screen, I believe we can infer a lower bound on the achievable latency. I have done this both with and without stitching.


I am happy to report that in both cases I measure ~150 ms latency. I speculate that the latency observed beyond ~150 ms is due to on-device encoding, data transfer, and decoding.


I also measure similar latency with and without stitching for (A). This seems to suggest that on-device stitching latency is negligible, or that the stitching overhead is still being incurred even when stitching is switched off. (150 ms sensor-to-screen latency seems surprisingly long.)

I am unable to perform this experiment for (B), as the _imagestitching config does not seem to apply to live streaming via USB.

Unfortunately, I can't do a real apples-to-apples comparison between (A) and (B), since I am unable to stream at the same resolution in both cases on the THETA X; therefore I can't infer the latency contributed by data transfer. I would expect data transfer over USB to be much faster than over Wi-Fi. I also observe much higher variance in latency when streaming over Wi-Fi.

Therefore, the next thing I would like to try is to stream 2K raw over USB. At 30 fps I would estimate that to be ~1.3 Gbps for raw RGB, and half that for I420, which is well within the bandwidth of USB 3.0 (5~10 Gbps). I would also expect the device to heat up less quickly without having to encode the stream. I believe I can get the raw stream within a plugin, but would not be able to stream it to another machine via USB. Is there a way to extend the USB API to allow raw UVC streaming? Motion JPEG encoding is probably a good option as well.
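To spell out the arithmetic behind that estimate (assuming 1920x960 frames): a single raw RGB frame is 1920*960*3*8 = 44.2 Mbits, so at 30 fps the stream comes to 44.2 * 30 ≈ 1.33 Gbps; I420 halves the per-frame footprint (1920*960*3/2*8 = 22.1 Mbits), giving ≈ 0.66 Gbps.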

side note:
I have seen latency for (B) hit 300 ms quite often, but I'm not sure when it happens (perhaps when the device is running low on battery).
I test with 2 different pipelines:

    appsrc name=ap ! queue ! h264parse ! queue ! nvdec ! gldownload ! autovideoconvert ! autovideosink sync=false

    appsrc name=ap ! queue ! h264parse ! queue ! nvdec ! glimagesink

and do not observe much difference, likely due to the compute characteristics of my machine (it's a gaming PC).


hi,
in theory, in-camera encoding could help if you were able to use SRT or WebRTC for delivery. I'm building a live streaming plugin and experimenting a lot. Where did you try to play the video? I guess on the same network; which software and protocol were used?

Regards

hi biviel, my machine was connected to the THETA X directly over the same network for test setting (A). I use the camera.getLivePreview API and stream over HTTP.
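For concreteness, consuming camera.getLivePreview looks roughly like the sketch below: POST the command, then split the multipart response into JPEG frames. This is a minimal sketch, not my exact pipeline; the default AP-mode address 192.168.1.1 and the SOI/EOI-marker frame scan are assumptions.

    import java.io.BufferedInputStream;
    import java.io.ByteArrayOutputStream;
    import java.io.InputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;

    // Minimal sketch: start camera.getLivePreview and split the motion JPEG
    // stream into frames by scanning for JPEG SOI (FF D8) / EOI (FF D9) markers.
    public class PreviewClient {
        public static void main(String[] args) throws Exception {
            URL url = new URL("http://192.168.1.1/osc/commands/execute");
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setRequestMethod("POST");
            conn.setRequestProperty("Content-Type", "application/json");
            conn.setDoOutput(true);
            conn.getOutputStream().write("{\"name\":\"camera.getLivePreview\"}".getBytes("UTF-8"));

            InputStream in = new BufferedInputStream(conn.getInputStream());
            ByteArrayOutputStream frame = new ByteArrayOutputStream();
            int prev = -1, cur;
            boolean inFrame = false;
            while ((cur = in.read()) != -1) {
                if (!inFrame && prev == 0xFF && cur == 0xD8) { // start of a JPEG frame
                    inFrame = true;
                    frame.reset();
                    frame.write(0xFF);
                }
                if (inFrame) {
                    frame.write(cur);
                    if (prev == 0xFF && cur == 0xD9) {         // end of the JPEG frame
                        inFrame = false;
                        byte[] jpeg = frame.toByteArray();
                        // decode/display `jpeg` here
                    }
                }
                prev = cur;
            }
        }
    }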

I am able to guarantee that the camera will be connected to a machine (for additional compute) in my application, so I believe data transmission via USB would be my best bet.

I'm using the Camera API, not the Web API, in my plugin, and I'm already experimenting with the SRT protocol. I'm wondering how much the latency would be if SRT is used, but clearly, you would need to install a mini SRT server to be able to view it. So the camera is connecting to a desktop machine or a tiny computer like a Raspberry Pi?

It's possible that the video pipeline enabled with the normal video or Web API still allocates time for stitching even if the video is not stitched. Using the Web API, I don't think the time between still images varies if stitching is turned off on the image.

See this test for still images.

The time is reduced slightly with the Camera API inside a plug-in.


If you want to try a different network setup, you can also run TCP/IP over Ethernet. It may work with Laszlo's plug-in.


I believe most people are using the USB cable with the Linux UVC driver, not Ethernet.


Just a note, my plugin does not support the THETA X yet… I was just curious about the requirements, but I'm planning to support the THETA X too, once I get there.

OK. The Ethernet will only work on the X.

thanks for telling us about the Ethernet interface of the THETA X.

do we need to switch on Wi-Fi and set the device to client mode before connecting the device to the router? I could not get the LAN icon to show up on my device, and I'm not sure if it's because I have to use too many crappy adapters. I don't have a powered USB-C Ethernet adapter on hand.

In my tests, I did not set the THETA X to client mode. I just plugged the Ethernet adapter into the THETA X and it worked with no configuration.

I am using the Ethernet adapter below. I plan to test it more. For the input power to the adapter and passthrough to the camera, I am using a MacBook Air USB-C power supply. Most power supplies I have don't work.

Ricoh Theta over Ethernet

If I understand correctly, using the USB-C Ethernet adapter would enable creating an Ethernet network over USB, and then I can connect to the camera via the Web API over HTTP with the default IP address?

or does it only work with client mode?

The THETA X is assigned an IP address from your DHCP server (like your office router). There is no Wi-Fi in use. We have not tested the THETA X as the DHCP server itself. That would likely require a plug-in.

There are no tests with the THETA X assigning an IP address to something like a Raspberry Pi. In our tests, the THETA X only received an IP address.

In fw 1.20.0, there was a limitation of having to use digest authentication for the API commands. In fw 1.30.0, you can disable digest authentication so that you can send commands directly.

In fw 1.30.0, there is a limitation of not being able to set a static IP address. We hope this limitation will be resolved in future versions.

Unrelated to this, fw 1.30.0 also has the limitation of having to specify startPosition when grabbing the detail from camera.listFiles. We also hope this will be addressed in the next fw upgrade.

I wanted to update here that I got a plug-in running, which just triggers the preview callback:

    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        // process the frame here (e.g. hand it off to encoding/transport),
        // then return the buffer to the camera so it can be reused
        mCamera.addCallbackBuffer(data);
    }

these are the settings I use:

    p.set(RIC_PROC_STITCHING,         "RicStaticStitching");
    p.set(RIC_PROC_ZENITH_CORRECTION, "RicZenithCorrectionOff");
    p.set(RIC_EXPOSURE_MODE,          "RicAutoExposureP");
    int fps = 15;
    p.setPreviewFrameRate(fps);
    p.setPreviewSize(1920, 960);

I also turn off the LCD.
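For anyone reproducing this, the wiring around that callback is roughly as follows. This is a minimal sketch using the standard android.hardware.Camera buffer flow; the buffer count and the NV21 buffer-size calculation are my assumptions, not confirmed details of the THETA plug-in library.

    import android.graphics.ImageFormat;
    import android.hardware.Camera;

    // Assumed wiring around onPreviewFrame(), e.g. called from the plug-in's onResume().
    private void startRawPreview() {
        mCamera = Camera.open();
        Camera.Parameters p = mCamera.getParameters();
        // ... apply the p.set(...) calls shown above ...
        mCamera.setParameters(p);

        // Pre-allocate a few preview buffers so nothing is allocated per frame.
        // Buffer size assumes NV21 (12 bits per pixel), the Camera API default.
        int bufSize = 1920 * 960 * ImageFormat.getBitsPerPixel(ImageFormat.NV21) / 8;
        for (int i = 0; i < 3; i++) {
            mCamera.addCallbackBuffer(new byte[bufSize]);
        }
        mCamera.setPreviewCallbackWithBuffer(this); // `this` implements Camera.PreviewCallback
        mCamera.startPreview();
    }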

With these settings I observe ~120 ms latency from sensor to screen. I am hopeful that I can achieve ~150 ms latency if I can transmit the raw content to another compute unit.

Unfortunately, I am unable to run the plugin for more than 0.5 hours before it shuts down as the device gets too hot,
whereas via USB streaming 2K@15fps I have gotten the THETA X to stream for a good 3 hours.

Is this expected? Why is this so? I tested this on both the 1.20 and 1.30 firmware.


Have you considered using the Z1 instead? For now, I see way too many heat issues with the X. I'm able to live stream much longer via the Z1, and I'm going to test latency. How did you measure? Thanks

hello, I’m going to try my best to make the device I have on hand work, since the Z1 is pretty expensive.

do you live stream via Wi-Fi or USB? I'd be grateful if you could share some metrics here comparing the 2 models.

measuring latency: I point the camera at a stopwatch on the monitor connected to my computer. As I stream back to my computer, I display the image streamed by the device. I then take a screenshot capturing both the stopwatch and what the device 'sees' (which contains the stopwatch as well), so I see 2 stopwatches in the screenshot. The time delta between the 2 stopwatches is the latency.


Hi,
my plugin is using Wi-Fi on the Z1. In theory, if I'm able to make it compatible with the THETA X, it could consume internet through Ethernet/USB, like @craig suggested… but I'm not there yet.

In my plugin, I expose the RTMP and RTSP protocols to push live streams to such servers, but I also implemented the SRT protocol. I have not measured latency on a local network yet, as I'm pushing to global networks over the internet. SRT is a low-latency protocol where quality is also important; latency matters to me as well, but seemingly not as much as it does for you.

I will make the plugin work in SRT mode too. WebRTC is also on my radar, which could be a great option for you, however I'm not sure about the timeline for when this would become useful for you on the X. I'm also waiting for some feedback from RICOH, as I'm having issues making my plugin compatible with and running on the THETA X.

I will come back to you once I'm able to do some latency tests on a local network. It's still not clear to me on which device you are watching the live preview, and which software tools and hardware you are using.

@biviel, can the HDR Wireless Live Streaming plug-in use a Bluetooth microphone like in the video below?

@craig I've tried quite a few things over the past couple of weeks, and I summarize them below.

Our use case is in robotics, where we would like to use a 360 camera as the 'eyes' of the robot. There will be a compute unit connected to the THETA X, issuing controls to the robot. Here is a video that illustrates what we do, in case it is of interest. I am happy to explain in more detail if it helps with your understanding.

requirements:

  • resolution: 2K
  • frame rate: 15 fps
  • latency: as low as possible
  • operating time: 4 hrs (half a workday)

ideas tried/considered so far:
A) 2K streaming via wireless is not possible with the THETA X. I have streamed 1K, and find the latencies to be quite unstable.
B) 2K streaming with a plugin causes the device to heat up too quickly.
C) 2K streaming via LAN is not possible, as the accompanying compute unit will be mobile. I have been playing with adb shell, but have not been able to assign an IP to the Ethernet port for a connection to the compute unit.
D) 2K streaming via USB is the most promising by far. By removing GStreamer and calling the decoding APIs directly in each UVC callback, I can get latencies of about ~220 ms (I estimate decoding latency at about ~25 ms). I was able to run a 4 hr soak test in my home office, but I'm not confident the device will not overheat under the sun. 220 ms is quite high compared to ~60 ms from a RealSense camera. While I believe it is reasonable to pay compute for the stitching and FOV, I believe there is room for improvement.

note about latency:
For smooth realtime control of a robot, we will need each iteration from sensor input to the robot receiving controls to be on the order of human reaction time. This will consist of:

  • camera latency
  • camera → compute unit data transfer
  • data decoding
  • further processing to get controls (I am greatly oversimplifying this)
  • send controls to robot

The reason we are very motivated to bring down camera latency is that it allows us to adopt more advanced algorithms for robot control. 100 ms is a typical inference time for deep models highly optimized for embedded systems.

Since I have measured ~120 ms sensor-to-screen latency in a plugin and estimate decoding at ~25 ms, I believe a good ~75 ms of latency is spent on encoding and data transfer (220 − 120 − 25 ≈ 75 ms). Considering the bandwidth of USB connections, there is actually no need for data compression: a 2K I420 frame is 22 Mbits, and USB 2.0 bandwidth can accommodate up to 20 such frames a second. USB 3.0 bandwidth would be plenty more.


*Therefore my request to the RICOH devs is to extend USB streaming for the THETA X to support a 2K RAW format to bring down latency, with an fps of 15 to manage heat.*
Please let me know if more information is needed.


Reference:
USB 2.0 bandwidth 480 Mbit/s
USB 3.0 bandwidth 5 Gbit/s

single 2K I420 memory footprint (1920*960*3/2*8) = 22.118Mbits
single 2K RGB memory footprint (1920*960*3*8) = 44.236Mbits


Thank you for the detailed test and information.

I’ll work with @jcasman to get this into the weekly and monthly reports we send to RICOH.

In addition to the compression, there is also the time for stitching. Do you need the stream in equirectangular format?

Realistically, there is unlikely to be a short-term fix for the latency issue. We can pass on the information for consideration in future camera updates.

Is this requirement for continuous streaming with power to the camera supplied from the robot using a USB cable?

We are only interested in the equirectangular format insofar as it serves the goal of lowering latency; i.e., when stitching is switched off, the on-device stitching must not still be running (which doesn't seem to be the case in my tests). I am open to implementing the stitching on a separate compute unit.

Is this requirement for continuous streaming with power to the camera supplied from the robot using a USB cable?

yes, that is right. You can assume that we always start on a full charge.

Thank you for your information.

You’re doing cutting edge work. The latency may be a problem that can’t be addressed in the short-term.

Previous projects

Lockheed Martin Amelia Drone - RIT student research project

  • project home

  • discussion topic on this forum

  • 1920x960 8fps using motionJPEG

  • (~250 ms at 1920x960 @ 8 fps at 0.25 miles, ~100 ms at 1024x512 @ 30 fps at 0.25 miles). NOTE: I did not verify the ~100 ms latency at 1024x512 described in this post, and I'm not sure if it was measured.

TwinCam Go

FOX Sewer Rover


available motion JPEG livePreview formats (previewFormat option)