THETA X: reducing latency for live streaming

I am trying to use the RICOH THETA X for streaming in robotics applications, and I would like to explore the best way to reduce streaming latency.

These are the options I have tried:
(A) Wi-Fi streaming via getLivePreview at 1024x512 resolution, with JPEG decoding on the CPU
(B) USB streaming at 2K, with GPU H.264 decoding

I have measured ~400 ms latency for (A) and ~250 ms for (B).
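For anyone reproducing option (A), a minimal sketch of pulling and decoding the getLivePreview motion-JPEG stream. It assumes the camera is in access-point mode at its default 192.168.1.1 address; requests and OpenCV are used only for illustration, and the multipart handling is simplified to scanning for JPEG markers.

import cv2          # pip install opencv-python
import numpy as np
import requests     # pip install requests

# Pull the camera.getLivePreview motion-JPEG stream over HTTP and decode frames.
# Assumes the THETA X is in access-point mode at its default 192.168.1.1 address.
URL = "http://192.168.1.1/osc/commands/execute"

resp = requests.post(URL, json={"name": "camera.getLivePreview"},
                     stream=True, timeout=10)
resp.raise_for_status()

buf = b""
for chunk in resp.iter_content(chunk_size=8192):
    buf += chunk
    soi = buf.find(b"\xff\xd8")           # JPEG start-of-image marker
    eoi = buf.find(b"\xff\xd9", soi + 2)  # JPEG end-of-image marker
    if soi != -1 and eoi != -1:
        frame = cv2.imdecode(np.frombuffer(buf[soi:eoi + 2], np.uint8),
                             cv2.IMREAD_COLOR)
        buf = buf[eoi + 2:]
        if frame is not None:
            cv2.imshow("getLivePreview", frame)
            if cv2.waitKey(1) == 27:      # Esc to quit
                break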

I have been trying to figure out what exactly contributes to these latencies.
By measuring latency using the screen on the THETA X itself, I believe we can infer a lower bound on the achievable latency. I have done this both with and without stitching.
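One way to run this kind of measurement is to display a millisecond clock on a monitor, point the camera at it, and photograph the monitor together with the THETA X screen (or the decoded preview window); the difference between the two displayed timestamps is the latency of that path. A minimal clock sketch follows (note that the monitor's own refresh adds up to ~17 ms at 60 Hz).

import time

import cv2          # pip install opencv-python
import numpy as np

# Millisecond clock for glass-to-glass latency tests: photograph or film this
# window together with the camera's screen (or the preview window) and subtract
# the two displayed timestamps.
while True:
    canvas = np.zeros((240, 960, 3), dtype=np.uint8)
    stamp = f"{time.monotonic() * 1000:.0f} ms"
    cv2.putText(canvas, stamp, (20, 160), cv2.FONT_HERSHEY_SIMPLEX,
                3.0, (255, 255, 255), 6, cv2.LINE_AA)
    cv2.imshow("latency clock", canvas)
    if cv2.waitKey(1) == 27:   # Esc to quit
        break
cv2.destroyAllWindows()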


I am happy to report that in both cases I measure ~150 ms latency. I speculate that the latency beyond ~150 ms is due to on-device encoding, data transfer, and decoding.


I also measure similar latency with and without stitching for (A). This seems to suggest that on-device stitching latency is negligible, or that the stitching overhead is still incurred even when stitching is switched off. (150 ms of sensor-to-screen latency seems surprisingly long.)

I am unable to perform this experiment for (B), as the _imagestitching option does not seem to apply to live streaming via USB.

Unfortunately, I can't do a real apples-to-apples comparison between (A) and (B), since I am unable to stream at the same resolution in both cases on the THETA X, so I can't isolate the latency contributed by data transfer. I would expect data transfer over USB to be much faster than over Wi-Fi. I also observe much higher variance in latency when streaming over Wi-Fi.

Therefore, the next thing I would like to try is streaming 2K raw over USB. At 30 fps I estimate that to be ~1.3 Gbps for raw RGB, and half that for I420, which is well within the bandwidth of USB 3.0 (5-10 Gbps). I would also expect the device to heat up less quickly without having to encode the stream. I believe I can get the raw stream within a plugin, but I would not be able to stream it to another machine via USB. Is there a way to extend the USB API to allow raw UVC streaming? Motion JPEG encoding is probably a good option as well.
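As a sanity check on those numbers, assuming the 2K USB mode is 1920x960 at 30 fps:

# Back-of-the-envelope bandwidth check, assuming "2K" means 1920x960 at 30 fps.
width, height, fps = 1920, 960, 30

raw_rgb_gbps = width * height * 3   * fps * 8 / 1e9   # 3 bytes per pixel
i420_gbps    = width * height * 1.5 * fps * 8 / 1e9   # 1.5 bytes per pixel (4:2:0)

print(f"raw RGB: {raw_rgb_gbps:.2f} Gbit/s")   # ~1.33 Gbit/s
print(f"I420:    {i420_gbps:.2f} Gbit/s")      # ~0.66 Gbit/s

Both figures are comfortably under the 5 Gbps USB 3.0 signaling rate, although sustained UVC throughput in practice is lower than the raw link rate.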

Side note:
I have seen latency for (B) hit 300 ms quite often, but I'm not sure when it happens (perhaps when the device is running low on battery).
I test with two different pipelines:

appsrc name=ap ! queue ! h264parse ! queue ! nvdec ! gldownload ! autovideoconvert ! autovideosink sync=false

appsrc name=ap ! queue ! h264parse ! queue ! nvdec ! glimagesink

and do not observe much difference, likely due to the compute characteristics of my machine (it's a gaming PC).
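For anyone who wants to see where time goes inside the decode/display pipeline, GStreamer's built-in latency tracer can report per-element latency. A rough sketch follows: sample.h264 is a placeholder for a recorded H.264 byte stream, the flags parameter needs a reasonably recent GStreamer, and the trace parsing is approximate since the exact log format varies between versions.

import os
import re
import subprocess

# Run the decode/display pipeline against a recorded H.264 byte stream with the
# latency tracer enabled, then summarize the per-element numbers from the trace.
env = dict(os.environ,
           GST_TRACERS="latency(flags=element)",
           GST_DEBUG="GST_TRACER:7",
           GST_DEBUG_NO_COLOR="1")

pipeline = ("filesrc location=sample.h264 ! h264parse ! queue ! nvdec ! "
            "gldownload ! autovideoconvert ! autovideosink sync=false")

proc = subprocess.run(["gst-launch-1.0"] + pipeline.split(),
                      env=env, capture_output=True, text=True)

# Trace lines look roughly like:
#   ... element-latency, ... element=(string)nvdec0, ..., time=(guint64)1234567;
per_element = {}
for line in proc.stderr.splitlines():
    m = re.search(r"element=\(string\)([^,]+),.*time=\(guint64\)(\d+)", line)
    if m:
        per_element.setdefault(m.group(1), []).append(int(m.group(2)))

for name, times in sorted(per_element.items()):
    mean_ms = sum(times) / len(times) / 1e6   # GstClockTime values are in nanoseconds
    print(f"{name}: {mean_ms:.2f} ms average over {len(times)} buffers")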

Hi,
In theory, in-camera encoding could help if you were able to use SRT or WebRTC for delivery. I'm building a live streaming plugin and experimenting a lot. Where did you try to play the video? I guess it was on the same network; which software and protocol did you use?

Regards

Hi biviel, my machine was connected to the THETA X directly over the same network for test setting (A). I use the camera.getLivePreview API and stream over HTTP.

For my application I can guarantee that the camera will be connected to a machine (for additional compute), so I believe data transmission via USB is my best bet.

I'm using the Camera API in my plugin, not the Web API, and I'm already experimenting with the SRT protocol. I'm wondering how much the latency would be if SRT is used, but clearly you would need to install a small SRT server to be able to view it. So is the camera connecting to a desktop machine or to a tiny computer like a Raspberry Pi?

It's possible that the video pipeline used for normal video or the Web API still allocates time for stitching even if the video is not stitched. Using the Web API, I don't think the time between still images varies when stitching is turned off for the image.

See this test for still images.

The time is reduced slightly with the Camera API inside a plug-in.


If you want to try a different network setup, you can also run TCP/IP over Ethernet. It may work with Laszlo's plug-in.


I believe most people are using the USB cable with the UVC Linux driver, not Ethernet.


Just a note: my plugin does not support the THETA X yet… I was just curious about the requirements, but I'm planning to support the THETA X too once I get there.

OK. Ethernet will only work on the X.

Thanks for telling us about the Ethernet interface of the THETA X.

Do we need to switch on Wi-Fi and set the device to client mode before connecting it to the router? I could not get the LAN icon to show up on my device, and I'm not sure if it's because I have to use too many crappy adapters. I don't have a powered USB-C Ethernet adapter on hand.

In my tests, I did not set the THETA X to client mode. I just plugged the Ethernet adapter into the THETA X and it worked with no configuration.

I am using the Ethernet adapter below. I plan to test it more. For the input power to the adapter and pass-through to the camera, I am using a MacBook Air USB-C power supply. Most power supplies I have don't work.