Live Streaming over USB on Ubuntu and Linux, NVIDIA Jetson

Assuming you already have close to the highest-quality microSD card you can get, it might be worth trying an NVMe SSD. Although I haven’t tried this myself, I read on the NVIDIA forums that it is preferable to save video files to an external SSD. As snafu666 indicated, the frame drops might be due to bandwidth limitations of the microSD card.

If I had the same problem as you, I would rig up an NVMe SSD as it is a cheap and easy test. I don’t know if @snafu666 was booting the Linux OS from the NVMe SSD or if he used it only as the media storage device on the Xavier NX.

The NVMe storage seems like good bang for the yen.


You could also reduce your resolution and framerate to see if at least the frame drop goes away. You can also start with videotestsrc to help isolate the issue.


Step 1: verify USB device ID

With the camera plugged into the USB port of your Linux computer and the “live” LED on, what does lsusb show?

craig@craig-desktop:~$ lsusb
...
Bus 003 Device 003: ID 8087:07dc Intel Corp. 
Bus 003 Device 011: ID 05ca:2715 Ricoh Co., Ltd RICOH THETA Z1
Bus 003 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub

Does it show the ID as 0x2715 (Z1)?

Step 2: verify kernel module

If so, run lsmod and verify that the v4l2loopback module is loaded into the kernel.

$ lsmod
Module                  Size  Used by
...
v4l2loopback           37383  0
...
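If v4l2loopback does not appear in the lsmod output, you can usually load it manually. This is a sketch, assuming the module is already installed (for example via DKMS):

```shell
# Load the v4l2loopback module (assumes it is already installed, e.g. via DKMS)
sudo modprobe v4l2loopback

# Confirm it is now present in the kernel
lsmod | grep v4l2loopback
```

If modprobe fails, check the output of dmesg for the reason (an unsigned module on a Secure Boot system is one possibility).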

Step 3: verify device

Finally, show the output of

v4l2-ctl --list-formats-ext --device /dev/video1

The output will look like this, with different Size values:

ioctl: VIDIOC_ENUM_FMT
    Type: Video Capture

    [0]: 'YU12' (Planar YUV 4:2:0)
        Size: Discrete 1920x960
            Interval: Discrete 0.033s (30.000 fps)
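Once the device enumerates formats like this, a quick way to eyeball the stream is a minimal GStreamer pipeline. This is an untested sketch, assuming the loopback device is /dev/video1 as in the example above:

```shell
# Preview the loopback device; adjust the device path to match your system
gst-launch-1.0 v4l2src device=/dev/video1 ! videoconvert ! autovideosink
```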

This is great advice on tracing the problem. Thanks for taking the time to share your experience with the community. Have a nice day.

Thank you! For now, I do not have the v4l2loopback module loaded into the kernel. I am trying, but I have failed!

I don’t know for sure, but there is a note about Secure Boot-enabled kernels in the v4l2loopback README.

v4l2loopback/README.md at main · umlaeute/v4l2loopback · GitHub

I’m just guessing as I can’t replicate the error you are showing.

Now, v4l2loopback is installed on Ubuntu 18.04 in the form of a DKMS module; see the following picture:


But when I run “v4l2-ctl --list-formats-ext --device /dev/video0”, there is no output!

Is anything wrong with my driver?

What is the device ID of the Z1 you are using? Is it 2715? Show the output of lsusb, specifically the section with Ricoh Co., Ltd RICOH THETA Z1.

Your first post showed that you were using /dev/video1. Do you only have one camera attached to your computer, and is the THETA Z1 at /dev/video0?

Thank you for your help!
I use a THETA V, and my laptop has no built-in camera. So when I plug the THETA V into my computer, it is the only camera!


Since the THETA V is the only camera, I changed the code of gst_viewer.cc:

This is tough as I can’t replicate the problem.

As a test, can you go into the BIOS of the laptop, disable the GTX 2060 discrete GPU, and use the integrated Intel UHD Graphics 630 of the Core i7-10875H?


What’s working

You seem to have the correct device ID in streaming mode.
libuvc-theta-sample/thetauvc.c at f8c3caa32bf996b29c741827bd552be605e3e2e2 · ricohapi/libuvc-theta-sample · GitHub

manifold test system

for future reference

Ubuntu 18.04
CPU: i7-10875H
RAM: 16 GB
GPU: GTX 2060
Confirmed that the laptop does not have an integrated webcam on /dev/video0.


If you’re still stuck

There’s another possible path for you to solve this. However, I have not tested the solution below yet. If you test it, please report back.

A community member has developed a GStreamer plug-in for the THETA.

If OpenCV is built with the GStreamer backend enabled, VideoCapture::open() accepts a GStreamer pipeline description as its argument, like

  VideoCapture cap;
  cap.open("thetauvcsrc ! h264parse ! decodebin ! videoconvert ! appsink");

opencv/cap_gstreamer.cpp at 1f726e81f91746e16f4a6110681658f8709e7dd2 · opencv/opencv · GitHub

you can capture images directly from GStreamer.

Please note that the standard image format in OpenCV is “BGR”, which is not supported by most hardware-assisted colorspace converter plugins; thus, you have to use a software converter.

Since recent OpenCV versions accept I420 or NV12 as input formats for VideoCapture, you can capture without color conversion and convert using OpenCV if necessary.

If you decide to pursue this, it’s possible that GStreamer could be used with the ROS OpenCV camera driver if OpenCV is built with the GStreamer backend enabled.

  • The Makefile in gstthetauvc does not have an “install” target. Please copy the plugin file (gstthetauvc.so) into an appropriate directory. If you copy it into a directory other than the GStreamer plugin directory, you have to add that directory to the GST_PLUGIN_PATH environment variable.

  • The plugin has several properties to specify the resolution and which THETA to use (if multiple THETAs are connected to the system).
    Run “gst-inspect-1.0 thetauvcsrc” for details.

  • For Ubuntu 20 and Jetson users, the OpenCV packages from the Ubuntu official repositories or NVIDIA have the GStreamer backend.

  • For Ubuntu 18 users, official OpenCV packages do not have the GStreamer backend, and the version (3.2) is too old to use with GStreamer. If you want to use it on Ubuntu 18, build the latest OpenCV from source, or AT YOUR OWN RISK, you can use unofficial OpenCV binary packages for Ubuntu 18 like

opencv-4.2 : “cran” team
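Putting the notes above together, a minimal pipeline to display the camera with the plugin would look something like this. I have not tested this myself; it assumes gstthetauvc.so has been copied somewhere GStreamer can find it, and the elements follow the pipeline description quoted earlier:

```shell
# Point GStreamer at the directory holding gstthetauvc.so if it was not
# copied into the default plugin directory (path is a placeholder)
export GST_PLUGIN_PATH=/path/to/plugin/dir

# Decode the THETA's H.264 stream and display it
gst-launch-1.0 thetauvcsrc ! queue ! h264parse ! decodebin ! autovideosink
```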

Update on power supply to Z1 from USB hub.

A community member let me know that his tests showed that a Linux USB3 port may be limited to 400 mA to 450 mA when live streaming with the Z1. This would result in a slow decrease of the Z1 battery if you’re streaming at 4K. This power-draw limitation may be specific to the Z1 and the Linux USB3 port.

The solution is to use a powered hub.

I’m still trying to test this myself.

Thank you craig. I use the THETA V for coloring the point cloud of a Velodyne VLP-16. So, I need to get pictures from the THETA V every few milliseconds and save them in a directory in JPEG or another format.
Can GStreamer do this job?

Hi, I am not that experienced with GStreamer myself. I think other people on this forum have an application similar to yours. I hope that other people can help test the GStreamer plugin for the THETA and post recipes or examples of use.

This pipeline was contributed by @snafu666, using lossless Huffman encoding on an NVIDIA Jetson with the loopback. The input portion would need to be modified.

gst-launch-1.0 v4l2src device=/dev/video99 ! video/x-raw,framerate=30/1 \
! videoconvert \
! videoscale \
! avenc_huffyuv \
! avimux \
! filesink location=raw.hfyu
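As an untested variation on the pipeline above, swapping the encoder and sink should write individual JPEG frames instead of a single AVI. jpegenc and multifilesink are standard GStreamer elements; the device path is carried over from the pipeline above:

```shell
# Write each frame as a numbered JPEG file instead of a raw AVI
gst-launch-1.0 v4l2src device=/dev/video99 ! video/x-raw,framerate=30/1 \
! videoconvert \
! jpegenc \
! multifilesink location=frame%05d.jpg
```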

I think we can use the MTP protocol to get photos from the THETA V. This may be the easiest way to solve the problem.


Thanks for posting about libmtp. I hadn’t seen that project before. I was previously using libptp2 and libgphoto2.

Update on Power Supply for Live Streaming on NVIDIA Jetson and Linux x86

Workarounds for the low-current problem on Linux USB3 ports.

If the UVC device driver (uvcvideo) is loaded as a kernel module and you don’t use any UVC device other than the THETA,
disable the UVC driver by adding it to the driver blacklist.
e.g.

   sudo sh -c "echo blacklist uvcvideo >> /etc/modprobe.d/blacklist.conf"

or
Before connecting the THETA, disable the autosuspend feature of the USB driver by writing -1 to /sys/module/usbcore/parameters/autosuspend.

e.g.

  sudo sh -c "echo -1 > /sys/module/usbcore/parameters/autosuspend"
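To confirm the setting took effect, you can read the parameter back:

```shell
# Should print -1 if autosuspend was disabled successfully
cat /sys/module/usbcore/parameters/autosuspend
```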

A community member reports that an Intel PC and a Jetson Nano work fine.

Hi everyone, I have a critical issue when trying to launch v4l2loopback with two THETA Vs (on my Jetson AGX Xavier).
I built the kernel module and instantiated two video devices. When I launch the first camera everything is fine, but the second one only works about one time in five; it shows “can’t open theta”. I have to unplug and replug several times to make them work in parallel… Any ideas why this happens?

thank you in advance.

This is difficult for most people to test because they need two cameras and have to modify the C code. I haven’t tested this yet. However, I have seen a demo of it working. I don’t have a solution yet.

I noticed that there is a new commit to gstthetauvc


It’s possible that it handles multiple devices.
