Live Streaming over USB on Ubuntu and Linux, NVIDIA Jetson

I don’t know for sure, but there is a note about secure-boot-enabled kernels in the v4l2loopback README.

v4l2loopback/README.md at main · umlaeute/v4l2loopback · GitHub

I’m just guessing as I can’t replicate the error you are showing.

Now, v4l2loopback is installed on Ubuntu 18.04 as a DKMS module; see the following picture:


But when I run “v4l2-ctl --list-formats-ext --device /dev/video0”, there is no output!

Is anything wrong with my driver?

What is the device ID of the Z1 you are using? Is it 2715? Show the output of lsusb, the section with “Ricoh Co., Ltd RICOH THETA Z1”.

Your first post showed that you were using /dev/video1. Do you only have one camera attached to your computer, and is the THETA Z1 at /dev/video0?

Thank you for your help!
I use the THETA V, and my laptop has no built-in camera. So when I plug the THETA V into my computer, it is the only camera!


Since the THETA V is the only camera, I changed the code in gst_viewer.cc:

This is tough as I can’t replicate the problem.

As a test, can you go into the BIOS of the laptop, disable the GTX 2060 discrete GPU, and use the integrated Intel UHD Graphics 630 on the Core i7-10875H?


What’s working

You seem to have the correct device ID in streaming mode.
libuvc-theta-sample/thetauvc.c at f8c3caa32bf996b29c741827bd552be605e3e2e2 · ricohapi/libuvc-theta-sample · GitHub

manifold test system

for future reference

Ubuntu 18.04
CPU: i7-10875H
RAM: 16 GB
GPU: GTX 2060
Confirmed that the laptop does not have an integrated webcam on /dev/video0.


If you’re still stuck

There’s another possible path for you to solve this. However, I have not tested the solution below yet. If you test it, please report back.

A community member has developed a GStreamer plug-in for the THETA.

If OpenCV is built with the GStreamer backend enabled, VideoCapture::open() accepts a GStreamer pipeline description as its argument, like:

  VideoCapture cap;
  cap.open("thetauvcsrc ! h264parse ! decodebin ! videoconvert ! appsink");

opencv/cap_gstreamer.cpp at 1f726e81f91746e16f4a6110681658f8709e7dd2 · opencv/opencv · GitHub

you can capture images directly from GStreamer.

Please note that the standard image format in OpenCV is “BGR”, which is not supported by most hardware-assisted colorspace converter plugins; thus, you have to use a software converter.

Since recent OpenCV versions accept I420 or NV12 as the input format for VideoCapture, you can capture without color conversion and convert with OpenCV if necessary.
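Before embedding the pipeline in OpenCV, you can sanity-check it from the command line. This is an untested sketch; it assumes the gstthetauvc plugin is installed on GST_PLUGIN_PATH and the camera is in streaming mode:

```shell
# Decode the THETA's H.264 stream and display it in a window.
gst-launch-1.0 thetauvcsrc ! h264parse ! decodebin ! videoconvert ! autovideosink
```

If this shows live video, the same pipeline string (with appsink instead of autovideosink) should work inside VideoCapture::open().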

If you decide to pursue this, it’s possible that gstreamer could be used with the ROS OpenCV camera driver if OpenCV is built with the gstreamer backend enabled.

  • The Makefile in gstthetauvc does not have an “install” target. Please copy the plugin file (gstthetauvc.so) into an appropriate directory. If you copy it into a directory other than the GStreamer plugin directory, you have to add that directory to the GST_PLUGIN_PATH environment variable.

  • The plugin has several properties to specify the resolution and which THETA to use (if multiple THETAs are connected to the system).
    Run “gst-inspect-1.0 thetauvcsrc” for details.

  • For Ubuntu 20 and Jetson users, the OpenCV packages from Ubuntu or NVIDIA have the GStreamer backend.

  • For Ubuntu 18 users, the official OpenCV packages do not have the GStreamer backend, and the version (3.2) is too old to use with GStreamer. If you want to use it on Ubuntu 18, build the latest OpenCV from source, or, AT YOUR OWN RISK, you can use unofficial OpenCV binary packages for Ubuntu 18 like

opencv-4.2 : “cran” team
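The plugin-path step above can be sketched like this (the directory name is just an illustration; any directory works as long as it is on GST_PLUGIN_PATH):

```shell
# Put gstthetauvc.so somewhere GStreamer can find it.
# (Directory name is illustrative; adjust paths to your build.)
mkdir -p "$HOME/gst-plugins"
# cp gstthetauvc.so "$HOME/gst-plugins/"   # run from your gstthetauvc build dir
export GST_PLUGIN_PATH="$HOME/gst-plugins"
# gst-inspect-1.0 thetauvcsrc              # verify the plugin is now found
```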

Update on power supply to the Z1 from a USB hub.

A community member let me know that his tests showed a Linux USB 3 port may be limited to 400 mA to 450 mA when live streaming with the Z1. This results in a slow drain of the Z1 battery if you’re streaming at 4K. This power-draw limitation may be specific to the Z1 on Linux USB 3 ports.

The solution is to use a powered hub.

I’m still trying to test this myself.

Thank you Craig. I use the THETA V for coloring the point cloud from a Velodyne VLP-16. So I need to get pictures from the THETA V every few milliseconds and save them to a directory as JPEG or another format.
Can GStreamer do this job?

Hi, I am not that experienced with GStreamer myself. I think other people on this forum have a similar application to yours. I hope that other people can help test the GStreamer plugin for the THETA and post recipes or examples of use.
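That said, a GStreamer pipeline along these lines should be able to save periodic JPEG frames from a V4L2 device. This is an untested sketch; it assumes the THETA is exposed on /dev/video0 via v4l2loopback, and 10 fps is an arbitrary example rate:

```shell
# Grab frames, throttle to 10 fps, JPEG-encode, and write one file per frame.
gst-launch-1.0 v4l2src device=/dev/video0 \
  ! videoconvert \
  ! videorate ! video/x-raw,framerate=10/1 \
  ! jpegenc \
  ! multifilesink location="frame-%05d.jpg"
```

multifilesink numbers the output files from the %05d pattern, so each captured frame lands in its own JPEG.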

This pipeline was contributed by @snafu666, using lossless Huffman encoding on an NVIDIA Jetson with the loopback. The input portion would need to be modified.

gst-launch-1.0 v4l2src device=/dev/video99 ! video/x-raw,framerate=30/1 \
! videoconvert \
! videoscale \
! avenc_huffyuv \
! avimux \
! filesink location=raw.hfyu
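One caveat with this pipeline: stopping it with Ctrl+C can leave the AVI unfinished, because avimux never gets a chance to write its index. Passing -e to gst-launch-1.0 sends EOS on interrupt so the muxer can finalize the file (same pipeline, untested variation):

```shell
# -e: send EOS downstream on Ctrl+C so avimux can finalize the AVI.
gst-launch-1.0 -e v4l2src device=/dev/video99 ! video/x-raw,framerate=30/1 \
  ! videoconvert \
  ! videoscale \
  ! avenc_huffyuv \
  ! avimux \
  ! filesink location=raw.hfyu
```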

I think we can use the MTP protocol to get photos from the THETA V. This may be the easiest way to solve the problem.


Thanks for posting about libmtp. I hadn’t seen that project before. I was previously using libptp2 and libgphoto2.

Update on Power Supply for Live Streaming on NVIDIA Jetson and Linux x86

Workarounds for the low-current problem on Linux USB 3 ports.

If the UVC device driver (uvcvideo) is loaded as a kernel module and you don’t use any UVC device other than the THETA,
disable the UVC driver by adding it to the driver blacklist.
e.g.

   sudo sh -c "echo blacklist uvcvideo >> /etc/modprobe.d/blacklist.conf"
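Note that blacklisting only prevents the module from loading at boot; if uvcvideo is already loaded, you may also want to unload it for the current session (assumes no other UVC camera is in use):

```shell
sudo modprobe -r uvcvideo   # unload the driver now; the blacklist entry keeps it out after reboot
```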

or
Before connecting the THETA, disable the autosuspend feature of the USB driver by writing -1 to

/sys/module/usbcore/parameters/autosuspend

e.g.

  sudo sh -c "echo -1 > /sys/module/usbcore/parameters/autosuspend"
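To confirm the change took effect, read the parameter back (simple check):

```shell
cat /sys/module/usbcore/parameters/autosuspend   # should print -1 after the change above
```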

A community member reports that an Intel PC and a Jetson Nano work fine.

Hi everyone, I have a critical issue when trying to launch v4l2loopback with two THETA Vs (on my Jetson AGX Xavier).
I built the kernel and instantiated two video devices. When I launch the first camera, everything is fine, but the second one only works one time out of five; it shows “can’t open theta”. I have to unplug and replug several times to make them work in parallel. Any ideas why this happens?

thank you in advance.

This is difficult for most people to test because they need two cameras and have to modify the C code. I haven’t tested this yet. However, I have seen a demo of it working. I don’t have a solution yet.

I noticed that there is a new commit to gstthetauvc


It’s possible that it handles multiple devices.


Hi Craig,
I have a new THETA Z1 and I am trying to stream video from it to my laptop using Python OpenCV, and to integrate it with ROS later. So far I am able to stream live video using gst_viewer. However, this doesn’t meet our requirements. Could you please let us know if it is possible to stream directly without using additional hardware?

I have followed these instructions: ros_theta_z and ros_theta_s.

$ v4l2-ctl --list-devices
Dummy video device (0x0000) (platform:v4l2loopback-000):
    /dev/video1

BisonCam, NB Pro: BisonCam, NB (usb-0000:00:14.0-6):
    /dev/video0

As I am not able to see the THETA Z1 in my device list, could you please help me get it to show up in the video device list? Thanks in advance!

What version of Ubuntu/ROS are you using?

It’s possible to stream the Z1 to Linux x86 and NVIDIA Jetson without additional hardware.

$ v4l2-ctl --list-formats-ext --device  /dev/video1
ioctl: VIDIOC_ENUM_FMT
    Type: Video Capture

    [0]: 'YU12' (Planar YUV 4:2:0)
        Size: Discrete 1920x960
            Interval: Discrete 0.033s (30.000 fps)

NOTE: your resolution will be 4K. I have mine set to 2K for testing.

Did you go to the site below, enter an email and review the documentation on v4l2loopback?

RICOH THETA360.guide Independent developer community

If you have problems, please post again.
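For reference, the THETA does not show up under v4l2-ctl --list-devices on its own; it only appears while gst_loopback from libuvc-theta-sample is feeding the v4l2loopback dummy device. An untested sketch of the sequence (the device number is an assumption):

```shell
# 1. Create the loopback device:
sudo modprobe v4l2loopback video_nr=1
# 2. In another terminal, start feeding it from the THETA
#    (gst_loopback is built in the libuvc-theta-sample repo):
#    ./gst_loopback
# 3. While gst_loopback is running, the dummy device reports the stream:
v4l2-ctl --list-formats-ext --device /dev/video1
```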

I tried this and changed video99 to video0. The file raw.hfyu grows very slowly, about 1.2 MB every 10 seconds. And after I use Ctrl+C to end gst-launch-1.0, I cannot open raw.hfyu.

BTW, I’m using an Intel NUC mini PC (x86, no NVIDIA GPU), as it is the only device on which I can use gst_loopback and gst_viewer. gst_viewer runs smoothly, but gst_loopback runs at less than 0.1 fps (10 s per frame).

So frustrating with the THETA Z1; I might use the THETA S instead.

Did the file not open with VLC on Linux?

In the video below, I completely purged the NVIDIA drivers from my system, which has an NVIDIA graphics card, and covered the entire process using X11 with nouveau. As a further test, I used Wayland rather than X11. You don’t need to use Wayland, but the build process for libuvc-theta and libuvc-theta-sample may help. I couldn’t use gst_loopback with Wayland in my first test, so I don’t recommend Wayland at the moment.

Dear Craig

I gave up on the THETA Z1 with my NUC. Now, using my old THETA S, I can successfully output images and video using ROS + OpenCV. But there is another problem.

The video is dual-fisheye. It cannot be converted to equirectangular by the RICOH THETA basic app; it gets stuck at 100% no matter how small the video is.

The images and videos have no metadata. The images can be converted, but the videos cannot be opened.
Do you have any idea how I can transform these dual-fisheye videos into equirectangular?