
Live Streaming over USB on Ubuntu and Linux, NVIDIA Jetson


Any new updates on this post?

I saw a demo of usermode Linux driver streaming from a THETA V several months ago. However, the developer got busy with other things and has not released the driver yet.

As of April 16, 2020, I do not know of a streaming driver for Linux.

Any new updates regarding getting a live stream from the THETA V or Z1 on Ubuntu?

I’ll check with the developer again. Hopefully, he’ll put it out this month or next month.


I tried to get live streaming over USB from a THETA Z1 on Ubuntu 16.04.6 LTS. However, there was a segmentation fault while running your sample code. Do you have any suggestions on how to fix this?

I just tested it today, July 29, 2020. It worked on my system without modifications to the C code or the example. I’m going to make a short video on the build and dependency process for the patched libuvc. I will try to post this video today. There was a long list of dependencies that I needed to install to build libuvc properly.

Go to this link

Put your email in and you will see this page with a new video. It shows the full download from GitHub, compile, run, along with the dependencies. Please post again if you have problems.
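For reference, the build steps I used look roughly like the following. Treat this as a sketch, not the canonical procedure from the video: I’m assuming the patched fork lives at ricohapi/libuvc-theta with a theta_uvc branch, and the dependency package names are from Ubuntu 18.04; adjust both to match what the video shows.

```shell
# Install build dependencies for the patched libuvc
# (package names assumed for Ubuntu 18.04; adjust for your release)
sudo apt install build-essential cmake libusb-1.0-0-dev \
    libgstreamer1.0-dev libgstreamer-plugins-base1.0-dev

# Clone the patched fork and switch to the THETA branch
git clone https://github.com/ricohapi/libuvc-theta.git
cd libuvc-theta
git checkout theta_uvc

# Standard CMake out-of-source build and install
mkdir build && cd build
cmake ..
make
sudo make install
sudo ldconfig   # refresh the linker cache so the new library is found
```

The important step is the branch checkout; building the unpatched master branch compiles fine but fails at runtime with the THETA.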

I’m trying to use libuvc for RICOH THETA, which was just released.

I’ve updated the firmware of my THETA V to the latest version (3.40.1).
I successfully built the library and the sample program on two Ubuntu 18.04.4 LTS environments: a Jetson TX2 and a generic PC.
However, the sample programs don’t work; both environments fail with the error “uvc_open: Not supported (-12)”.
Is there anyone else facing the same issue?
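In case it helps others hitting -12: before blaming the library, it’s worth confirming the camera actually enumerates over USB while in live mode. A rough check (RICOH’s USB vendor ID is 05ca; the exact product string may differ by model and firmware):

```shell
# List USB devices and look for the THETA (RICOH vendor ID is 05ca)
lsusb | grep -iE "05ca|ricoh|theta"

# If nothing shows up, the camera is not enumerating over USB at all;
# make sure it is switched to live mode (LIVE on the camera display)
# and try re-plugging the cable before debugging libuvc itself.
```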

I installed gstreamer-app-1.0 and tried it. When calling uvc_open, it returns -12 (device not supported), but my device is a THETA V and it is in live mode.


How do I get gstreamer-app-1.0 on Ubuntu? I cannot find it.

I believe it is in libgstreamer-plugins-base1.0-dev
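If that’s right, installing that package and checking with pkg-config should confirm it (Ubuntu package name; I haven’t verified this on every release):

```shell
# Install the GStreamer base development package,
# which ships the gstreamer-app-1.0 pkg-config module
sudo apt install libgstreamer-plugins-base1.0-dev

# Verify pkg-config can now find it (prints the installed version)
pkg-config --modversion gstreamer-app-1.0
```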

See the video that is now on the info page of this site. (Put your email in and you will see a different page.)

Please review the section on libuvc compilation with the appropriate dependencies and patch in the video.

Update, July 29, 11:30 AM PDT: I added a small troubleshooting section under the video, including a screenshot of remote git branch usage and the package location of gstreamer-app-1.0.

Thank you, it works now. Also, note that libusb needs to be above version 1.0.9.


v0.0.6-theta-uvc was released 2 hours ago. I am trying to compile it on a Raspberry Pi. I will report back.

Thanks, I have tried it and it works.
By the way, can I get live streaming over USB on Android?

I want to try controlling the THETA V over USB on Android.

Thank you for uploading the video. It works for me now. The issue earlier was that I built the master branch instead of the theta_uvc branch.
Also, I would like to ask if it is possible to make the RICOH camera be detected as a video device. That way, I could use it directly in ROS, OpenCV, or other robotics/computer vision applications.


Hi, I didn’t write the sample code or the patch, but I’ve seen the THETA Z1 accessed from /dev/video0 on Linux. I’m trying to get this to work myself. I’ll share what I know and let’s work together on a solution that we can post here.

I’ve seen the following work on Linux:

  • gst-launch-1.0 from command line accessing /dev/video0
  • Yolov3
  • openpose

Here’s what I’ve seen work but can’t get to work myself.

Note that omxh264dec is specific to NVIDIA Jetson. x86 needs to use vaapih264dec or msdkh264dec (I think; I don’t have it working).

sudo modprobe v4l2loopback
python3 | gst-launch-1.0 fdsrc ! h264parse ! queue ! omxh264dec ! videoconvert ! video/x-raw,format=I420 ! queue ! v4l2sink device=/dev/video0 sync=true -vvv
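On x86, if the hardware decoders above aren’t available, a software-decode fallback with avdec_h264 (from the gstreamer1.0-libav package) might be worth trying in place of omxh264dec. This is a guess on my part, not something I have running; like the pipeline above, it reads the H.264 stream from stdin, so it needs a producer piping into it:

```shell
# Same pipeline shape as the Jetson version, but with a software
# H.264 decoder for x86. Requires the gstreamer1.0-libav package.
gst-launch-1.0 fdsrc ! h264parse ! queue ! avdec_h264 ! videoconvert \
    ! video/x-raw,format=I420 ! queue ! v4l2sink device=/dev/video0 sync=true
```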

./darknet detector demo cfg/ cfg/yolov2.cfg yolov2.weights

in openpose/build/examples/openpose

openpose.bin --camera 0

I’m now working on an x86 Linux. I was able to build the driver and demo on a Raspberry Pi 3, but I was not able to run the demo. It would hang with no error messages. It might be possible with a Raspberry Pi 4.

I am going through the information for Intel Video and Audio for Linux to try and get a better fundamental understanding.

I don’t know what v4l2loopback does, but I’ve built the kernel module and I’ve installed it.
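From what I’ve read since, v4l2loopback creates a virtual V4L2 device that one process writes into and other processes read from. The module takes a few parameters that seem relevant here; the parameter names are from the module’s documentation, and I haven’t confirmed all of them myself:

```shell
# Create exactly one loopback device at /dev/video1 with a friendly name.
# exclusive_caps=1 makes the node advertise capture capability only while
# a producer is attached, which some consumer applications require.
sudo modprobe v4l2loopback devices=1 video_nr=1 \
    card_label="THETA" exclusive_caps=1

# Confirm the device was created (v4l2-ctl is in the v4l-utils package)
v4l2-ctl --list-devices
```

Pinning the device number with video_nr avoids the loopback node moving around depending on what other cameras are plugged in.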

It did create a video device on my system. The first device is my webcam.

$ ll /dev/video*
crw-rw----+ 1 root video 81, 0 Jul 28 18:21 /dev/video0
crw-rw----+ 1 root video 81, 1 Jul 30 10:21 /dev/video1

This section from the v4l2loopback site seems promising, but I don’t have it working at the moment.

the data sent to the v4l2loopback device can then be read by any v4l2-capable application.
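As a quick test of that claim, once something is writing into the loopback node, any generic v4l2 reader should be able to display it. For example (assuming /dev/video1 is the loopback device):

```shell
# Display frames from the loopback device with a generic v4l2 reader
gst-launch-1.0 v4l2src device=/dev/video1 ! videoconvert ! autovideosink
```

If this shows video but an application like OpenCV does not, the problem is likely the pixel format the application requests rather than the loopback itself.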

If you get something working, please post. It looks super promising, but I can’t access it from a /dev/video device at the moment.

Hi, thanks for the reply. I will take a look into it and see if I can make it work.