Live Streaming over USB on Ubuntu and Linux, NVIDIA Jetson

I’ve been using Jetpack 4.6, which corresponds to L4T R32, Revision 6.1.

cat /etc/nv_tegra_release 
# R32 (release), REVISION: 6.1

I recently met with a community member who was running L4T R32, Revision 7.1 and having problems with RICOH THETA X firmware 2.10.1.

To replicate the test with the newer version of Jetpack, I’m first going to take a baseline with Jetpack 4.6. I’ll then update this post with the install of Jetpack 4.6.4 (the newest version for the Nano).

In this example, I’ve renamed ptpcam to theta in order to test the modified and unmodified versions of ptpcam on the same Jetson.

theta --info

THETA Device Info
==================
Model: RICOH THETA X
  manufacturer: Ricoh Company, Ltd.
  serial number: '14010001'
  device version: 2.10.1
  extension ID: 0x00000006
  image formats supported: 0x00000004
  extension version: 0x006e
ls
gst_loopback  gst_viewer.c  Makefile    thetauvc.h
gst_viewer    gst_viewer.o  thetauvc.c  thetauvc.o
craig@jetpack-4:~/Development/libuvc-theta-sample/gst$ ./gst_loopback 
start, hit any key to stop
Opening in BLOCKING MODE 
NvMMLiteOpen : Block : BlockType = 261 
NVMEDIA: Reading vendor.tegra.display-size : status: 6 
NvMMLiteBlockCreate : Block : BlockType = 261 

latency

At 4K, latency is 630ms.

I believe that the top/bottom correction of the live stream can be disabled with this API:

theta-api-specs/theta-web-api-v2.1/options/_top_bottom_correction.md at main · ricohapi/theta-api-specs · GitHub

This would likely reduce latency.
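
As a rough sketch of what setting that option might look like (assuming the camera is reachable in access point mode at 192.168.1.1; the option value name should be verified against the linked spec, so treat “DISAPPLY” as an assumption):

import requests

# Hypothetical sketch: disable in-camera top/bottom correction with the
# Web API camera.setOptions command. Verify the option name and value
# against _top_bottom_correction.md before relying on this.
THETA_URL = "http://192.168.1.1/osc/commands/execute"  # AP-mode default

resp = requests.post(
    THETA_URL,
    json={
        "name": "camera.setOptions",
        "parameters": {"options": {"_topBottomCorrection": "DISAPPLY"}},
    },
    timeout=5,
)
print(resp.status_code, resp.json())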

install jetpack

Using Etcher with this file.


There appears to be another install method using the SDK manager.

SDK Manager | NVIDIA Developer

I decided to use Etcher.

after upgrade

The L4T R32 revision is now 7.1; previously it was 6.1.

cat /etc/nv_tegra_release 
# R32 (release), REVISION: 7.1

/usr/local/lib is already in the loader configuration, so libraries installed there should load.

/etc/ld.so.conf.d$ cat libc.conf 
# libc default configuration
/usr/local/lib

from home directory

mkdir Development
cd Development/
git clone https://github.com/ricohapi/libuvc-theta.git 
cd libuvc-theta/
mkdir build
cd build/
cmake ..
-- Could NOT find JPEG (missing: JPEG_LIBRARY JPEG_INCLUDE_DIR) 
-- Checking for module 'libjpeg'
--   No package 'libjpeg' found
-- Looking for pthread_create in pthreads - not found

sudo apt install libjpeg-dev
sudo apt install doxygen

rerun cmake …

cmake ..
-- libusb-1.0 found using pkgconfig
-- Found JPEG: /usr/lib/aarch64-linux-gnu/libjpeg.so  
-- Found JPEG library using standard module
-- Building libuvc with JPEG support.
-- Configuring done
-- Generating done
-- Build files have been written to: /home/craig/Development/libuvc-theta/build
make
sudo make install
[ 45%] Built target uvc_static
[ 90%] Built target uvc
[100%] Built target example
Install the project...
-- Install configuration: "Release"
-- Installing: /usr/local/lib/libuvc.so.0.0.6
-- Installing: /usr/local/lib/libuvc.so.0
-- Installing: /usr/local/lib/libuvc.so
-- Installing: /usr/local/include/libuvc/libuvc.h
-- Installing: /usr/local/include/libuvc/libuvc_config.h
-- Installing: /usr/local/lib/libuvc.a
-- Up-to-date: /usr/local/include/libuvc/libuvc.h
-- Up-to-date: /usr/local/include/libuvc/libuvc_config.h
-- Installing: /usr/local/lib/cmake/libuvc/libuvcTargets.cmake
-- Installing: /usr/local/lib/cmake/libuvc/libuvcTargets-release.cmake
-- Installing: /usr/local/lib/cmake/libuvc/FindLibUSB.cmake
-- Installing: /usr/local/lib/cmake/libuvc/FindJpegPkg.cmake
-- Installing: /usr/local/lib/cmake/libuvc/libuvcConfigVersion.cmake
-- Installing: /usr/local/lib/pkgconfig/libuvc.pc
-- Installing: /usr/local/lib/cmake/libuvc/libuvcConfig.cmake

use ldconfig

ldconfig -v
/usr/local/lib:
	libuvc.so.0 -> libuvc.so.0.0.6
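
A quick way to confirm that the dynamic linker can actually resolve the new library (this only checks resolution, not the API) is to load it from Python with ctypes:

import ctypes

# Raises OSError if ldconfig did not pick up /usr/local/lib/libuvc.so.0
uvc = ctypes.CDLL("libuvc.so.0")
print("libuvc loaded:", uvc)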

install libuvc-theta-sample

modified version

git clone https://github.com/codetricity/libuvc-theta-sample.git

Overkill: install everything related to GStreamer.

sudo apt-get install libgstreamer1.0-0 gstreamer1.0-plugins-base gstreamer1.0-plugins-good gstreamer1.0-plugins-bad gstreamer1.0-plugins-ugly gstreamer1.0-libav gstreamer1.0-doc gstreamer1.0-tools gstreamer1.0-x gstreamer1.0-alsa gstreamer1.0-gl gstreamer1.0-gtk3 gstreamer1.0-qt5 gstreamer1.0-pulseaudio libgstreamer-plugins-base1.0-dev

modify gst_viewer.c for single camera on nano.

if (strcmp(cmd_name, "gst_loopback") == 0)
        pipe_proc = "decodebin ! autovideoconvert ! "
                "video/x-raw,format=I420 ! identity drop-allocation=true !"
                "v4l2sink device=/dev/video0 qos=false sync=false";

make

cd gst/
make

error

./gst_viewer 
works as expected
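
With the loopback pipeline above, gst_loopback feeds /dev/video0, which any V4L2 client can read. A minimal sanity check from Python (a sketch, assuming /dev/video0 is the v4l2loopback device and gst_loopback is already running):

import cv2

# Open the v4l2loopback device that gst_loopback is writing to
cap = cv2.VideoCapture(0)  # /dev/video0
if not cap.isOpened():
    raise IOError("Cannot open /dev/video0 -- is gst_loopback running?")

ret, frame = cap.read()
if ret:
    # at 4K the THETA streams 3840x1920, so expect roughly (1920, 3840, 3)
    print("frame shape:", frame.shape)
cap.release()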

install gstthetauvc

git clone https://github.com/nickel110/gstthetauvc
Cloning into 'gstthetauvc'...
cd gstthetauvc/thetauvc
make
sudo cp gstthetauvc.so /usr/lib/aarch64-linux-gnu/gstreamer-1.0/

gstthetauvc test

gst-launch-1.0 thetauvcsrc mode=4K ! queue ! h264parse ! nvv4l2decoder ! queue ! nv3dsink sync=false
Setting pipeline to PAUSED ...
Opening in BLOCKING MODE 
Pipeline is PREROLLING ...
NvMMLiteOpen : Block : BlockType = 261 
NVMEDIA: Reading vendor.tegra.display-size : status: 6 
NvMMLiteBlockCreate : Block : BlockType = 261 
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock

opencv test

pip install opencv-python

Found 1 Theta(s), but none available.


I received a question from a community member about the error message above. There is information in the post below and especially in the issue on GitHub.

Live Streaming over USB on Ubuntu and Linux, NVIDIA Jetson - #348 by craig.


The main point is to purge any system copy of libuvc.so, which may have been installed by the libuvc-dev package.

sudo apt purge libuvc-dev

Hello Craig, I am currently facing an issue with getting a THETA Z1 running on a Jetson Nano. I have tried all the approaches mentioned in this forum but am still getting the ‘Can’t open THETA’ bug:

  1. My camera is in live streaming mode and lsusb shows: ID 05ca:2715 Ricoh Co., Ltd (the camera model is not shown, but the ID seems to be correct, since switching it from live to still mode gives a different ID)
  2. I have installed libuvc-theta and the sample following this link: RICOH THETA Development on Linux
  3. My libuvc and v4l2loopback are installed following your YouTube videos: pkg-config shows libuvc 0.0.6; modinfo v4l2loopback shows 0.13.1-190g4824538.
  4. As for GStreamer, I followed the steps from Jaap’s Raspberry Pi setup; the version number is 1.14.5.
  5. I also tested this camera and cable on a NUC (13th gen i5) of mine, but I got choppy and laggy streams. It is worth noting that I cannot get the Intel graphics driver to work on this NUC. The warning when launching gst_viewer says the kernel (5.4 on Ubuntu 20) is too old for the Intel Iris GPU, which I think OpenGL will try to use, and that would eventually affect the speed of gst_viewer. If possible, can anyone using an Ubuntu PC with CPU-only graphics (llvmpipe) help me confirm this?
  6. To finally confirm that the camera is not malfunctioning, I also tested on another, older NUC (8th gen i7) and a newer desktop PC (12th gen i5 with a 3060 Ti); everything works flawlessly following the exact same configuration steps.

I managed to solve the NUC-no-GPU issue: just change ‘glimagesink’ to ‘v4l2sink’. glimagesink displays video via OpenGL, which tries to use a GPU for hardware-accelerated rendering that doesn’t exist on this machine.


Oh, that’s good news. So you have it running with no problems now?

BTW, I did a test with a Ryzen 5700u with no external GPU and the Linux streaming seemed to work fine.

I got a fresh Jetson Nano today and was testing OpenCV. I’m posting my Python script below for a basic test. Note that I cut the resolution down to 2K. I’m also getting an undervoltage throttling error, so I’m going to need to buy a better power supply. However, even with the current throttling issue, I was able to run a basic frame-resizing test with OpenCV using gstthetauvc. The new unit also does not have a fan, so I may be getting thermal throttling as well. I’m making a video of the complete setup from opening the box and will post it here when it’s done.

import cv2

# Open the THETA through the gstthetauvc GStreamer source at 2K
# cap = cv2.VideoCapture(0)
cap = cv2.VideoCapture("thetauvcsrc mode=2K ! decodebin ! autovideoconvert ! video/x-raw,format=BGRx ! queue ! videoconvert ! video/x-raw,format=BGR ! queue ! appsink")

# Check if the camera is opened correctly
if not cap.isOpened():
    raise IOError("Cannot open webcam")

while True:
    ret, frame = cap.read()
    if not ret:
        break
    # Resize to 25% to keep imshow responsive on the Nano
    frame = cv2.resize(frame, None, fx=0.25, fy=0.25, interpolation=cv2.INTER_AREA)
    cv2.imshow('Input', frame)

    # Press ESC to quit
    c = cv2.waitKey(1)
    if c == 27:
        break

cap.release()
cv2.destroyAllWindows()
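
One caveat on the script above: the pipeline string is only treated as a GStreamer pipeline if OpenCV was built with GStreamer support. The OpenCV that ships with JetPack typically is, but a pip wheel may not be. A quick check:

import cv2

# Print the GStreamer line from the OpenCV build information;
# it should read something like "GStreamer: YES (1.14.5)"
for line in cv2.getBuildInformation().splitlines():
    if "GStreamer" in line:
        print(line)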

I just ordered the components below for the Jetson Nano. If people are suffering from super slow framerates, they should make sure that they are using a stable power supply connected to the 5V 4A barrel jack and not the 5V 2A microUSB connector.

I’m getting this error during the stream.


The people on the NVIDIA forum are suggesting a 5.5V 3A or greater power source. I couldn’t find a reliable one, so I ordered the Seeed Studio supply above and will see if it works. People on the forum were attaching scopes to the input power and making custom wires.

If the Seeed Studio power supply doesn’t eliminate the power problem, I’m going to get a bench power supply like the one below to try to supply 5.5V 3A.

link

Yes, since the NUC worked for me, I will stick with it! As for the Nano issue, I can’t solve it. I will try again in the future, I guess. Thanks for the help!

Another thing I want to ask: there is about a 0.8–1s delay when using gst_viewer or gst_loopback. Is there a way to reduce this delay?

I’m making a video on using the NVIDIA Jetson B01 with 4GB of RAM. It’s a fresh install using gstthetauvc. I got unsatisfactory framerates in my last test. However, I was getting current errors constantly on my screen. I’ll post the video here when I’m done as it may help you get gstthetauvc working.

I ran a test using hardware acceleration on a discrete NVIDIA graphics card, which showed a 50% reduction in latency.

There is also information on compiling ffmpeg for the Jetson at the end of that post.

The lowest latency achievable is approximately 250ms. The primary source of latency is the internal camera stitching pipeline. There is nothing we can do as a community to bypass the pipeline.

These things are within our control and may have some impact, especially if the latency is considerably above 250ms:

  • hardware acceleration of video decoding and encoding (see the sketch after this list)
  • recompile either GStreamer or ffmpeg from source with optimizations for your specific hardware
  • recompile OpenCV (if you’re using it) from source with optimizations (if you have a separate GPU or a Jetson, compile with CUDA support)
  • if you’re using a single-board computer, make sure it has a stable power supply; the Jetson Nano microUSB port doesn’t seem to deliver enough power for a good stream
  • make sure the computer is not thermal throttling; you can point a standard household fan at the CPU as a test
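
As a sketch of the first point on a Jetson, here is an OpenCV capture that uses the hardware H.264 decoder (nvv4l2decoder) and drops stale frames at the appsink instead of buffering them. The element names follow the gst-launch test earlier in this thread, but treat the exact caps and mode as assumptions to adjust for your setup:

import cv2

# Low-latency sketch: hardware H.264 decode plus a non-buffering appsink.
# Assumes gstthetauvc is installed and OpenCV was built with GStreamer.
pipeline = (
    "thetauvcsrc mode=2K ! queue ! h264parse ! nvv4l2decoder ! "
    "nvvidconv ! video/x-raw,format=BGRx ! videoconvert ! "
    "video/x-raw,format=BGR ! appsink drop=true sync=false max-buffers=1"
)
cap = cv2.VideoCapture(pipeline, cv2.CAP_GSTREAMER)
if not cap.isOpened():
    raise IOError("Cannot open THETA pipeline")

while True:
    ret, frame = cap.read()
    if not ret:
        break
    cv2.imshow("THETA", frame)
    if cv2.waitKey(1) == 27:  # ESC to quit
        break

cap.release()
cv2.destroyAllWindows()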

I’ve used this video in the past.

It’s going to take a while for my power supply to arrive.

If anyone has a LattePanda Sigma or similar, see if you can give @mummtaznim advice on getting the THETA X to show up on the LattePanda. It’s running an x86 CPU, so it should be no problem, but he’s getting stuck.