Live Streaming over USB on Ubuntu and Linux, NVIDIA Jetson

Thanks for your reply.
Yes, the THETA is in live mode.

$ lsusb
Bus 001 Device 031: ID 05ca:2712 Ricoh Co., Ltd 
Bus 001 Device 007: ID 04ca:005a Lite-On Technology Corp. 

$ ./gst_viewer 
Can't open THETA

If the THETA is turned off or not in live mode, the error “THETA not found” appears instead.

I also made the change below, but I got the same error.

v4l2sink device=/dev/video0 qos=false sync=false";
pipe_proc = "nvv4l2decoder ! nv3dsink sync=false";

I was able to install v4l2loopback and run modprobe v4l2loopback to get my theta on /dev/video1
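For anyone else setting this up, the steps that worked for me looked roughly like this (a sketch, not a verified recipe; the package name and the `exclusive_caps=1` option are assumptions based on my setup, so check your distro):

```
# install the loopback module (Debian/Ubuntu package name)
sudo apt install v4l2loopback-dkms

# load it; exclusive_caps=1 helps browsers and some apps see the device
sudo modprobe v4l2loopback exclusive_caps=1

# confirm which /dev/videoN node was created
v4l2-ctl --list-devices
```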


It works now! Cool!

Could it be that I had the wrong libuvc?
git clone https://github.com/libuvc/libuvc --> “Can’t open THETA” error
git clone https://github.com/ricohapi/libuvc-theta.git --> works
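For reference, building the patched library is the usual CMake sequence. This is a sketch and assumes the standard build tools and the usual libusb/JPEG development packages are already installed:

```
git clone https://github.com/ricohapi/libuvc-theta.git
cd libuvc-theta
mkdir build && cd build
cmake ..
make
sudo make install
sudo ldconfig   # refresh the linker cache so gst_viewer picks up the new library
```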


Great news!

Yes, the libuvc at the URL you displayed has a patch for the THETA device information.

There’s a bunch of information at this site if you haven’t already seen it.


Good to point this out explicitly. Thank you for posting this.


Hi, this might be interesting for those who got it to work.

Based on this: HowTo: Viewing 360° Video in Real-Time from THETA V with Oculus Go Browser
and copying code from https://github.com/mganeko/aframe
to: https://github.com/kJh19/Test

I managed to live stream my THETA from my PC to my phone and VR device as a 360° spherical image, and even put it in VR/cardboard mode, using a web browser.

The details are explained in the links; it works similarly to Skype or Zoom.


Sorry, my previous post might have been misleading.

I am still getting an ‘internal data flow error’ when running ./gst_loopback.

Anyone got it to work on Ubuntu 16.04?

Is it possible to reduce the frame rate of the THETA?
All my tests have been with 2K; when I try 4K, the sink of the pipeline has an increasing delay over time.
I think it takes too much time per frame due to the increased resolution. Or is it possible to drop frames if they’re too old?
gst_viewer has minimal delay for both 2K and 4K, however.


I believe the output of the camera is locked at 30fps.

How long do you use the stream before the latency starts increasing? We can try and replicate the test to isolate a way to avoid this.

It would be good to know:

  • beginning latency
  • end latency
  • elapsed time
  • your setup (such as whether you’re using gst_viewer, the loopback, or other software such as OpenCV)

I stream the theta and use:

sudo modprobe v4l2loopback exclusive_caps=1 max_buffers=8 width=3840 height=1920 framerate=30000/1001 and gst_loopback

if I use:

pipe_proc = "vaapidecodebin ! videoconvert ! videoscale ! video/x-raw,format=I420,width=1920,height=960,framerate=30000/1001 ! identity drop-allocation=true ! v4l2sink device=/dev/video2 qos=false sync=false";

and THETA MODE UHD(4K) or FHD(2K).

OR

pipe_proc = "vaapidecodebin ! videoconvert ! videoscale ! video/x-raw,format=I420 ! identity drop-allocation=true ! v4l2sink device=/dev/video2 qos=false sync=false"; 

with THETA MODE FHD(2K)

it comes in with barely any latency (500 ms?) in 2K.

if I use

pipe_proc = "vaapidecodebin ! videoconvert ! videoscale ! video/x-raw,format=I420 ! identity drop-allocation=true ! v4l2sink device=/dev/video2 qos=false sync=false";

OR

pipe_proc = "vaapidecodebin ! videoconvert ! videoscale ! video/x-raw,format=I420,width=3840,height=1920,framerate=30000/1001 ! identity drop-allocation=true ! v4l2sink device=/dev/video2 qos=false sync=false";

and

THETA MODE UHD(4K)

It comes in at 4K and starts with barely any latency; however, the longer it runs, the higher the latency becomes. It did not matter what I opened the v4l2loopback device with (VLC, a web page, OpenCV).
The increase in latency happens once the v4l2loopback device is accessed; it did not matter how long gst_loopback had been running beforehand. The increased latency persists between sessions of accessing the v4l2loopback device and does not reset. It only resets if I stop and rerun gst_loopback.

If I run THETA MODE UHD(4K) with gst_viewer it comes in fine without latency.
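One untested idea for the growing 4K latency: put a leaky queue right after the decoder so stale frames get dropped instead of accumulating. Something like the following sketch, where the queue settings are guesses rather than a verified fix:

```
pipe_proc = "vaapidecodebin ! queue leaky=downstream max-size-buffers=1 ! "
    "videoconvert ! videoscale ! video/x-raw,format=I420 ! "
    "identity drop-allocation=true ! "
    "v4l2sink device=/dev/video2 qos=false sync=false";
```

With leaky=downstream, the queue discards the oldest buffered frame when full, which may keep the loopback reader near real time at the cost of dropped frames.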


Hello Everyone,

Does anyone know the Gstreamer pipeline needed to record the streaming output with a Jetson AGX Xavier?
Just got mine today!

I can run some latency and performance tests comparing the AGX Xavier to the Jetson Nano, since I have both, but the GStreamer pipelines I used to display and record the video do not seem to work on the Jetson AGX Xavier.

gst_viewer.c is working though

Automatic detection does not seem to work on the Xavier; you need to specify the decoder and sink explicitly.

Example:

“decodebin ! autovideosink sync=false” to “nvv4l2decoder ! nv3dsink sync=false”
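For recording, one untested option is to read from the loopback device and use the Jetson hardware encoder. A sketch along these lines, where the element chain, caps, device node, and file name are all assumptions to adjust for your setup:

```
gst-launch-1.0 v4l2src device=/dev/video0 \
    ! video/x-raw,format=I420,width=3840,height=1920 \
    ! nvvidconv ! 'video/x-raw(memory:NVMM)' \
    ! nvv4l2h264enc ! h264parse ! qtmux \
    ! filesink location=theta.mp4 -e
```

The -e flag makes gst-launch send EOS on Ctrl-C so the MP4 file is finalized properly.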


My gst_viewer is working, but with VLC I can only get one frame and then it just sits there. My webcam works fine with everything. I tried recording a video, but it only captures one frame and the video won’t record. I need to get this working ASAP; I have been trying for days and I’ve come so far to get it working up to this point. It seems like there is something wrong with v4l2loopback and the frame rate/resolution. This is a brand new install of Ubuntu 20.04, so all libraries are new.

I replied in the other topic. Confirm that you added qos=false to the pipeline.

It should look something like this:

if (strcmp(cmd_name, "gst_loopback") == 0)
    pipe_proc = "decodebin ! autovideoconvert ! "
        "video/x-raw,format=I420 ! identity drop-allocation=true !"
        "v4l2sink device=/dev/video0 qos=false sync=false";

Also, you may be using software rendering rather than hardware-accelerated rendering. Post info on your GPU setup, and also whether you are using decodebin or nvdec.
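To see which decoders are actually available on your system, gst-inspect-1.0 can help. Which of these elements exist depends on your platform and installed plugins:

```
gst-inspect-1.0 nvdec           # NVIDIA desktop GPU (gst-plugins-bad built with NVDEC)
gst-inspect-1.0 nvv4l2decoder   # Jetson boards
gst-inspect-1.0 vaapidecodebin  # Intel VA-API
```

If the element is installed, gst-inspect-1.0 prints its details; otherwise it reports “No such element or plugin”.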

You can also ask more questions, but just to let you know, there’s a bunch of information in the doc available here: https://theta360.guide/special/linuxstreaming/ and that document is searchable.

Again, no problem if you keep asking questions. Just trying to help you out.

Hey Craig, I need your help getting live streaming working for the THETA S 360 camera on Ubuntu 18.04. I followed the steps to get gst_viewer up and running, and the camera is connected over USB, but I get the error ‘THETA not found’ when I run ./gst_viewer after installing libuvc-theta-sample. Any help would be appreciated. Do you have all the steps to follow for live streaming a THETA camera on Ubuntu 18.04?

The THETA S streams MotionJPEG, which Linux supports without a special driver. You do not need gst_viewer with the THETA S.
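With the THETA S in live mode, it should enumerate as a regular UVC webcam, so something like this is usually enough to view it. The device number is an assumption, and note that the S output is dual-fisheye, not equirectangular:

```
# find the device node, then view the MotionJPEG stream
v4l2-ctl --list-devices
gst-launch-1.0 v4l2src device=/dev/video0 ! image/jpeg ! jpegdec ! videoconvert ! autovideosink
```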

Hi Folks.

I am trying to find a 360° camera suited to the Nitrogen6X board (1 GHz quad-core ARM Cortex-A9 with 1 GB RAM) which we use. It is running Ubuntu 18.04. We wish to overlay measurements from another sensor on a 360° camera feed and display it as an equirectangular stream. Is the THETA S or THETA V better suited to this? (I understand the S offloads some of the image processing to the acquisition computer, which in our case is low spec, and that the V does this on board but might require a newer OS and/or libraries which we may be unable to upgrade to.) I could be mistaken in these understandings, which is why I am asking. Ideally we want to tax the processor on the Nitrogen6X board as little as possible, and we want to crack open the camera feed and overlay our data before displaying it.

Thanks for any input you have.

Get the THETA V.

  • output of the V is equirectangular
  • output of the S is dual-fisheye, which you need to stitch yourself on the ARM board
  • S output is MotionJPEG
  • V output is H.264
  • the V can stream at 4K; the S can stream at 2K

The Linux kernel handles the S out of the box.

To use the V, you need the drivers documented in this thread and on this site.

We have more examples running Ubuntu 20.04. I have not tried it with 18.04. Hopefully, it will work without additional modifications.

If your application benefits from a dual-fisheye feed, then the S might be easier for you to use. It should also be lower cost, as it is an older model.

You can try downloading and compiling the driver on 18.04 before you get the camera. If you can compile gst_loopback and the sample code, then I suggest you buy the V from a place with a return policy and test it soon after you get it.

You won’t be able to run the sample code without the camera, as it looks for the camera’s USB ID when it first runs.