Live Streaming over USB on Ubuntu and Linux, NVIDIA Jetson

I have also tried to get live streaming via OpenCV in Python, and the terminal outputs:
Gstreamer warning: Embedded video playback halted; module v4l2src0 reported: Internal data stream error.

the code:

import cv2

cap = cv2.VideoCapture(
    'v4l2src device=/dev/video0 io-mode=2 '
    '! image/jpeg, width=(int)3840, height=(int)2160 '
    '! nvjpegdec ! video/x-raw, format=I420 ! appsink',
    cv2.CAP_GSTREAMER)

The developer of libuvc-theta left me a tip about the pipeline. See his comment at the end of the gist.

"thetauvcsrc ! h264parse ! nvv4l2decoder ! nvvidconv ! video/x-raw,format=BGRx ! queue ! videoconvert ! video/x-raw,format=BGR ! queue ! appsink"

Hello Folks,
Would you have a GStreamer pipeline to record the THETA V video at 30 fps with a Jetson AGX Xavier? I am using an OpenCV script but the latency is huge…

Examples - RICOH THETA Development on Linux

See the last one with H.265 from Nakamura_Lab as that one may be more similar to your use case.

As other people have pointed out, you may benefit from NVMe storage rather than the microSD card, as microSD I/O may be a bottleneck when storing the frames.

gst-launch-1.0 v4l2src num-buffers=600 \
device=/dev/video0 \
! video/x-raw \
! nvvidconv \
! nvv4l2h265enc \
! h265parse \
! qtmux \
! filesink location=test.mp4 -e

Consider using gstthetauvc

GitHub - nickel110/gstthetauvc: Gstreamer theta uvc plugin

Hello Everyone,
In order to avoid latency while recording with two THETA V cameras, I was wondering: is there a way to record video directly from the THETA V to external storage so I can grab it later? (The internal storage is way too small.)

Thank you @craig, I used the pipeline and it actually works pretty well, apparently without any frame loss; however, it stops recording after exactly 20 seconds every time.
Is it because of the buffer size? Would you have any idea, please?
PS: I am indeed using NVMe storage.

Solved: it works fine with no limit after setting num-buffers=-1 (unlimited). The 20-second cutoff came from num-buffers=600 in the example pipeline, since 600 frames at 30 fps is exactly 20 seconds.


Oh, thanks for pointing that out.


Hi guys, my setup is very similar to several people above: I'm streaming a RICOH THETA V connected to a Jetson Xavier via USB, using the thetauvcsrc from nickel110 combined with a GStreamer pipeline based on this handy example.

I can view the 4K stream in DirectX on my WLAN-connected Windows machine using another simple GStreamer (for Windows) pipeline, and the latency is acceptably small (under 0.3 s, I would guess).

Instead, however, I want to render the stream into a skybox in Unity on a Windows machine, in a similar way to this example, but with an RTP stream, rather than a Theta that is directly connected to the Windows machine via USB. I would obviously like to keep latency to a minimum still.

Does anyone have this final piece of the puzzle? Or have any ideas about how to go about it?
My best ideas so far are:

  1. Use VLC, OBS, or GStreamer to 'loopback' (if that's the right term) the RTP stream to a dummy Windows webcam device (as per v4l2loopback) and then use an identical script to the Unity example above.
  2. Access and decode the H264 encoded RTP stream directly from a C# script in Unity and render that to the skybox texture - is that possible?
  3. Use a different pipeline on the Xavier side to make the Windows/Unity side simpler, or decode the stream on the Windows side using GStreamer and somehow pass that stream to Unity for rendering (a receive-side sketch follows this list).
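For the receive-and-decode half of idea 3, here is a minimal sketch of what I have in mind, assuming the Xavier sends H.264 over RTP/UDP to port 5000 with payload type 96; the port, caps, and the software decoder are guesses that would need to match the actual sending pipeline:

import cv2

# Hypothetical receiver: depayload and decode an H.264 RTP stream arriving on
# UDP port 5000, then hand BGR frames to OpenCV via appsink. The port and caps
# are assumptions; match them to the sender on the Xavier.
pipeline = (
    'udpsrc port=5000 caps="application/x-rtp, media=video, '
    'clock-rate=90000, encoding-name=H264, payload=96" '
    '! rtph264depay ! h264parse ! avdec_h264 '
    '! videoconvert ! video/x-raw,format=BGR ! appsink sync=false'
)

cap = cv2.VideoCapture(pipeline, cv2.CAP_GSTREAMER)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # frame is a plain BGR array here; from this point it could be pushed to
    # a texture, a virtual webcam, or any other consumer.
cap.release()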

If you have any thoughts or ideas then let me know. I will also post an update if I manage to succeed!

I have no experience with this, but can you view the RTP stream on the Windows computer with something like VLC?

GitHub - videolan/vlc-unity: LibVLC plugin for Unity to integrate multimedia playback inside your Unity apps and games

The virtual camera support in Unity may be a little tricky to get working; see the post linked below. A community member sent me information that using the older Unity plug-in for virtual camera support might work.

Can RICOH THETA Z1 be used in Unity as a webcam? - #30 by craig

You may be able to go to the TwinCam Go research project and figure out how they get the stream to display in the headset.

Telepresence Using THETA V and Segway by TwinCam Go

As they are an academic research project, they may have published details on their implementation, possibly some open source code.


Can someone please explain to me what plug-ins are useful for?
This one, for instance: Gstreamer theta uvc plugin
In order to enable live streaming over USB on Linux I followed @craig's video tutorials that use v4l2loopback and thetauvc, and it works well. Did I miss something?

Thank you!

That is a plug-in for GStreamer, not a plug-in for the camera. The GStreamer plug-in allows THETA live streaming without using v4l2loopback.

Some systems don't work with v4l2loopback. In these cases, the GStreamer plug-in (software that plugs into GStreamer) may make it easier to use things like OpenCV with the live video stream.

You may get lower latency with the GStreamer plug-in because the video does not have to go through the loopback device.
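As a rough illustration, both routes end up in OpenCV, but route B never touches the loopback device; the device number is an assumption, and the decoder elements are the Jetson ones quoted earlier in this thread:

import cv2

# Route A: the THETA exposed through v4l2loopback as a kernel video device
# (gst_loopback must already be running; /dev/video0 is an assumption).
cap_loopback = cv2.VideoCapture(0)

# Route B: the gstthetauvc plug-in reads the camera directly over USB, so the
# frames never pass through a loopback device.
cap_direct = cv2.VideoCapture(
    'thetauvcsrc ! h264parse ! nvv4l2decoder ! nvvidconv '
    '! video/x-raw,format=BGRx ! queue ! videoconvert '
    '! video/x-raw,format=BGR ! queue ! appsink',
    cv2.CAP_GSTREAMER)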


Hello again,

I was wondering something:
I am extracting all the frames as JPG images from the MP4 file recorded with GStreamer and deleting some offset frames. Then I want to reconstruct a video starting from a given frame.
When inspecting the video file it says it is 30 fps, but on the manufacturer's website it says 29.7.
I don't know which figure to use when reconstructing the video from the given frames. Also, do you think the frame rate is 100% constant?

Thank you

Or let's say I don't even use GStreamer and record to the cameras' internal memory instead: would the frame rate be constant, and at what rate? Thank you!

It should be constant at 30 fps at 4K resolution, assuming that the storage I/O is not a bottleneck.
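To see what the container itself reports before reconstructing, a quick check like this may help (the file name is a placeholder):

import cv2

cap = cv2.VideoCapture('test.mp4')
fps = cap.get(cv2.CAP_PROP_FPS)                 # rate reported by the container
count = int(cap.get(cv2.CAP_PROP_FRAME_COUNT))  # total frames per the index
print(f'{fps:.3f} fps, {count} frames, about {count / fps:.2f} s')
cap.release()

If the manufacturer's figure is the NTSC-style 29.97 (30000/1001), a container rounding it to 30 is common; using the rate the file itself reports is usually the safer choice when reconstructing.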

Thank you @craig. Let me explain my issue then.
I record two videos simultaneously using GStreamer with two RICOH THETA V cameras (using v4l2loopback).
I programmed a "flash" sign that fires periodically, every [let's say] X tens of seconds, and lasts 33 ms.
Once done, I extract the frames from the two videos (both report 30 fps according to Ubuntu's video properties) in recording order, delete the frames before the flash sign so that the first frame of cam0 and cam1 is the frame containing the flash, and reindex the frames.
However, the second flash does not land at the same index in the two cameras (usually around 20 frames of offset); after aligning on the first flash, the second flash corresponds to different frame indexes in the two recordings.

I am using a Jetson AGX Xavier, two RICOH THETA V cameras, and v4l2loopback, recording with the GStreamer pipeline discussed last in this thread and storing directly to an EVO NVMe disk rated at 7,000 MB/s (7 GB/s) write speed.

Do you have any idea where the issue comes from, please?
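In case it helps to diagnose, here is a sketch of how the flash frames could be located in each recording so the offsets can be compared directly; the brightness metric and threshold are guesses to tune for the actual flash:

import cv2

def flash_frames(video_path, thresh=60.0):
    """Return the indices of frames whose mean gray level exceeds thresh."""
    cap = cv2.VideoCapture(video_path)
    hits, i = [], 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        if gray.mean() > thresh:
            hits.append(i)
        i += 1
    cap.release()
    return hits

# A constant offset between the two lists suggests different start times;
# an offset that grows from flash to flash suggests dropped frames or drift.
print(flash_frames('cam0.mp4'))
print(flash_frames('cam1.mp4'))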

Hi @craig. Everything works as in your tutorial for streaming the THETA Z1, except the very last part with the GStreamer streaming. When I run the command there is a blinking live sign on the camera, but nothing appears on my computer the way the stream appears on your screen. I have installed v4l2loopback, but maybe I'm not using it properly. Also, the tutorial mentions editing line 190 of the gst.c file, but exactly what to change is not stated. Can you please guide me? I am new to both Linux and THETA. Looking forward to your reply.


If you have a single camera on your computer, change this line (around line 190 of gst_viewer.c in the libuvc-theta-sample source) to use /dev/video0:

		"v4l2sink device=/dev/video1 sync=false";

The device index starts at 0, so the first camera on your computer is /dev/video0. You can check which devices exist with v4l2-ctl --list-devices.

Also try running gst_viewer first, without gst_loopback, to confirm the stream appears on your local computer.


@craig thanks for the reply, but it did not work for me. Can you please tell me if there is something I may be doing wrong? I am using VirtualBox with Ubuntu 20.04 while trying this code.
[screenshot]

Only this happens, and there is a "LIVE" sign blinking on my camera.