Live Streaming over USB on Ubuntu and Linux, NVIDIA Jetson

Stupidity on my part: I realised that gst_loopback was set to /dev/video1 by default. Changing that to /dev/video0 means FFmpeg then works if you set the input device to /dev/video0. Streaming to YouTube is working. If anyone is interested, this is the FFmpeg command:

ffmpeg -f lavfi -i anullsrc \
  -f v4l2 -s 3840x1920 -r 10 -i /dev/video0 \
  -vcodec libx264 -pix_fmt yuv420p -preset ultrafast \
  -strict experimental -r 25 -g 20 -b:v 2500k \
  -codec:a libmp3lame -ar 44100 -b:a 11025 -bufsize 512k \
  -f flv rtmp://

Will need some work on optimisation but gives me what I need.


Wow, this is fantastic! Thank you for posting this. Geez, you really know ffmpeg. Way to go. :slight_smile:

Got it working with 360 navigation. Thanks.

Also, as I don’t normally use ffmpeg, I couldn’t figure out what many of the options mean.

I’m using the simplified command below.

What does -strict experimental -r 25 -g 20 do?

ffmpeg -f lavfi -i anullsrc -f v4l2 -s 1920x960 -r 10 -i /dev/video2 \
  -vcodec libx264 -pix_fmt yuv420p \
  -b:v 2500k \
  -codec:a libmp3lame -ar 44100 -b:a 11025 -bufsize 512k \
  -f flv rtmp://$SECRET_KEY

I’ve cut down the stream resolution to preserve bandwidth for my daughter’s Zoom classes.


It’s to do with the audio encoder. YouTube expects an audio stream. I haven’t looked at audio yet, so in fact those options are redundant here and can be left out.


Thanks again for your help. I’m putting this into the community documentation with attribution to you. I feel many people will want to do this. :slight_smile:

Are you streaming from Linux to YouTube in order to use low-cost Jetson devices? Or, are you choosing Linux because you can more easily control the stream with AI or remote management?

Just wondering as I feel that Linux is very flexible, but curious as to why people are so interested in this platform for streaming.

Ahh, this is just step 1, and the Jetson is just a cheap device to get something working. However, I was intrigued as to how much it could do. I need to use FFmpeg as I have some other built-in encoders that do some special processing on the video, so this is the first stage of the pipeline. YouTube is just a free test environment.

The Jetson does, however, open up the possibility of a simple portable 360 streaming device.


Thanks for the help and the great webinar yesterday. Using the v4l2loopback capability and thetaV loopback example, here are 2 example gstreamer pipelines to grab the video:

As a lossless Huffman-encoded raw file:

gst-launch-1.0 v4l2src device=/dev/video99 ! video/x-raw,framerate=30/1 \
  ! videoconvert \
  ! videoscale \
  ! avenc_huffyuv \
  ! avimux \
  ! filesink location=raw.hfyu

And with default H.264 encoding on a Jetson:

gst-launch-1.0 v4l2src device=/dev/video99 ! video/x-raw,framerate=30/1 \
  ! nvvidconv \
  ! omxh264enc \
  ! h264parse ! matroskamux \
  ! filesink location=vid99.mkv

Pro tip: when you install v4l2loopback, use the video_nr option to create the video device at a high number so it does not get displaced by plug-and-play detection of other cameras.
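To illustrate that tip, here is a minimal sketch, assuming the module is already installed (e.g. via the v4l2loopback-dkms package); the device number 99 matches the pipelines in this thread, and the modprobe.d/modules-load.d paths are the standard Ubuntu locations:

```shell
# Create the loopback node at /dev/video99; plug-and-play cameras claim
# the low /dev/video* numbers, so a high number will not be displaced.
sudo modprobe v4l2loopback video_nr=99

# Confirm the node exists.
ls -l /dev/video99

# Make the setting persistent across reboots.
echo "v4l2loopback" | sudo tee /etc/modules-load.d/v4l2loopback.conf
echo "options v4l2loopback video_nr=99" | sudo tee /etc/modprobe.d/v4l2loopback.conf
```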


Many people may not realize that UDP is lossy, while TCP does retransmissions, which cause additional latency. I haven’t looked at the characteristics of the THETA V stream, but if it is not constant bit rate, there will be network nano-bursts. (I’m talking about bursts at timescales shorter than 1 ms.) These bursts can overflow buffer space, causing packet losses. For TCP connections, the retry/retransmission protocol is a normal part of flow control, but it will increase packet jitter and latency.
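To put rough numbers on those nano-bursts (my own back-of-envelope arithmetic, not measurements of the THETA stream): the danger is the instantaneous line rate, not the average bitrate.

```shell
# Bytes arriving in a single 1 ms burst window at various instantaneous
# line rates. Even a stream that averages 2.5 Mb/s is delivered by the
# NIC at line rate, so a 1 ms burst at 1 Gb/s drops 125 KB into a
# switch buffer at once - enough to overflow shallow buffers.
for rate_mbps in 2.5 25 1000; do
  bytes=$(awk -v r="$rate_mbps" 'BEGIN { printf "%d", r * 1e6 / 8 * 1e-3 }')
  echo "${rate_mbps} Mb/s for 1 ms = ${bytes} bytes"
done
```

The last line prints 125000 bytes for the gigabit case, which is why short bursts can cause loss even when average utilisation looks low.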



I added your great contribution to the documentation.

Tested the first pipeline on x86 and will test on Jetson after I reinstall the OS. I messed up my v4l2 system on the Jetson a little while ago and need to reinstall JetPack.

If anyone else wants to try this, VLC can play the Huffyuv format.

Thanks for posting the pipeline. @Mehdi_Zayene was asking about how to save the stream to file from a Jetson on a YouTube video comment thread.

For the H.264 file, this is the pipeline I’m using on my x86 system.

$ gst-launch-1.0 v4l2src device=/dev/video2 ! video/x-raw,framerate=30/1 ! autovideoconvert ! nvh264enc ! h264parse ! matroskamux ! filesink location=vid_test.mkv

This time, I tested it with gst-launch-1.0 playbin.

gst-launch-1.0 playbin uri=file:///path-to-file/vid_test.mkv
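If playback isn’t convenient, gst-discoverer-1.0 (shipped in the gstreamer1.0-plugins-base-apps package on Ubuntu) can confirm the container and codec of the recording; the file path here is just the example path from above:

```shell
# Print container, video codec, resolution, and duration of the file.
gst-discoverer-1.0 file:///path-to-file/vid_test.mkv
```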

You may also want to check this out. I haven’t done much other than compile it all and run it.

It gives me a doubling of the frame rate on the Jetson. Use this as the guide for FFmpeg compilation on the Jetson,

but use the ffmpeg git repo mentioned in the jetson-ffmpeg GitHub as part of the FFmpeg compilation.


Wow. amazing. Is the version that is installed with “apt” not using hardware acceleration fully?

I guess this module works for both hardware encoding and hardware decoding?


I’m going to add this to the community document. thanks!

The NVIDIA forums seem to indicate they are still deciding whether to optimise FFmpeg, so the standard ffmpeg does not seem to be hardware accelerated.


Thanks. This is very useful to know as the 4K streams are a bit heavy to deal with. There’s some nice satisfaction in pushing that little Nano. The Xavier NX would also be nice, but a bit out of my budget at the moment.

Hi, what command line would you use to save the stream on the Jetson Nano to a video file with the help of this project?

Hello sir, I copy-pasted exactly your command line and got:
"h264parse: command not found"
Any idea?

Actually, even the first line alone gives me the following:
GStreamer-CRITICAL **: 00:03:56.424: gst_element_make_from_uri: assertion ‘gst_uri_is_valid (uri)’ failed
WARNING : erroneous pipeline: no element “video”

NB: I am running it on my Jetson Nano, with gst_loopback running

This was contributed by snaffu666.

You must adjust the /dev/video99 to the device of your THETA.

gst-launch-1.0 v4l2src device=/dev/video99 ! video/x-raw,framerate=30/1 \
  ! nvvidconv \
  ! omxh264enc \
  ! h264parse ! matroskamux \
  ! filesink location=vid99.mkv

Did you install all the plug-ins, including gst-plugins-bad?

Use the command below to identify missing GStreamer plug-ins and which package provides them.

$ gst-inspect-1.0 h264parse
Factory Details:
  Rank                     primary + 1 (257)
  Long-name                H.264 parser
  Klass                    Codec/Parser/Converter/Video
  Description              Parses H.264 streams
  Author                   Mark Nauwelaerts <>

Plugin Details:
  Name                     videoparsersbad
  Description              videoparsers
  Filename                 /usr/lib/aarch64-linux-gnu/gstreamer-1.0/
  Version                  1.14.5
  License                  LGPL
  Source module            gst-plugins-bad
  Source release date      2019-05-29
  Binary package           GStreamer Bad Plugins (Ubuntu)
  Origin URL     

Paste your entire pipeline from gst-launch-1.0 into the forum.

Paste your code snippet of gst_viewer.c from roughly line 186 to line 193. Make sure the snippet includes the code after pipe_proc = in the if statement for gst_loopback.

The plugin probably is not installed on your system. The command gst-inspect-1.0 will tell you what GStreamer elements are installed on your system.

I believe that h264parse is part of gstreamer1.0-plugins-bad.


That’s a great point. If @Mehdi_Zayene doesn’t have the h264parse plug-in installed, then gst-inspect-1.0 h264parse will show nothing.

One way to find out which Ubuntu package contains the plug-in is to first search on Google or the GStreamer site.

Once you know it is in the plugins-bad package, you can use apt-cache search to identify the exact name of the package.

Alternately, you can install all the plug-ins: the good, the bad, and the ugly.
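A sketch of that workflow on Ubuntu; the package names below are the standard Ubuntu GStreamer packages, but it is worth confirming with the search step first:

```shell
# Find which packages ship the "bad" plug-in set (contains h264parse).
apt-cache search gstreamer1.0 | grep -i bad

# Install just the bad plug-ins...
sudo apt install gstreamer1.0-plugins-bad

# ...or install the good, the bad, and the ugly in one go.
sudo apt install gstreamer1.0-plugins-good gstreamer1.0-plugins-bad gstreamer1.0-plugins-ugly
```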
