Thanks again for your help. I'm putting this into the community documentation with attribution to you. I feel many people want to do this.
Are you streaming from Linux to YouTube in order to use low-cost Jetson devices? Or, are you choosing Linux because you can more easily control the stream with AI or remote management?
Just wondering; I know Linux is very flexible, but I'm curious why people are so interested in this platform for streaming.
Ahh, this is just step 1, and the Jetson is just a cheap device to get something working. However, I was intrigued by how much it could do. I need to use FFmpeg as I have some other built-in encoders that do some special processing on the video, so this is the first stage of the pipeline. YouTube is just a free test environment.
The Jetson does, however, open up the possibility of a simple portable 360 streaming device.
Thanks for the help and the great webinar yesterday. Using the v4l2loopback capability and the THETA V loopback example, here are two example GStreamer pipelines to grab the video:
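The original pipelines were not reproduced in this thread; a minimal sketch of two such pipelines, assuming v4l2loopback exposes the camera at /dev/video99 (the device number is an assumption, not from the original post), might look like:

```shell
# Hypothetical examples; adjust device=/dev/video99 to your setup.

# 1. Preview the loopback device locally:
gst-launch-1.0 v4l2src device=/dev/video99 ! videoconvert ! autovideosink

# 2. Encode to H.264 and record to an MP4 file (software x264 shown;
#    a Jetson hardware encoder element could be substituted if available):
gst-launch-1.0 -e v4l2src device=/dev/video99 ! videoconvert \
    ! x264enc tune=zerolatency ! h264parse ! mp4mux \
    ! filesink location=capture.mp4
```

The `-e` flag makes gst-launch send an end-of-stream event on Ctrl-C so the MP4 file is finalized cleanly.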
Pro tip: when you install v4l2loopback, use the video_nr option to create the video device at a high number so it does not get displaced by plug-and-play enumeration of other cameras.
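For example (the number 99 and the label here are arbitrary choices, not from the original post):

```shell
# Load v4l2loopback with a fixed, high device number so PnP cameras
# (which enumerate upward from /dev/video0) cannot displace it.
sudo modprobe v4l2loopback video_nr=99 card_label="THETA-loopback"

# Verify the device node was created:
ls -l /dev/video99
```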
What many people may not realize is that UDP is lossy, while TCP does retransmissions, which add latency. I haven't looked at the characteristics of the THETA V stream, but if it is not constant bit rate, there will be network nano-bursts (I'm talking about bursts at timescales shorter than 1 ms). These bursts can overflow buffer space, causing packet loss. For TCP connections, the retry/retransmission protocol is a normal part of flow control, but it will increase packet jitter and latency.
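As a rough way to see UDP loss and jitter on a given link (a generic measurement, not part of the original discussion; `<server-ip>` is a placeholder for a host running `iperf3 -s`):

```shell
# Push 20 Mbit/s of UDP traffic for 10 seconds and report
# packet loss and jitter at the receiver.
iperf3 -u -c <server-ip> -b 20M -t 10
```

Note that averaged tools like this will not resolve sub-millisecond nano-bursts, but sustained loss at stream bit rates is a strong hint that buffers are overflowing.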
I added your great contribution to the documentation.
Tested the first pipeline on x86 and will test on Jetson after I reinstall the OS. I messed up my v4l2 system on the Jetson a little while ago and need to reinstall JetPack.
If anyone else wants to try this, VLC can play the Huffyuv format.
Thanks. This is very useful to know as the 4K streams are a bit heavy to deal with. There’s some nice satisfaction in pushing that little Nano. The Xavier NX would also be nice, but a bit out of my budget at the moment.
Actually, even the first line alone gives me the following:
GStreamer-CRITICAL **: 00:03:56.424: gst_element_make_from_uri: assertion 'gst_uri_is_valid (uri)' failed
WARNING: erroneous pipeline: no element "video"
NB: I am running it on my Jetson Nano, with gst_loopback running
Paste your entire pipeline from gst-launch-1.0 into the forum.
Paste your code snippet of gst_viewer.c from roughly line 186 to line 193. Make sure the snippet includes the code after pipe_proc = in the if statement for gst_loopback.
You must adjust the C code for gst_viewer.c to match the /dev/video* device of your particular setup. Likely, you need to set it to /dev/video0. Alternately, you can use the tip by snaffu666 and use the video_nr option of v4l2loopback.
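Two generic checks can help confirm which device node to use (these commands are illustrative, not from the original posts):

```shell
# List all video devices with their driver names; the v4l2loopback
# entry shows which /dev/video* node gst_viewer.c should write to.
v4l2-ctl --list-devices

# Inspect the loopback sink line in the sample source; the device=
# argument there must match your v4l2loopback node (e.g. /dev/video0).
grep -n "v4l2sink" gst_viewer.c
```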
Hello again sir,
So I installed the GStreamer plug-ins, instantiated the video device as vid99 as suggested, and adjusted the C code of gst_viewer.c, but I still get the same error.
Here you can find my pipeline, my error, and also the result I get when I inspect h264parse.
Thank you for your help !!!