Ricoh Theta V: Livestreaming with Jetson Xavier, ROS, OpenCV, NUC

Hello everyone :v:,

I am trying to set up the Ricoh Theta V on the Jetson Xavier over a USB connection so that I can use it with ROS and OpenCV.

  • ROS Melodic
  • Ubuntu 18.04
  • Jetson Xavier
  • OpenCV 4.5.3

So far:

  • ptpcam - is working (quick check shown after this list)
  • libuvc-theta - is working
  • libuvc-theta-sample - I have a problem with gst_loopback, but gst_viewer is working
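
A quick way I confirm the camera is reachable over USB (assuming ptpcam from libptp2 is on the PATH; this is just a sanity check, not part of the streaming setup):

$ ptpcam --info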

:computer: gst_loopback


So far, gst_viewer is working great, but when I run gst_loopback, I am getting the following error:

start, hit any key to stop
Error: Internal data stream error.
stop

In gst_viewer.c, I edited the pipeline as shown below, but the error is still there.

    if (strcmp(cmd_name, "gst_loopback") == 0)
        // original pipeline
        // pipe_proc = "decodebin ! autovideoconvert ! "
        //     "video/x-raw,format=I420 ! identity drop-allocation=true !"
        //     "v4l2sink device=/dev/video2 qos=false sync=false";

        // modified pipeline below
        pipe_proc = "nvdec ! gldownload ! videoconvert n-thread=0 ! "
            "video/x-raw,format=I420 ! identity drop-allocation=true !"
            "v4l2sink device=/dev/video2 qos=false sync=false";
    else
        //pipe_proc = " decodebin ! autovideosink sync=false qos=false";
        pipe_proc = "nvv4l2decoder ! nv3dsink sync=false";
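
To isolate the v4l2sink half of this pipeline from the decoder, a test pattern can be pushed into the same sink chain from the command line (my own debugging sketch; /dev/video2 is simply the loopback node assumed in the pipeline above):

$ gst-launch-1.0 videotestsrc ! video/x-raw,format=I420,width=1920,height=960 ! identity drop-allocation=true ! v4l2sink device=/dev/video2 qos=false sync=false

If this plays into the loopback device without errors, the remaining problem is on the decoder side.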

:robot: ROS and OpenCV


To use the Ricoh Theta V with ROS, I installed ROS Melodic and OpenCV 4.5.3 (currently the latest version). Additionally, I tried the following packages and plugins:

  • gscam
  • gstreamer
  • cv_camera - didn’t work for me
  • libuvc_camera - didn’t work for me
  • video_stream_opencv - I will test it once I get gst_loopback working; it works great on the NUC, so it looks promising for the Jetson Xavier
  • ros_deep_learning - still in progress

:video_camera: Ricoh Theta V


The Ricoh Theta V uses UVC 1.5 and streams H.264.
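
To confirm the camera at least enumerates on the USB bus before digging into GStreamer, a generic check (nothing THETA-specific about it):

$ lsusb | grep -i ricoh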

:spiral_notepad: Notes


I will keep trying to make it work and will keep posting. If someone knows how to fix this or has experience with the Ricoh Theta V, Jetson Xavier, and ROS, please feel free to share your knowledge. We would be happy to hear your input.


Do you have more than one webcam attached to the Jetson? If it is the only webcam, the device is likely /dev/video0, not /dev/video2 (unless you have two other webcams attached to the Jetson), so use:

   "v4l2sink device=/dev/video0 qos=false sync=false";

show the output of

lsmod |grep v4l2loopback

Is the v4l2loopback module loaded into the kernel?

What does

ls -l /dev/video*

look like?

:computer: ./gst_loopback


Hello Craig, sorry for my late reply.

Here is how the code looks:

    if (strcmp(cmd_name, "gst_loopback") == 0)
        //pipe_proc = "decodebin ! autovideoconvert ! "
        //    "video/x-raw,format=I420 ! identity drop-allocation=true !"
        //    "v4l2sink device=/dev/video7 qos=false sync=false";

        //modified pipeline below
        pipe_proc = "nvdec ! gldownload ! videoconvert n-thread=0 ! "
            "video/x-raw,format=I420 ! identity drop-allocation=true !"
            "v4l2sink device=/dev/video7 qos=false sync=false";
    else
        //pipe_proc = " decodebin ! autovideosink sync=false qos=false";
        pipe_proc = " nvv4l2decoder ! nv3dsink sync=false";

Yes, I have more than one camera connected. The Ricoh Theta V is on /dev/video7:

Dummy video device (0x0000) (platform:v4l2loopback-000):
	/dev/video7

The output from lsmod | grep v4l2loopback is:

v4l2loopback 42895 0

The v4l2loopback module was compiled and installed as follows:

$ git clone https://github.com/umlaeute/v4l2loopback.git
$ cd v4l2loopback
$ make 
$ sudo make install
$ sudo depmod -a
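
For reference, the module can also be loaded with an explicit device number and label (optional v4l2loopback parameters; I did not need them here because the node was assigned automatically):

$ sudo modprobe v4l2loopback video_nr=7 card_label=theta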

ls -l /dev/video* displays:

/dev/video0  /dev/video2  /dev/video4  /dev/video6
/dev/video1  /dev/video3  /dev/video5  /dev/video7

When I run the ./gst_viewer command, everything goes smoothly:

./gst_viewer

Opening in BLOCKING MODE
Opening in BLOCKING MODE 
start, hit any key to stop
NvMMLiteOpen : Block : BlockType = 261 
NVMEDIA: Reading vendor.tegra.display-size : status: 6 
NvMMLiteBlockCreate : Block : BlockType = 261

But when I use ./gst_loopback, I am getting the following error:

start, hit any key to stop
Error: Internal data stream error.
stop
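
To get more detail, I re-ran the loopback binary with GStreamer debug output enabled (the *:3 level is just the verbosity I picked):

$ GST_DEBUG="*:3" ./gst_loopback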

Moderator note.

Adding system config again for convenience.

  • ROS Melodic
  • Ubuntu 18.04
  • Jetson Xavier
  • OpenCV 4.5.3

The full output with GST_DEBUG="*:3" for the ./gst_loopback command is:

./gst_loopback
0:00:00.066441784 18263   0x55bc87d380 WARN     GST_ELEMENT_FACTORY gstelementfactory.c:456:gst_element_factory_make: no such element factory "nvdec"!
0:00:00.066559162 18263   0x55bc87d380 ERROR           GST_PIPELINE grammar.y:816:priv_gst_parse_yyparse: no element "nvdec"
0:00:00.066626140 18263   0x55bc87d380 ERROR           GST_PIPELINE grammar.y:901:priv_gst_parse_yyparse: link has no sink [source=@0x55bc88e370]
0:00:00.075224500 18263   0x55bc87d380 ERROR           GST_PIPELINE grammar.y:901:priv_gst_parse_yyparse: link has no source [sink=@0x55bc8b00f0]
0:00:00.112558760 18263   0x55bc87d380 ERROR           GST_PIPELINE grammar.y:414:gst_parse_element_set: no property "n-thread" in element "videoconvert0"
0:00:00.134856114 18263   0x55bc87d380 ERROR           GST_PIPELINE grammar.y:740:gst_parse_perform_link: could not link glimagesinkbin0 to videoconvert0
0:00:00.327292602 18263   0x55bc8c72d0 FIXME                default gstutils.c:3981:gst_pad_create_stream_id_internal:<ap:src> Creating random stream-id, consider implementing a deterministic way of creating a stream-id
start, hit any key to stop
0:00:01.308465964 18263   0x55bc8c72d0 WARN                 basesrc gstbasesrc.c:3055:gst_base_src_loop:<ap> error: Internal data stream error.
0:00:01.308537165 18263   0x55bc8c72d0 WARN                 basesrc gstbasesrc.c:3055:gst_base_src_loop:<ap> error: streaming stopped, reason not-linked (-1)
0:00:01.308832851 18263   0x55bc8c72d0 WARN                   queue gstqueue.c:988:gst_queue_handle_sink_event:<queue0> error: Internal data stream error.
0:00:01.308904148 18263   0x55bc8c72d0 WARN                   queue gstqueue.c:988:gst_queue_handle_sink_event:<queue0> error: streaming stopped, reason not-linked (-1)
Error: Internal data stream error.
stop

What method did you use to install the nvdec plugin on Ubuntu 18.04? gst_loopback may not be able to find the nvdec element. Maybe try the omx ones?

https://docs.nvidia.com/jetson/l4t/index.html#page/Tegra%20Linux%20Driver%20Package%20Development%20Guide/accelerated_gstreamer.html#wwpID0E0R40HA

Yes, in fact:

$ gst-inspect-1.0 nvdec
No such element or plugin 'nvdec'

I tried to install it using this method: Optimization - RICOH THETA Development on Linux, but I didn't succeed.

GStreamer version:

$ gst-inspect-1.0 --version
gst-inspect-1.0 version 1.14.5
GStreamer 1.14.5
https://launchpad.net/distros/ubuntu/+source/gstreamer1.0

When running $ gst-inspect-1.0 | grep nvdec, nothing is returned.
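
Listing which NVIDIA decoder elements this GStreamer install does provide (a generic check, in case the hardware decoder is available under a different name):

$ gst-inspect-1.0 | grep -Ei 'omxh264dec|nvv4l2decoder'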

Building in gst-plugins-bad/sys/nvenc also fails:

gst-plugins-bad/sys/nvenc$ make  CCLD     libgstnvenc.la
/usr/bin/ld: cannot find -lnvidia-encode
collect2: error: ld returned 1 exit status
Makefile:832: recipe for target 'libgstnvenc.la' failed
make: *** [libgstnvenc.la] Error 1

It seems the NVIDIA Video Codec SDK is only supported on desktop GPUs, not on Tegra-based platforms.

Did you try using omxh264dec?

from the NVIDIA Jetson Linux Driver Package Software Features, Aug 3, 2021

The link you provided was only for x86. I should make that clear.
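
As a quick smoke test of the hardware decode path itself, independent of the camera (just a sketch; sample.h264 stands in for any raw H.264 elementary stream you have on disk), you could try:

$ gst-launch-1.0 filesrc location=sample.h264 ! h264parse ! omxh264dec ! nv3dsink sync=false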

Thank you. I changed and recompiled:

pipe_proc = " nvv4l2decoder ! nv3dsink sync=false";

to

pipe_proc = " omxh264dec ! nv3dsink sync=false";

But I am still getting the same error:

$ ./gst_loopback
0:00:00.054209581  1430   0x55adfc8f90 WARN     GST_ELEMENT_FACTORY gstelementfactory.c:456:gst_element_factory_make: no such element factory "nvdec"!
0:00:00.054324462  1430   0x55adfc8f90 ERROR           GST_PIPELINE grammar.y:816:priv_gst_parse_yyparse: no element "nvdec"
0:00:00.054364143  1430   0x55adfc8f90 ERROR           GST_PIPELINE grammar.y:901:priv_gst_parse_yyparse: link has no sink [source=@0x55adfd2370]
0:00:00.076295255  1430   0x55adfc8f90 ERROR           GST_PIPELINE grammar.y:901:priv_gst_parse_yyparse: link has no source [sink=@0x55adffe0f0]
0:00:00.113088274  1430   0x55adfc8f90 ERROR           GST_PIPELINE grammar.y:414:gst_parse_element_set: no property "n-thread" in element "videoconvert0"
0:00:00.208847348  1430   0x55adfc8f90 ERROR           GST_PIPELINE grammar.y:740:gst_parse_perform_link: could not link glimagesinkbin0 to videoconvert0
0:00:00.425091810  1430   0x55ae013e30 FIXME                default gstutils.c:3981:gst_pad_create_stream_id_internal:<ap:src> Creating random stream-id, consider implementing a deterministic way of creating a stream-id
start, hit any key to stop
0:00:01.276000946  1430   0x55ae013e30 WARN                 basesrc gstbasesrc.c:3055:gst_base_src_loop:<ap> error: Internal data stream error.
0:00:01.276069971  1430   0x55ae013e30 WARN                 basesrc gstbasesrc.c:3055:gst_base_src_loop:<ap> error: streaming stopped, reason not-linked (-1)
Error: Internal data stream error.
stop
0:00:01.276574521  1430   0x55ae013e30 WARN                   queue gstqueue.c:988:gst_queue_handle_sink_event:<queue0> error: Internal data stream error.
0:00:01.276611705  1430   0x55ae013e30 WARN                   queue gstqueue.c:988:gst_queue_handle_sink_event:<queue0> error: streaming stopped, reason not-linked (-1)

Just to confirm, you changed the decoder in the "gst_loopback" section, right?

Did it fail when you used nvv4l2decoder in the section where you check if the cmd_name == gst_loopback?

You are right, thank you. I had modified the else block, changing:

pipe_proc = " nvv4l2decoder ! nv3dsink sync=false";

to

pipe_proc = " omxh264dec ! nv3dsink sync=false";

Now, with the right modification in the if block, I am able to get a still image when running ./gst_loopback. Here is the modified piece of code (I also changed the else block back to nvv4l2decoder):

    if (strcmp(cmd_name, "gst_loopback") == 0)
        //pipe_proc = "decodebin ! autovideoconvert ! "
        //    "video/x-raw,format=I420 ! identity drop-allocation=true !"
        //    "v4l2sink device=/dev/video7 qos=false sync=false";

        //modified pipeline below
        pipe_proc = "omxh264dec ! gldownload ! glimagesink ! videoconvert n-thread=0 ! "
            "video/x-raw,format=I420 ! identity drop-allocation=true !"
            "v4l2sink device=/dev/video7 qos=false sync=false";
    else
        //pipe_proc = " decodebin ! autovideosink sync=false qos=false";
        pipe_proc = " nvv4l2decoder ! nv3dsink sync=false";

and here is the current output of ./gst_loopback:

$ ./gst_loopback
0:00:00.081552632  1953   0x557f2e7f90 WARN                     omx gstomx.c:2826:plugin_init: Failed to load configuration file: Valid key file could not be found in search dirs (searched in: /home/spacer/.config:/etc/xdg/xdg-unity:/etc/xdg as per GST_OMX_CONFIG_DIR environment variable, the xdg user config directory (or XDG_CONFIG_HOME) and the system config directory (or XDG_CONFIG_DIRS)
0:00:00.118114731  1953   0x557f2e7f90 ERROR           GST_PIPELINE grammar.y:414:gst_parse_element_set: no property "n-thread" in element "videoconvert0"
0:00:00.208542428  1953   0x557f2e7f90 ERROR           GST_PIPELINE grammar.y:740:gst_parse_perform_link: could not link glimagesinkbin0 to videoconvert0
0:00:00.382915973  1953   0x557f387850 FIXME                default gstutils.c:3981:gst_pad_create_stream_id_internal:<ap:src> Creating random stream-id, consider implementing a deterministic way of creating a stream-id
0:00:00.383258570  1953   0x557f387a80 FIXME           videodecoder gstvideodecoder.c:933:gst_video_decoder_drain_out:<omxh264dec-omxh264dec0> Sub-class should implement drain()
start, hit any key to stop

(gst_loopback:1953): GStreamer-CRITICAL **: 17:19:06.470: gst_caps_is_empty: assertion 'GST_IS_CAPS (caps)' failed

(gst_loopback:1953): GStreamer-CRITICAL **: 17:19:06.470: gst_caps_truncate: assertion 'GST_IS_CAPS (caps)' failed

(gst_loopback:1953): GStreamer-CRITICAL **: 17:19:06.470: gst_caps_fixate: assertion 'GST_IS_CAPS (caps)' failed

(gst_loopback:1953): GStreamer-CRITICAL **: 17:19:06.470: gst_caps_get_structure: assertion 'GST_IS_CAPS (caps)' failed

(gst_loopback:1953): GStreamer-CRITICAL **: 17:19:06.470: gst_structure_get_string: assertion 'structure != NULL' failed
0:00:01.202378507  1953   0x557f387a80 ERROR            omxvideodec gstomxvideodec.c:3274:gst_omx_video_dec_negotiate:<omxh264dec-omxh264dec0> Invalid caps: (NULL)

(gst_loopback:1953): GStreamer-CRITICAL **: 17:19:06.470: gst_:01.211908493  1953   0x557f387a80 FIXME           videodecoder gstvideodecoder.c:933:gst_video_decoder_drain_out:<omxh264dec-omxh264dec0> Sub-class should implement drain()
Allocating new output: 1920x960 (x 8), ThumbnailMode = 0
OPENMAX: HandleNewStreamFormat: 3605: Send OMX_EventPortSettingsChanged: nFrameWidth = 1920, nFrameHeight = 960 
0:00:01.226509588  1953   0x7f84004630 WARN                GST_PADS gstpad.c:4226:gst_pad_peer_query:<omxh264dec-omxh264dec0:src> could not send sticky events
0:00:05.611577847  1953   0x557f387940 WARN             glimagesink gstglimagesink.c:2313:gst_glimage_sink_on_close:<sink> Output window was closed
^CMLiteOpen : Block : BlockType = 261 
NVMEDIA: Reading vendor.tegra.display-size : status: 6 
NvMMLiteBlockCreate : Block : BlockType = 261 
0:00:01.211908493  1953   0x557f387a80 FIXME           videodecoder gstvideodecoder.c:933:gst_video_decoder_drain_out:<omxh264dec-omxh264dec0> Sub-class should implement drain()
Allocating new output: 1920x960 (x 8), ThumbnailMode = 0
OPENMAX: HandleNewStreamFormat: 3605: Send OMX_EventPortSettingsChanged: nFrameWidth = 1920, nFrameHeight = 960 
0:00:01.226509588  1953   0x7f84004630 WARN                GST_PADS gstpad.c:4226:gst_pad_peer_query:<omxh264dec-omxh264dec0:src> could not send sticky events
0:00:05.611577847  1953   0x557f387940 WARN             glimagesink gstglimagesink.c:2313:gst_glimage_sink_on_close:<sink> Output window was closed
^C

Please try the pipeline below and post the error.

	if (strcmp(cmd_name, "gst_loopback") == 0)
		pipe_proc = " nvv4l2decoder ! nv3dsink ! "
			"video/x-raw,format=I420 ! identity drop-allocation=true !"
			"v4l2sink device=/dev/video7 qos=false sync=false";

Assumptions:

  • ROS Melodic
  • Ubuntu 18.04
  • Jetson Xavier
  • OpenCV 4.5.3

Note the pipeline will likely be different on the NUC.

Sure, here is what I get:

$ ./gst_loopback
Opening in BLOCKING MODE
Segmentation fault (core dumped)

I tried again with this configuration and now I'm getting a live stream (not a still image):

    if (strcmp(cmd_name, "gst_loopback") == 0)
        pipe_proc = "omxh264dec ! gldownload ! glimagesink ! videoconvert n-thread=0 ! "
            "video/x-raw,format=I420 ! identity drop-allocation=true !"
            "v4l2sink device=/dev/video7 qos=false sync=false";

:robot: OpenCV and ROS


Since ./gst_loopback is working now, I moved on to running OpenCV with ROS.

Assumptions:

  • ROS Melodic
  • Ubuntu 18.04
  • Jetson Xavier
  • OpenCV 4.5.3
  • Python 2.7.17

Here is the output after I run the camera launch file:

$ roslaunch video_stream_opencv camera.launch
... logging to /home/spacer/.ros/log/fd07213a-0651-11ec-be65-00301802e303/roslaunch-NJX18-29658.log
Checking log directory for disk usage. This may take a while.
Press Ctrl-C to interrupt
Done checking log file disk usage. Usage is <1GB.

started roslaunch server http://NJX18:35287/

SUMMARY
========

PARAMETERS
 * /camera/camera_stream/camera_info_url: 
 * /camera/camera_stream/camera_name: camera
 * /camera/camera_stream/flip_horizontal: False
 * /camera/camera_stream/flip_vertical: False
 * /camera/camera_stream/fps: 29
 * /camera/camera_stream/frame_id: camera
 * /camera/camera_stream/video_stream_provider: 7
 * /rosdistro: melodic
 * /rosversion: 1.14.10

NODES
  /camera/
    camera_stream (video_stream_opencv/video_stream)

ROS_MASTER_URI=http://SXL00-200505AA:11311

process[camera/camera_stream-1]: started with pid [29670]
[ INFO] [1629983349.075224226]: Resource video_stream_provider: 7
[ INFO] [1629983349.083826373]: Getting video from provider: /dev/video7
[ WARN:0] global /home/nvidia/host/build_opencv/nv_opencv/modules/videoio/src/cap_gstreamer.cpp (1757) handleMessage OpenCV | GStreamer warning: Embedded video playback halted; module v4l2src0 reported: Internal data stream error.
[ WARN:0] global /home/nvidia/host/build_opencv/nv_opencv/modules/videoio/src/cap_gstreamer.cpp (886) open OpenCV | GStreamer warning: unable to start pipeline
[ WARN:0] global /home/nvidia/host/build_opencv/nv_opencv/modules/videoio/src/cap_gstreamer.cpp (480) isPipelinePlaying OpenCV | GStreamer warning: GStreamer: pipeline have not been created
VIDEOIO ERROR: V4L2: Could not obtain specifics of capture window.
VIDEOIO ERROR: V4L: can't open camera by index 7
[ INFO] [1629983349.280850685]: Camera name: camera
[ INFO] [1629983349.289376895]: Throttling to fps: 29
[ INFO] [1629983349.297910177]: Publishing with frame_id: camera
[ INFO] [1629983349.306981416]: Provided camera_info_url: ''
[ INFO] [1629983349.319686218]: Flip horizontal image is : false
[ INFO] [1629983349.341639733]: Flip flip_vertical image is : false
[ERROR] [1629983349.341943319]: Could not open the stream.
[camera/camera_stream-1] process has died [pid 29670, exit code 255, cmd /home/spacer/summit_catkin_ws/src/video_stream_opencv/build/devel/lib/video_stream_opencv/video_stream camera:=image_raw __name:=camera_stream __log:=/home/spacer/.ros/log/fd07213a-0651-11ec-be65-00301802e303/camera-camera_stream-1.log].
log file: /home/spacer/.ros/log/fd07213a-0651-11ec-be65-00301802e303/camera-camera_stream-1*.log
all processes on machine have died, roslaunch will exit
shutting down processing monitor...
... shutting down processing monitor complete
done

I will try to fix this issue. I will keep posting.
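
One variation I plan to try (assuming camera.launch exposes these values as launch arguments, as in the upstream video_stream_opencv package) is passing the device path instead of the bare index:

$ roslaunch video_stream_opencv camera.launch video_stream_provider:=/dev/video7 fps:=29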

Congratulations on getting the video stream working with

pipe_proc = "omxh264dec ! gldownload ! glimagesink ! videoconvert n-thread=0 ! "
            		"video/x-raw,format=I420 ! identity drop-allocation=true !"
            		"v4l2sink device=/dev/video7 qos=false sync=false";

As a test, can you view the stream in something like VLC or gst-launch-1.0 on /dev/video7?

Example

$ cvlc v4l2:///dev/video7
VLC media player 3.0.9.2 Vetinari (revision 3.0.9.2-0-gd4c1aefe4d)
[000055573aea4db0] dummy interface: using the dummy interface module...

You should also be able to get the THETA information with v4l2-ctl. Note that I reduced the resolution in the example below to 1920x960, as I only have a Jetson Nano and it was struggling with OpenCV at 4K.

$ v4l2-ctl --list-formats-ext --device  /dev/video7
ioctl: VIDIOC_ENUM_FMT
    Type: Video Capture

    [0]: 'YU12' (Planar YUV 4:2:0)
        Size: Discrete 1920x960
            Interval: Discrete 0.033s (30.000 fps)

Yes, thank you for your help. I really appreciate it. :slight_smile:

For some reason, every time I launch VLC on the Jetson Xavier, the app crashes. I guess it is not compatible.

Regarding the information provided by v4l2-ctl, I am only getting the following:

$ v4l2-ctl --list-formats-ext --device  /dev/video7
ioctl: VIDIOC_ENUM_FMT
$ v4l2-ctl --list-devices
Dummy video device (0x0000) (platform:v4l2loopback-000):
	/dev/video7

Failed to open /dev/video0: No such file or director
$ v4l2-ctl -d7 -D
Driver Info (not using libv4l2):
	Driver name   : v4l2 loopback
	Card type     : Dummy video device (0x0000)
	Bus info      : platform:v4l2loopback-000
	Driver version: 4.9.201
	Capabilities  : 0x85208003
		Video Capture
		Video Output
		Video Memory-to-Memory
		Read/Write
		Streaming
		Extended Pix Format
		Device Capabilities
	Device Caps   : 0x85208003
		Video Capture
		Video Output
		Video Memory-to-Memory
		Read/Write
		Streaming
		Extended Pix Format
		Device Capabilities

It seems the problem with ROS is due to OpenCV. I will try running a Python script with OpenCV directly to isolate the issue, and then go back to ROS.

Did you build OpenCV from source?

Live Streaming over USB on Ubuntu and Linux, NVIDIA Jetson - #78 by zdydek

gscam doesn’t seem to handle udp streams well. I also tried using OpenCV VideoCapture to get the data into ros, but that had a couple issues. There are two APIs for VideoCapture that seemed appropriate: Gstreamer and FFmpeg. It turns out that the OpenCV version packaged with ROS is not built with Gstreamer support, so you would have to build OpenCV yourself to use it. For FFmpeg, the version of OpenCV packaged with ROS melodic is 3.2, which is missing a fairly critical change here: https://github.com/opencv/opencv/pull/9292 that allows you to set FFmpeg capture options in an environment variable. I got both of these working by upgrading OpenCV to version 3.4.9 and building from source, but Gstreamer had a low framerate (~5fps) and FFmpeg had a lot of corruption and dropped frames (maybe it was stuck using UDP?). So, I decided to stick with gscam for now.

The latency value of 400 worked for me, but should be tuned depending on your network.
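
A quick way to confirm which video backends the OpenCV build you are actually importing was compiled with (look for the GStreamer and V4L entries in the Video I/O section of the output):

$ python -c "import cv2; print(cv2.getBuildInformation())" | grep -A 12 "Video I/O"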

Yes, I installed it directly from source. The thing is, when I tried the same versions of OpenCV and ROS on the Intel NUC, there was no problem with GStreamer. It is only now that I am setting up the camera on the Jetson Xavier that things are getting complicated.

:camera: OpenCV


It seems that the problem is the pipeline from GStreamer to OpenCV. For now, I will just focus on making this connection work using the simple code you suggested:


import numpy as np
import cv2

cap = cv2.VideoCapture(7, cv2.CAP_GSTREAMER)

while(True):
    # Capture frame-by-frame
    ret, frame = cap.read()

    # Our operations on the frame come here
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # Display the resulting frame
    cv2.imshow('frame',gray)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

# When everything done, release the capture
cap.release()
cv2.destroyAllWindows()

But I am getting the following error:

[ WARN:0] global /home/spacer/Downloads/opencv-4.5.3/modules/videoio/src/cap_gstreamer.cpp (2057) handleMessage OpenCV | GStreamer warning: Embedded video playback halted; module v4l2src0 reported: Internal data stream error.
[ WARN:0] global /home/spacer/Downloads/opencv-4.5.3/modules/videoio/src/cap_gstreamer.cpp (1034) open OpenCV | GStreamer warning: unable to start pipeline
[ WARN:0] global /home/spacer/Downloads/opencv-4.5.3/modules/videoio/src/cap_gstreamer.cpp (597) isPipelinePlaying OpenCV | GStreamer warning: GStreamer: pipeline have not been created
Traceback (most recent call last):
  File "test.py", line 11, in <module>
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
cv2.error: OpenCV(4.5.3) /home/spacer/Downloads/opencv-4.5.3/modules/imgproc/src/color.cpp:182: error: (-215:Assertion failed) !_src.empty() in function 'cvtColor'

I also tried the following and the outcome was the same:

gst = "omxh264dec ! gldownload ! glimagesink ! videoconvert n-thread=0 ! video/x-raw,format=I420 ! identity drop-allocation=true ! v4l2sink device=/dev/video7 qos=false sync=false"

and

gst = "v4l2src device=/dev/video7 ! video/x-raw,width=1920,height=1080,format=I420,framerate=30/1 ! videoconvert ! video/x-raw,format=BGR ! appsink"

in

cap = cv2.VideoCapture(gst, cv2.CAP_GSTREAMER)
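
For the next attempt, here is the sketch I plan to try (my own variation on the code above, not verified yet). It keeps the appsink-style pipeline, leaves the width and height out of the caps because the loopback reported 1920x960 earlier rather than 1920x1080, and guards against empty frames instead of crashing in cvtColor:

import cv2

# Assumptions: the loopback node is /dev/video7 and ./gst_loopback is already
# running, so the device is delivering I420 frames.
gst = ("v4l2src device=/dev/video7 ! "
       "video/x-raw,format=I420 ! "
       "videoconvert ! video/x-raw,format=BGR ! appsink drop=true")

cap = cv2.VideoCapture(gst, cv2.CAP_GSTREAMER)
if not cap.isOpened():
    raise RuntimeError("VideoCapture could not open the GStreamer pipeline")

while True:
    ret, frame = cap.read()
    if not ret:
        # Skip empty frames instead of letting cvtColor assert on them
        continue
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    cv2.imshow("frame", gray)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()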