Internal data stream error for THETA Z1 using gst_viewer

Internal data stream error for THETA Z1 using gst_viewer in libuvc-theta-sample, while gst_loopback runs successfully

Can you access the video on /dev/video* but not on the local Linux screen?

Are you running it on NVIDIA Jetson or Linux on x86? What version of Linux? Example: Ubuntu 20.04.

If you’re running on x86, you can try the X.Org driver instead of the proprietary NVIDIA driver. Another test is to disable your dedicated graphics card and run it from the iGPU.

You can also try recompiling GStreamer from source and using the community GStreamer plug-in below.

GitHub - nickel110/gstthetauvc: Gstreamer theta uvc plugin
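
If you go the gstthetauvc route, here is a minimal sketch of reading the camera from OpenCV through that plugin (untested; it assumes OpenCV was built with GStreamer support and that the plugin is installed where GStreamer can find it; check the repository's README for the exact element properties):

import cv2

# thetauvcsrc is the source element provided by nickel110/gstthetauvc.
# It outputs H.264, so decode and convert to BGR for OpenCV.
pipeline = (
    "thetauvcsrc ! queue ! h264parse ! decodebin ! "
    "videoconvert ! video/x-raw,format=BGR ! appsink drop=true"
)

cap = cv2.VideoCapture(pipeline, cv2.CAP_GSTREAMER)
if not cap.isOpened():
    raise RuntimeError("could not open the thetauvcsrc pipeline")

while True:
    ok, frame = cap.read()
    if not ok:
        break
    cv2.imshow("THETA", frame)
    if cv2.waitKey(1) == 27:  # Esc quits
        break

cap.release()
cv2.destroyAllWindows()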

I can access /dev/video0. I'm running an x86 Ubuntu 20.04 virtual machine on VMware and also tested on a non-virtual-machine x86 Ubuntu 20.04 install. The same internal data stream error happened on both.

I can also do a 4K live stream in Windows using OBS, so the THETA Z1 itself is fine.

Will try it and report back.

We did the same thing following Setting up RICOH THETA Z1 · Husarion Docs on a third computer, an Intel NUC running Ubuntu 18.04, and it worked. The difference might be that the previous two computers use Ubuntu 20.04 and the Intel NUC has no NVIDIA graphics card.

Now gst_viewer can show the live video and we are working on gst_loopback with ROS. The first attempt following Setting up RICOH THETA Z1 · Husarion Docs failed.

For an NVIDIA graphics card, try modifying the pipeline to use

pipe_proc = "nvdec ! glimagesink qos=false sync=false";

instead of

pipe_proc = " decodebin ! autovideosink sync=false";

https://codetricity.github.io/theta-linux/optimization/

You may be able to get it to work by downloading the pre-compiled binaries for gst-plugins-bad. However, I needed to compile it from source on my Ubuntu 20.04 x86.

Option 2

The problem might be related to the NVIDIA proprietary graphics driver you may be using.

You can also try uninstalling all the NVIDIA graphics drivers and using the X.Org Nouveau driver.

You can check the graphics driver you are using with glxinfo -B

Note and Caution About Potentially Losing Graphics on your Workstation

Personally, I would use the NVIDIA driver if you can get it to work. However, if it’s not working, it is fairly easy to install the Nouveau drivers with apt and also fairly easy to uninstall the NVIDIA driver with apt.

Keep a record of your configuration. It’s possible that you may temporarily lose the graphics display and will need to reinstall the driver from the command line.

Tried option 1 on my virtual machine using VMware Workstation 16. It didn’t work, and it turned out that the virtual machine uses a virtual graphics card different from the physical graphics card, which might cause the problem.

Oh, I missed the part where you were trying to use this in VMware.

We don’t have too much experience in this forum with that setup.

You may want to try plugging an external SSD or NVMe drive into the computer and booting Ubuntu 20.04 directly off the external drive.

Although I didn’t test my laptop with the Linux streaming driver in dual-boot mode, I did manage to boot off an external 1TB Crucial X8 external USB drive and run Flutter development tools. I’m now curious if my laptop can run the Linux streaming drivers. Might be good for a weekend project. :wink:

Running Ubuntu from An External USB Drive on Lenovo Edge 15 | by Craig Oda | Medium

I tried using our Intel NUC mini PC (no NVIDIA card) for the small OpenCV face detection project on this website: Setting up RICOH THETA Z1 · Husarion Docs

After I run the program, it shows:
lsgi@lsgi-nuc:~/cam/ws/src/facedetection$ python detect_face_video.py
VIDEOIO ERROR: V4L2: Pixel format of incoming image is unsupported by OpenCV
[ INFO:0] Initialize OpenCL runtime.

There is an OpenCV window showing the captured video, but the video shown is like a still image that changes only after more than 10 seconds. What could be the problem? Is the pipeline setting in gst_loopback wrong (I already changed line 190)?
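
One quick check that may help narrow this down (a sketch; getBackendName needs a reasonably recent OpenCV, roughly 3.4.2 or later):

import cv2

# Open the loopback device the same way the face-detection script does
# and print what OpenCV actually negotiated.
cap = cv2.VideoCapture(0)
print("opened: ", cap.isOpened())
print("backend:", cap.getBackendName())
print("size:   ", cap.get(cv2.CAP_PROP_FRAME_WIDTH), "x",
      cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
print("fps:    ", cap.get(cv2.CAP_PROP_FPS))
cap.release()

If the backend is V4L2 and the pixel-format error appears, the OpenCV build may be missing libv4l or GStreamer support.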

I found out that my previous OpenCV build had no libv4l, so I reinstalled OpenCV 3.4.14 with the following CMake flags (one way to verify the resulting build is sketched after the flags):
cmake \
    -DCMAKE_BUILD_TYPE=Release \
    -DCMAKE_INSTALL_PREFIX=/usr/local \
    -DENABLE_CXX11=ON \
    -DBUILD_DOCS=OFF \
    -DBUILD_EXAMPLES=OFF \
    -DBUILD_JASPER=OFF \
    -DBUILD_OPENEXR=OFF \
    -DBUILD_PERF_TESTS=OFF \
    -DBUILD_TESTS=OFF \
    -DWITH_EIGEN=ON \
    -DWITH_FFMPEG=ON \
    -DWITH_GSTREAMER=ON \
    -DWITH_OPENMP=ON \
    -DWITH_V4L=ON \
    -DWITH_LIBV4L=ON
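
For reference, one quick way to confirm the freshly installed cv2 actually picked up GStreamer and V4L/libv4l (a short sketch):

import cv2
import re

# Print only the video I/O related lines of the build report.
info = cv2.getBuildInformation()
for line in info.splitlines():
    if re.search(r"gstreamer|v4l", line, re.IGNORECASE):
        print(line.strip())

Both GStreamer and V4L/libv4l should show up as YES; if not, CMake did not find the development packages when OpenCV was configured.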

Then I ran the OpenCV code with ROS, following Setting up RICOH THETA Z1 · Husarion Docs.
A new warning message occurred:
open OpenCV | GStreamer warning: Cannot query video position: status=0, value=-1, duration=-1

And the video shown is still like a still image that changes only after more than 10 seconds.

Very confused. gst_viewer runs smoothly with about 1 second of delay at 30 fps. But with gst_loopback and OpenCV VideoCapture, the frame rate can be less than 0.1 fps, and changing 4K to HD here

THETAUVC_MODE_FHD_2997, &ctrl); // originally THETAUVC_MODE_UHD_2997,

does not help with the fps. Something must be wrong.
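
One thing that might be worth trying here (a sketch, not a confirmed fix; it assumes /dev/video0 is the v4l2loopback device and OpenCV was built with GStreamer): open the loopback device through an explicit GStreamer pipeline so stale frames are dropped instead of queued, which can otherwise make the preview look frozen.

import cv2

pipeline = (
    "v4l2src device=/dev/video0 ! video/x-raw,format=I420 ! "
    "videoconvert ! video/x-raw,format=BGR ! "
    "appsink drop=true max-buffers=1"
)

cap = cv2.VideoCapture(pipeline, cv2.CAP_GSTREAMER)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    cv2.imshow("loopback", frame)
    if cv2.waitKey(1) == 27:  # Esc quits
        break

cap.release()
cv2.destroyAllWindows()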

Our goal is to use the THETA Z1 to capture and save 4K video (equirectangular or dual fisheye are both OK) with timestamps. It would be better to save it to a ROS bag, because we are using other sensors like an IMU and a laser scanner. Do you have any suggestions, especially for the timestamps?
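
For the ROS side, a rough sketch of one way to get timestamped frames into a bag (ROS 1; it assumes cv_bridge is installed, /dev/video0 is the loopback device, and the node and topic names are placeholders):

import cv2
import rospy
from cv_bridge import CvBridge
from sensor_msgs.msg import Image

rospy.init_node("theta_publisher")
pub = rospy.Publisher("/theta/image_raw", Image, queue_size=1)
bridge = CvBridge()

cap = cv2.VideoCapture(0)  # the v4l2loopback device fed by gst_loopback
while not rospy.is_shutdown():
    ok, frame = cap.read()
    if not ok:
        continue
    msg = bridge.cv2_to_imgmsg(frame, encoding="bgr8")
    msg.header.stamp = rospy.Time.now()  # stamp each frame on arrival
    pub.publish(msg)

cap.release()

You could then record it alongside the IMU and laser topics with rosbag record. Keep in mind that uncompressed 4K frames make bags very large, so publishing sensor_msgs/CompressedImage (or recording the compressed topic from image_transport) may be more practical.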

Do you have qos=false in the pipeline of gst_viewer.c?

if (strcmp(cmd_name, "gst_loopback") == 0)
    pipe_proc = "decodebin ! autovideoconvert ! "
        "video/x-raw,format=I420 ! identity drop-allocation=true !"
        "v4l2sink device=/dev/video0 qos=false sync=false";

Yes. I changed it from the beginning, before running make. I noticed that 2K loopback is a little bit faster than 4K loopback when using VideoCapture in OpenCV.

I assume the difference between gst_viewer and gst_loopback is that gst_viewer does not store data into memory, so gst_viewer on my x86 Linux runs smoothly.

Community member snafu666 posted a pipeline to save to local storage. One thing to be aware of is that the microSD cards on the Jetsons may not be able to handle the storage bandwidth, so you’ll need to use an SSD.
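
snafu666’s exact pipeline is at the link below; as a rough illustration of the idea (an untested sketch; the output path, bitrate, and frame rate are placeholders), re-encoding the loopback feed and writing it to fast storage from OpenCV could look something like this:

import cv2

cap = cv2.VideoCapture(0)  # /dev/video0 fed by gst_loopback
w = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
h = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))

# Write H.264 into an MP4 on the SSD via a GStreamer pipeline.
writer = cv2.VideoWriter(
    "appsrc ! videoconvert ! x264enc bitrate=16000 speed-preset=ultrafast ! "
    "h264parse ! mp4mux ! filesink location=/mnt/ssd/theta.mp4",
    cv2.CAP_GSTREAMER, 0, 30.0, (w, h))

while True:
    ok, frame = cap.read()
    if not ok:
        break
    writer.write(frame)

writer.release()
cap.release()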

Live Streaming over USB on Ubuntu and Linux, NVIDIA Jetson - #220 by craig

I’m not familiar with the ROS bag format.

Mobile phones use the Camera Motion Metadata Specification to store IMU data.

Hopefully, other people can provide insight into the ROS bag format and the simulated clock.