Failed to run RTSP plugin stream with gst-launch

Hi there,
The stream checks out as playable with ffprobe, but I run into a problem when trying to check the stream generated from your THETA plug-in with gst-launch-1.0 (GStreamer 1.16.3). The pipeline hangs with the command below:

 gst-launch-1.0 uridecodebin uri="rtsp://192.168.0.140:8554/live?resolution=1920x960" source::short-header=1 ! queue ! videoconvert ! x264enc tune=zerolatency ! h264parse ! matroskamux ! filesink location=check_compatible_cam2.mp4

Do you have any ideas? Thank you so much.

What model of the RICOH THETA camera are you using?


I’m using this model

with this plugin:

I tested the plug-in today (Sept 13, 2024) and it seems to work fine with VLC.


Do you have a camera with a serial number YN35***?

It may not work with those units:

https://topics.theta360.com/en/faq/c_00_z1/8013/


I also have it working with ffmpeg and ffplay.

I’m using this:

ffplay -fflags nobuffer -i rtsp://192.168.2.101:8554/live?resolution=640x320

Can you use the command above instead?

@Hiep_Tran_Tien1 I can’t use the command above.

Instead of the RTSP plug-in, can you connect the THETA Z1 to a USB cable and try using this:

https://github.com/codetricity/gstthetauvc
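Once that driver is built, a minimal sketch of a viewing pipeline looks like this (thetauvcsrc and its mode property come from the gstthetauvc README; verify the exact element names against the repo):

gst-launch-1.0 thetauvcsrc mode=2K ! queue ! h264parse ! decodebin ! queue ! autovideosink sync=false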

Alternately, a development version of the HDR Wireless Live Streaming Plugin may support RTSP.

You can try contacting the developer and see if there is an update.


@Hiep_Tran_Tien1 can you provide more information on your use case?

The source code for the RTSP plug-in is here:

You may be able to inspect the source code and figure out what stream it is sending, then capture the stream with gstreamer.
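For example, here is a sketch that bypasses uridecodebin and talks to the plug-in with rtspsrc directly. The address, resolution, and short-header flag are copied from your command; protocols=tcp and the assumption that the plug-in sends H.264 are guesses to verify:

gst-launch-1.0 rtspsrc location="rtsp://192.168.0.140:8554/live?resolution=1920x960" short-header=true protocols=tcp latency=200 ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! autovideosink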

If you want an easy way to get a motion JPEG stream to another computer, you can use the motion JPEG stream of the Z1.

The available livePreview formats for the Z1 are documented here:

https://github.com/ricohapi/theta-api-specs/blob/main/theta-web-api-v2.1/options/preview_format.md

For RICOH THETA V or Z1

{"width": 1920, "height": 960, "framerate": 8} *1

{"width": 1024, "height": 512, "framerate": 30} *2

{"width": 1024, "height": 512, "framerate": 8}

{"width": 640, "height": 320, "framerate": 30} *2

{"width": 640, "height": 320, "framerate": 8} *1

*1 firmware v1.00.1 and firmware v1.10.1 or later

*2 firmware v2.21.1 or later
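As a sketch of pulling that stream over the Web API (this assumes the camera is in access point mode at its default address 192.168.1.1; camera.setOptions and camera.getLivePreview are standard THETA Web API v2.1 commands):

# select one of the preview formats listed above
curl -X POST http://192.168.1.1/osc/commands/execute \
  -H 'Content-Type: application/json' \
  -d '{"name": "camera.setOptions", "parameters": {"options": {"previewFormat": {"width": 1024, "height": 512, "framerate": 30}}}}'

# start the live preview; the response is a multipart motion JPEG stream
curl -X POST http://192.168.1.1/osc/commands/execute \
  -H 'Content-Type: application/json' \
  -d '{"name": "camera.getLivePreview"}' --output preview.mjpeg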

You can connect the camera over a USB cable to a single-board computer and then restream the output.

Using a USB cable

In gst_viewer.c
pipe_proc = " rtph264pay name=pay0 pt=96 ! udpsink host=127.0.0.1 port=5000 sync=false ";

with gst-rtsp-server

./test-launch "( udpsrc port=5000 ! application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264 ! rtph264depay ! h264parse ! rtph264pay name=pay0 pt=96 )"

receive on ROS

GSCAM_CONFIG="rtspsrc location=rtspt://10.0.16.1:8554/test latency=400 drop-on-latency=true ! application/x-rtp, encoding-name=H264 ! rtph264depay ! decodebin ! queue ! videoconvert"  roslaunch gscam_nodelet.launch

Save to file

gst-launch-1.0 v4l2src device=/dev/video99 ! video/x-raw,framerate=30/1 \
! videoconvert \
! videoscale \
! avenc_huffyuv \
! avimux \
! filesink location=raw.hfyu

or with H.264 encoding (this pipeline uses NVIDIA Jetson hardware elements, nvvidconv and omxh264enc):

gst-launch-1.0 v4l2src device=/dev/video99 ! video/x-raw,framerate=30/1 \
! nvvidconv \
! omxh264enc \
! h264parse ! matroskamux \
! filesink location=vid99.mkv

saving on x86

$ gst-launch-1.0 v4l2src device=/dev/video2 ! video/x-raw,framerate=30/1 ! autovideoconvert ! nvh264enc ! h264parse ! matroskamux ! filesink location=vid_test.mkv
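If the machine does not have an NVIDIA GPU, an untested software-encode sketch of the same idea would swap nvh264enc for x264enc:

gst-launch-1.0 v4l2src device=/dev/video2 ! video/x-raw,framerate=30/1 ! videoconvert ! x264enc tune=zerolatency ! h264parse ! matroskamux ! filesink location=vid_test.mkv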

If you just want to quickly test OpenCV:
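Here is a minimal sketch, assuming the THETA shows up as /dev/video99 through v4l2loopback as in the pipelines above (adjust the device index for your machine):

python3 - <<'EOF'
# hypothetical quick check: grab one frame from the loopback device
import cv2
cap = cv2.VideoCapture(99)  # /dev/video99
ok, frame = cap.read()
print("opened:", cap.isOpened(), "frame shape:", frame.shape if ok else None)
cap.release()
EOF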


About using a USB cable: I will give it a try soon.

My use case:

  • I have an application based on the DeepStream SDK
  • I want to use RTSP from the THETA for my app
  • The application uses gst-launch-1.0 to check whether an RTSP stream can run with DeepStream

Can you try it like this?

docker pull restreamio/gstreamer:2024-07-19T10-11-18Z-prod-dbg

gst-launch-1.0 uridecodebin uri="rtsp://192.168.0.140:8554/live?resolution=1920x960"  source::short-header=1 ! fakesink

The result should look like this; that means the stream can run with DeepStream:

@Hiep_Tran_Tien1 I know that @craig is traveling today and may not respond quickly. I'm just following this thread and curious about one detail: is the DeepStream SDK from NVIDIA?

I got a description from here:

NVIDIA’s DeepStream SDK is a complete streaming analytics toolkit based on GStreamer for AI-based multi-sensor processing, video, audio, and image understanding. It’s ideal for vision AI developers, software partners, startups, and OEMs building IVA apps and services.

  1. Can you use the DeepStream SDK without NVIDIA Metropolis?
  2. What analytics from DeepStream are you using?

I'm mostly just curious. I had not heard of DeepStream until today.


Yes, you can use the NVIDIA DeepStream SDK without needing to be part of the NVIDIA Metropolis program. The DeepStream SDK is available for developers to download and use for building intelligent video analytics (IVA) applications. It provides the necessary tools and libraries to process video streams, integrate with AI models, and perform tasks like object detection, tracking, and classification, using NVIDIA GPUs.

This is my use case; all inputs are RTSP cameras. But somehow, some cameras are not supported (the streams cannot be decoded).

Sample pipeline reference (DeepStream):

@Hiep_Tran_Tien1 It looks super cool. :slight_smile: I have no experience using DeepStream, so I do not have any immediate comment.

If you take a normal webcam (like a Logitech webcam) and plug it into a Linux machine over USB, can you then use GStreamer on the Linux machine to get it into your pipeline over WiFi using RTSP?

If it works with a normal webcam connected to the Linux machine with a USB cable, you can then try it with the THETA Z1.

In summary, can you eliminate the use of the RTSP plugin by replicating the functionality on a small board computer?
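A sketch of that webcam test, reusing the gst-rtsp-server test-launch example from earlier in this thread (the device path and the software encoder are assumptions for your machine):

./test-launch "( v4l2src device=/dev/video0 ! videoconvert ! x264enc tune=zerolatency ! rtph264pay name=pay0 pt=96 )"

# then run the same compatibility check from the DeepStream machine
gst-launch-1.0 uridecodebin uri="rtsp://<linux-machine-ip>:8554/test" ! fakesink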


@craig @jcasman Many thanks. I will give it a try.
