Hi there,
It’s OK when I check that the stream is playable with ffprobe, but
I run into a problem when trying to check the stream generated from your theta plugin with gst-launch-1.0 (GStreamer 1.16.3). The pipeline hangs with the command below:
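For reference, a typical gst-launch-1.0 check of an H.264 RTSP stream looks something like the sketch below; the URL, latency, and element choices are placeholders, not necessarily the exact command that hung:

# pull the RTSP stream, depayload and parse H.264, decode in software, and display it
gst-launch-1.0 rtspsrc location=rtsp://<camera-ip>:8554/live latency=200 ! \
  rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! autovideosink

Running it with GST_DEBUG=3 set usually shows which element the pipeline stalls on.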
@Hiep_Tran_Tien1 I know that @craig is traveling today and may not respond quickly. I’m just following this thread and am curious about one detail: is the DeepStream SDK from NVIDIA?
NVIDIA’s DeepStream SDK is a complete streaming analytics toolkit based on GStreamer for AI-based multi-sensor processing, video, audio, and image understanding. It’s ideal for vision AI developers, software partners, startups, and OEMs building IVA apps and services.
Yes, you can use the NVIDIA DeepStream SDK without needing to be part of the NVIDIA Metropolis program. The DeepStream SDK is available for developers to download and use for building intelligent video analytics (IVA) applications. It provides the necessary tools and libraries to process video streams, integrate with AI models, and perform tasks like object detection, tracking, and classification, using NVIDIA GPUs.
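As a rough illustration only (element names as in the DeepStream documentation; the URI, resolution, and config path are placeholders), a minimal single-camera DeepStream pipeline can be assembled with plain gst-launch-1.0:

# decode an RTSP source, batch it with nvstreammux, run primary inference, overlay results, and display
gst-launch-1.0 uridecodebin uri=rtsp://<camera-ip>/stream1 ! m.sink_0 \
  nvstreammux name=m batch-size=1 width=1280 height=720 ! \
  nvinfer config-file-path=config_infer_primary.txt ! \
  nvvideoconvert ! nvdsosd ! nveglglessink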
This is my use case: all inputs are RTSP cameras. But somehow, some cameras are not really supported (the streams cannot be decoded).
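When a camera will not decode, it usually helps to first check what it actually sends (codec, profile, resolution). Assuming ffprobe and the GStreamer tools are installed, and with a placeholder URL:

# force TCP to rule out packet loss and print the stream's codecs and resolution
ffprobe -rtsp_transport tcp rtsp://<camera-ip>/stream1
# or use GStreamer's own discovery tool
gst-discoverer-1.0 rtsp://<camera-ip>/stream1

A camera that sends H.265 or an unusual H.264 profile will not decode in a pipeline built only around rtph264depay and avdec_h264.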
If you take a normal webcam (like a Logitech webcam) and plug it into a Linux machine over USB, can you then use GStreamer on the Linux machine to get it into your pipeline over Wi-Fi using RTSP?
If it works with a normal webcam connected to the Linux machine with a USB cable, you can then try it with the THETA Z1.
In summary, can you eliminate the use of the RTSP plugin by replicating the functionality on a small single-board computer?
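One concrete way to test that is to serve the webcam over RTSP with the test-launch example that ships with gst-rtsp-server (it has to be built from the gst-rtsp-server examples; the device, encoder settings, and IP below are placeholders):

# on the Linux machine: encode /dev/video0 to H.264 and serve it at rtsp://<linux-ip>:8554/test
./test-launch "v4l2src device=/dev/video0 ! videoconvert ! x264enc tune=zerolatency ! rtph264pay name=pay0 pt=96"

# on another machine: play the stream back to confirm the path works over Wi-Fi
gst-launch-1.0 playbin uri=rtsp://<linux-ip>:8554/test

If that works, the same server pipeline can be tried with the THETA Z1 once it is exposed as a video device on the Linux machine (for example via libuvc-theta and v4l2loopback).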