I have also tried to get live streaming via OpenCV in Python, and the terminal outputs:
Gstreamer warning: Embedded video playback halted; module v4l2src0 reported: Internal data stream error.
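In case it helps anyone hitting the same warning, here is a minimal sketch of how the stream can be opened from Python with an explicit GStreamer pipeline instead of the default V4L2 backend. The device path /dev/video0 and the caps are assumptions for a v4l2loopback device fed by gst_loopback; adjust them to your setup, and note that OpenCV must be built with GStreamer support.

```python
# Hypothetical sketch: open a v4l2loopback device fed by the THETA with an
# explicit GStreamer pipeline (assumes OpenCV was built with GStreamer support).
import cv2

# /dev/video0 and the caps below are assumptions; match them to what your
# loopback device actually provides (check with `v4l2-ctl --list-formats-ext`).
pipeline = (
    "v4l2src device=/dev/video0 ! "
    "video/x-raw,format=I420 ! "
    "videoconvert ! video/x-raw,format=BGR ! "
    "appsink drop=true max-buffers=1"
)

cap = cv2.VideoCapture(pipeline, cv2.CAP_GSTREAMER)
if not cap.isOpened():
    raise RuntimeError("Could not open the GStreamer pipeline")

while True:
    ok, frame = cap.read()  # frame is a BGR numpy array
    if not ok:
        break
    cv2.imshow("THETA", frame)
    if cv2.waitKey(1) == 27:  # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```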
Hello Folks,
Would you have a GStreamer pipeline to record the THETA V video at 30 fps with a Jetson AGX Xavier? I am using an OpenCV script but the latency is huge…
See the last one with H.265 from Nakamura_Lab as that one may be more similar to your use case.
As other people have pointed out, you may benefit from NVMe storage rather than the microSD card, as the microSD I/O may be a bottleneck when storing the frames.
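As a starting point, here is a hedged sketch of recording from the loopback device to an H.265 file with the Jetson hardware encoder, driven from Python via gst-launch-1.0. The device path, caps, bitrate, and the output path on the NVMe drive are all assumptions, and the element names (nvvidconv, nvv4l2h265enc) are the ones shipped with recent JetPack releases, so check what your L4T version provides.

```python
# Hypothetical sketch: record the loopback device to H.265 on NVMe using the
# Jetson hardware encoder. All paths, caps, and bitrates are assumptions.
import subprocess

pipeline = (
    "v4l2src device=/dev/video0 ! "
    "video/x-raw,format=I420,width=3840,height=1920,framerate=30/1 ! "
    "nvvidconv ! 'video/x-raw(memory:NVMM)' ! "
    "nvv4l2h265enc bitrate=40000000 ! h265parse ! "
    "matroskamux ! filesink location=/mnt/nvme/theta.mkv"
)

# -e sends EOS on Ctrl+C so the output file is finalized cleanly.
subprocess.run("gst-launch-1.0 -e " + pipeline, shell=True, check=True)
```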
Hello Everyone,
In order to avoid latency while recording with two THETA Vs, I was wondering if there is a way to record videos directly from the THETA V to external storage so I can grab the video later? (The internal storage is way too small.)
Thank you @craig. I used the pipeline and it actually works pretty well, apparently without any frame loss; however, it stops recording after exactly 20 seconds every time.
Is it because of the buffer size? Would you have any idea, please?
PS: I am indeed using NVMe storage.
Hi guys, my setup is very similar to several people above - I'm streaming a RICOH THETA V connected to a Jetson Xavier via USB. I'm using the thetauvcsrc from Nickel110 combined with a GStreamer pipeline based on this handy example.
I can view the stream in 4K in DirectX on my WLAN connected Windows machine using another simple gstreamer (for Windows) pipeline, and there is an acceptably small latency (<0.3s I would guess).
Instead, however, I want to render the stream into a skybox in Unity on a Windows machine, in a similar way to this example, but with an RTP stream, rather than a Theta that is directly connected to the Windows machine via USB. I would obviously like to keep latency to a minimum still.
Does anyone have this final piece of the puzzle? Or have any ideas about how to go about it?
My best ideas so far are:
Use VLC, OBS, or GStreamer to "loopback" (if that's the right term) the RTP stream to a dummy Windows webcam device (as per v4l2loopback) and then use an identical script to the Unity example above.
Access and decode the H264 encoded RTP stream directly from a C# script in Unity and render that to the skybox texture - is that possible?
Use a different pipeline on the Xavier side to make the Windows/Unity side simpler, or decode the stream on the Windows side using GStreamer and somehow pass that stream to Unity for rendering (a receive-side sketch follows below).
If you have any thoughts or ideas then let me know. I will also post an update if I manage to succeed!
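Since a couple of the ideas above hinge on getting decoded frames out of the RTP stream on the receiving side, here is a hedged receive-side sketch using OpenCV's GStreamer backend. The port, payload type, and H.264 encoding are assumptions based on the sender described above; a Unity integration would still need to copy these frames into a texture (for example via a native plug-in or shared memory), which is not shown here.

```python
# Hypothetical sketch: receive and decode an H.264 RTP stream with OpenCV's
# GStreamer backend on the receiving machine. Port and payload are assumptions.
import cv2

pipeline = (
    "udpsrc port=5000 caps=\"application/x-rtp,media=video,"
    "clock-rate=90000,encoding-name=H264,payload=96\" ! "
    "rtpjitterbuffer latency=50 ! rtph264depay ! h264parse ! "
    "avdec_h264 ! videoconvert ! video/x-raw,format=BGR ! "
    "appsink drop=true max-buffers=1 sync=false"
)

cap = cv2.VideoCapture(pipeline, cv2.CAP_GSTREAMER)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # `frame` is a BGR numpy array here; hand it to whatever renders it.
    cv2.imshow("RTP receive", frame)
    if cv2.waitKey(1) == 27:  # Esc to quit
        break
cap.release()
cv2.destroyAllWindows()
```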
The virtual camera support of Unity may be a little tricky to get working. See this last post. A community member sent me information that using the older Unity plug-in for virtual camera support might work.
Can someone please explain to me what plug-ins are useful for?
This one, for instance: GStreamer THETA UVC plugin
In order to enable live streaming over USB on Linux, I followed @craig's video tutorials that use v4l2loopback and thetauvc, and it works well. Did I miss something?
That is a plug-in for GStreamer, not a plug-in for the camera. The GStreamer plug-in allows THETA live streaming to be used without v4l2loopback.
Some systems don't work with v4l2loopback. In those cases, the GStreamer plug-in (software that plugs into GStreamer) may make it easier to use things like OpenCV with the live video stream.
You may get lower latency with the GStreamer plug-in because the video does not have to go through the loopback.
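For reference, here is a hedged sketch of what using the plug-in from OpenCV might look like, with no v4l2loopback device involved. The thetauvcsrc element name comes from the gstthetauvc plug-in mentioned above; the mode property and the rest of the pipeline are assumptions modeled on its example pipelines, so verify them against the version you built.

```python
# Hypothetical sketch: read frames through the gstthetauvc plug-in directly,
# bypassing v4l2loopback. Element properties are assumptions; check the
# plug-in's documentation for the exact names supported by your build.
import cv2

pipeline = (
    "thetauvcsrc mode=4K ! queue ! h264parse ! "
    "decodebin ! videoconvert ! video/x-raw,format=BGR ! "
    "queue ! appsink drop=true max-buffers=1"
)

cap = cv2.VideoCapture(pipeline, cv2.CAP_GSTREAMER)
if not cap.isOpened():
    raise RuntimeError("Could not open thetauvcsrc pipeline")

ok, frame = cap.read()
print("Got frame:", ok, None if frame is None else frame.shape)
cap.release()
```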
I was wondering something:
I am extracting all the frames as JPG images from the MP4 file recorded with GStreamer and deleting some offset frames. Then I want to reconstruct a video starting from a given frame.
When inspecting the video file it says it is 30 fps, but on the manufacturer's website it says 29.97.
I don't know which value to use to reconstruct the video from the given frames. Also, do you think that the frame rate is 100% constant?
Let's say I don't even use GStreamer and instead record to the internal memory of the cameras: would it be constant, and at which frame rate? Thank you!
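If it helps, here is a hedged sketch of rebuilding a video from the extracted JPGs with OpenCV. The file-name pattern, output path, and the 29.97 fps value are assumptions; whichever rate you pick, note that this simply replays the frames at a fixed rate and cannot restore any timing variation from the original recording.

```python
# Hypothetical sketch: rebuild a video from extracted JPG frames at a fixed
# frame rate. File pattern, output path, and fps are assumptions.
import cv2
import glob

frames = sorted(glob.glob("frames/*.jpg"))  # assumes zero-padded names sort correctly
if not frames:
    raise RuntimeError("No frames found")

first = cv2.imread(frames[0])
height, width = first.shape[:2]

fps = 29.97  # or 30.0, depending on which value you trust
fourcc = cv2.VideoWriter_fourcc(*"mp4v")
writer = cv2.VideoWriter("rebuilt.mp4", fourcc, fps, (width, height))

for path in frames:
    writer.write(cv2.imread(path))

writer.release()
```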
Thank you @craig. Let me explain my issue then.
I record two videos simultaneously using GStreamer with two RICOH THETA Vs (using v4l2loopback).
Then I programmed a "flash" sign that appears periodically, every [let's say] X tens of seconds, and lasts 33 ms.
Once done, I extract the frames from the two videos (both report 30 fps according to the Ubuntu video properties) in recording order, delete the frames before the flash sign so that the first frame of cam0 and cam1 is the frame containing the flash, and reindex the frames.
However, when I inspect the second flash, it does not land on the same index in the two cameras (usually around 20 frames of delay).
In other words, after aligning the indexes on the first flash, the second flash does not correspond to the same frame indexes in the two cameras.
I am using a Jetson AGX Xavier, two RICOH THETA Vs, and v4l2loopback, recording with the GStreamer pipeline (the last one discussed in this thread) and storing directly on an EVO disk with a 7,000 MB/s (7 GB/s) write speed.
Do you have any idea where the issue comes from, please?
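To make the comparison concrete, here is a hedged sketch of how the flash frames could be located automatically in both recordings by looking for a jump in mean brightness. The threshold and the assumption that the flash shows up as a global brightness spike are mine, not something established in this thread, so treat it only as a starting point for checking how far apart the second flash really lands in the two videos.

```python
# Hypothetical sketch: find frame indexes where a "flash" occurs in each video
# by detecting jumps in mean brightness. The threshold is an assumption.
import cv2

def flash_indexes(path, threshold=40.0):
    """Return frame indexes whose mean brightness jumps by `threshold` or more."""
    cap = cv2.VideoCapture(path)
    indexes, prev_mean, i = [], None, 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        mean = float(gray.mean())
        if prev_mean is not None and mean - prev_mean >= threshold:
            indexes.append(i)
        prev_mean, i = mean, i + 1
    cap.release()
    return indexes

cam0 = flash_indexes("cam0.mp4")
cam1 = flash_indexes("cam1.mp4")
print("cam0 flashes at frames:", cam0)
print("cam1 flashes at frames:", cam1)
# Comparing the gaps between successive flashes in each list shows whether the
# two recordings drift relative to each other over time.
```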
Hi @craig. Everything works as in your tutorial for streaming the THETA Z1 except the very last part with gst streaming. When I run the command, there is a blinking live sign on the camera, but nothing appears on my computer the way the stream appears on your screen. I have installed v4l2loopback, but maybe I am not using it properly… Also, the tutorial mentions editing line 190 of the gst.c file, but exactly what to change is not explained. Can you please guide me? I am new to both Linux and THETA. Looking forward to your reply.
@craig, thanks for the reply, but it did not work for me. Can you please guide me in case there is something I may be doing wrong? I am using VirtualBox with Ubuntu 20.04 while trying this code.