I managed to live stream my THETA from my PC to my phone and VR device as a 360° spherical image, and even view it in VR/Cardboard mode, using a web browser.
The details are explained in the links; it works similarly to Skype or Zoom.
Is it possible to reduce the framerate of the THETA?
All my tests have been with 2K; when I try it with 4K, the sink of the pipeline has an increasing delay over time.
I think it takes too much time per frame due to the increased resolution. Or is it possible to drop frames if they're too old?
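One common way to limit the framerate in a GStreamer pipeline is the `videorate` element with a caps filter. This is a minimal sketch, not taken from the sample code; the device node and target framerate are assumptions you would adapt to your own pipeline.

```shell
# Sketch: force the feed down to 15 fps before it reaches the sink.
# videorate with drop-only=true discards frames rather than
# duplicating them, so the sink never falls behind the source.
# /dev/video0 and 15/1 are example values - adjust to your setup.
gst-launch-1.0 v4l2src device=/dev/video0 \
  ! videorate drop-only=true \
  ! video/x-raw,framerate=15/1 \
  ! videoconvert \
  ! autovideosink
```

The same `videorate ! video/x-raw,framerate=...` pair can be spliced into an existing pipeline string right before the sink.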
Gst_viewer has minimal delay for both 2K and 4K, however.
With 4K, it starts with barely any latency, but the longer it runs, the higher the latency becomes. It did not matter what I opened the v4l2loopback device with (VLC, a web page, OpenCV).
The increase in latency happens once the v4l2loopback device is accessed; it did not matter how long gst_loopback ran beforehand. The increased latency persists between sessions of accessing the v4l2loopback device and does not reset. It only resets if I stop and rerun gst_loopback.
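When latency accumulates like this, one thing worth trying is a leaky queue, which drops stale buffers instead of letting them pile up. The demo below uses a test source rather than the actual gst_loopback pipeline (which I have not modified here), so the elements around the queue are illustrative assumptions.

```shell
# Demonstration: a queue with leaky=downstream holds at most one
# buffer and drops the oldest when a new one arrives, which bounds
# end-to-end latency at the cost of discarded frames.
gst-launch-1.0 videotestsrc is-live=true \
  ! queue leaky=downstream max-size-buffers=1 \
  ! videoconvert \
  ! autovideosink
```

The same `queue leaky=downstream max-size-buffers=1` fragment could be tried in the pipeline string inside the sample code, placed just before the v4l2sink.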
If I run the THETA in UHD (4K) mode with gst_viewer, it comes in fine without latency.
I can do some tests on the latency and performance of the AGX Xavier compared to the Jetson Nano, as I have both, but the GStreamer pipelines I used to display and record the video do not seem to work on the Jetson AGX Xavier.
My gst_viewer is working, but with VLC I can only get one frame and then it just sits there. My webcam works fine with everything. I tried recording a video, but it only captures one frame and the video won't save. I need to get this working ASAP; I have been trying for days and I've come a long way to get it working up to this point. It seems like there is something wrong with v4l2loopback and the frame rate/resolution. This is a brand-new install of Ubuntu 20.04, so all libraries are new.
Also, you may be using software rendering rather than hardware-accelerated rendering. Post info on your GPU setup and whether you are using decodebin or nvdec.
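A quick way to see which decoders are available is `gst-inspect-1.0`. Which hardware element exists depends on the platform (this is an assumption about your setup; nvdec is typical on desktop NVIDIA GPUs, nvv4l2decoder on Jetson boards):

```shell
# If the element is installed, gst-inspect-1.0 prints its details;
# otherwise it reports "No such element or plugin".
gst-inspect-1.0 nvdec           # desktop NVIDIA hardware decoder
gst-inspect-1.0 nvv4l2decoder   # Jetson hardware decoder
gst-inspect-1.0 avdec_h264      # software fallback used by decodebin
```

If only the software decoder is present, decodebin will decode 4K H.264 on the CPU, which can explain growing latency.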
You can also ask more questions, but just to let you know, there's a bunch of information in the doc available here: https://theta360.guide/special/linuxstreaming/ and there is search capability on that document.
Again, no problem if you keep asking questions. Just trying to help you out.
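For the one-frame stall with VLC described above, it is also worth checking what the v4l2loopback device is actually advertising; a format or framerate mismatch there often explains players that grab a single frame and stop. The device node below is an assumption, yours may differ:

```shell
# List the pixel formats, resolutions, and frame intervals the
# loopback device exposes to clients like VLC or OpenCV.
v4l2-ctl --device=/dev/video0 --list-formats-ext

# Dump the full current configuration of the device.
v4l2-ctl --device=/dev/video0 --all
```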
Hey Craig, I need your help while trying to get live streaming working for the THETA S 360 camera on Ubuntu 18. I followed the steps to get gst_viewer up and running, and the camera is connected via USB, but I'm not sure why I am getting the error 'Theta not found' when I run ./gst_viewer after installing libuvc-theta-sample. Any help would be appreciated. Do you have all the steps to follow for live streaming a THETA camera on Ubuntu 18?
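A first check in this situation is whether the camera enumerates on the USB bus at all. The vendor ID grep below assumes RICOH's USB vendor ID is 05ca; verify against your own `lsusb` output:

```shell
# Look for the camera among the attached USB devices.
lsusb | grep -i 05ca

# If the grep comes back empty, inspect the full listing and
# confirm the camera is powered on and in live streaming mode.
lsusb
```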
I am trying to find a 360° camera suited to the Nitrogen6X board (1 GHz quad-core ARM Cortex-A9 with 1 GB of RAM) which we use. This is running Ubuntu 18.04. We wish to overlay our measurements from another sensor on a 360° camera feed and display it as an equirectangular stream. Is the THETA S or THETA V better suited to this? (I understand the S offloads some of the image processing to the acquisition computer, which in our case is low spec, and that the V does this on board but might require a newer OS and/or libraries which we may be unable to upgrade to.) I could be mistaken in these understandings, which is why I am asking. Ideally we want to tax the processor on the Nitrogen6X board as little as possible, and we want to crack open the camera feed and overlay our data before displaying it.
The S outputs in dual-fisheye format, which you need to stitch yourself on the ARM board.
The S output is MotionJPEG.
The V output is H.264.
The V can stream at 4K; the S can stream at 2K.
To use the S, the Linux kernel can handle it out of the box.
To use the V, you need to use the drivers documented in this thread and on this site.
We have more examples running Ubuntu 20.04. I have not tried it with 18.04. Hopefully, it will work without additional modifications.
If your application benefits from a dual-fisheye feed, then the S might be easier for you to use. It should also be lower cost, as it is an older model.
You can try downloading and compiling the driver on 18.04 before you get the camera. If you can compile gst_loopback and the sample code, then I suggest you buy the V from a place with a return policy and test it soon after you get it.
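The build sequence usually looks like the sketch below; treat it as a rough outline and verify the repo names and dependencies against the linked theta360.guide doc:

```shell
# Assumed prerequisites on Ubuntu (plus the GStreamer dev packages
# for the sample code).
sudo apt install libusb-1.0-0-dev libjpeg-dev

# Build and install the patched libuvc.
git clone https://github.com/ricohapi/libuvc-theta.git
cd libuvc-theta
mkdir build && cd build
cmake ..
make
sudo make install
cd ../..

# Build the sample viewer and gst_loopback.
git clone https://github.com/ricohapi/libuvc-theta-sample.git
cd libuvc-theta-sample/gst
make
```

If the compile succeeds, you have verified the toolchain side before spending money on the camera.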
You won’t be able to run the sample code without the camera as it looks for the USB ID of the camera when it first runs.