Currently, I have two Theta V cameras connected to a Jetson Orin via USB cables. When I run ./gst_viewer or gst-launch-1.0 thetauvcsrc mode=4K ! queue ! h264parse ! nvv4l2decoder ! queue ! nv3dsink sync=false in two separate terminals, I can display live images from both Theta V cameras in two windows.
I am struggling to set up a pipeline on the Jetson Orin that exposes the live video as data (using thetauvcsrc or another source). Additionally, when I connect the Theta V cameras to the Orin, they are recognized as /dev/media* rather than /dev/video*, which I suspect may be part of the problem.
(Note: My ultimate goal is to display these live images as image topics in RViz on Ubuntu 22.04/ROS2 Iron. So far, I have tried ros2_thetav and theta_driver, but I have not been able to get them to work.)
Thank you very much for your prompt response. After much trial and error, using the information from the link you provided, I was able to display live images from two Theta V cameras in RViz on ROS2 using the following pipeline and OpenCV!
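In case it helps others, here is a sketch of the kind of pipeline string that can be handed to cv2.VideoCapture with the GStreamer backend. The front half matches the gst-launch command from my first post; the stages after the decoder (nvvidconv, videoconvert, and the appsink options) are assumptions about a typical Jetson setup, not necessarily the exact pipeline I ended up with:

```python
def theta_gst_pipeline(mode: str = "4K") -> str:
    """Build a GStreamer pipeline string ending in an appsink for OpenCV.

    Assumed stages: nvvidconv copies decoded frames out of NVMM memory as
    BGRx, then videoconvert produces the BGR layout OpenCV works with.
    """
    return (
        f"thetauvcsrc mode={mode} ! queue ! h264parse ! nvv4l2decoder "
        "! nvvidconv ! video/x-raw,format=BGRx ! videoconvert "
        "! video/x-raw,format=BGR ! queue ! appsink drop=true sync=false"
    )
```

With OpenCV built with GStreamer support, this opens as `cap = cv2.VideoCapture(theta_gst_pipeline(), cv2.CAP_GSTREAMER)`, after which `cap.read()` returns BGR frames ready for cv_bridge.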
I apologize for the delayed response due to the traditional Japanese summer holiday (Obon).
The attribution is absolutely fine. Thank you.
As for converting the feed from BGRx to BGR, to be honest, there is no particular reason, and I'm not sure it is the best approach. I tried it based on information I found online, and it worked.
Regarding whether it’s necessary to convert the video to BGR format for input into RViz, I’m not certain yet because I haven’t had the chance to test other formats. If time permits, I’d like to experiment with formats like Planar YUV.
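One detail worth noting about the BGRx-to-BGR step: in GStreamer's BGRx layout the fourth byte of each pixel is only padding, so the conversion simply drops that byte (videoconvert, or cv2.cvtColor with COLOR_BGRA2BGR, does this efficiently in practice). A pure-Python sketch of the byte-level idea:

```python
def bgrx_to_bgr(buf: bytes) -> bytes:
    """Strip the padding ('x') byte from each 4-byte BGRx pixel."""
    out = bytearray()
    for px in range(0, len(buf), 4):
        out += buf[px:px + 3]  # keep B, G, R; drop the filler byte
    return bytes(out)
```

For two BGRx pixels `B G R x B G R x`, this returns the six bytes `B G R B G R`, which is exactly the 3-channel layout OpenCV expects.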
I believe I am running two instances of thetauvcsrc, one process per camera: since I launch two ROS nodes, each node should be running its own instance.
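As a sketch of what "one process per camera" looks like at launch time, each node would get its own camera identifier plus a remapped topic so the two streams stay separate in RViz. The package, node, and parameter names below are hypothetical (loosely modeled on theta_driver), not verified against any actual driver; only the `ros2 run ... --ros-args -p/-r` syntax itself is standard ROS 2:

```python
def theta_node_cmd(serial: str, topic: str) -> list[str]:
    """Command line for one driver-node instance (names are hypothetical)."""
    return [
        "ros2", "run", "theta_driver", "theta_driver_node",
        "--ros-args",
        "-p", f"serial:={serial}",    # hypothetical parameter selecting the camera
        "-r", f"image_raw:={topic}",  # remap so the two topics stay distinct
    ]
```

Running this once per camera (e.g. with topics /theta_left/image_raw and /theta_right/image_raw) would give two independent driver processes, each with its own thetauvcsrc instance.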