Hi, thanks for sharing all this work! It closely resembles the use case that I am thinking of, and I am hoping that someone in this community will be able to help me clarify some aspects.
I am working on a tele-operation system in which I plan to pair a 360 camera with a remote robot arm and live-stream the images to a VR headset. The VR headset will track the user's hands, and this hand movement will be used to control the robotic arm. Since I am still in the research phase, it is fine if the remote robot arm is actually physically close to the user (i.e. within a distance where I can run a cable to the VR PC). For this specific setup, I have two questions:
- Latency is obviously a major concern here, since I hope to test object-manipulation tasks (e.g. the user picking up a fragile block through the tele-operated setup), so pushing it down to around 50 ms would be ideal. Jake's solution seems to be the lowest-latency option so far, but my setup does not strictly require wireless communication. Can I expect a decrease in latency if I live-stream the 360 images to my VR PC over a cable instead?
- Has anyone managed to get Jake's solution running on a Theta Z1, and did that improve latency, fps, or resolution?
Thank you very much for generously open-sourcing this project!
Best,
Femke