Thanks for the hint about the gstreamer plugin - a THETA gstreamer plugin (one that can switch the device on automatically and put it into the right mode) would be one of the most desirable things for many projects…
I hope Linux detects the Xbox controller as a joystick.
apt install joystick
you could check with
ls /dev/input
and hopefully it shows up as js0 or js1 …
then a quick test with
jstest /dev/input/jsX
I just hardcoded js1 because on my computer even the accelerometer shows up as a joystick. Otherwise, the input velocity [line 471-472] could just come from a timer (or the keyboard, or, in the case of a real robot, over the network).
e.g. comment out the joystick code, set yOffset to 800 and xVel to 6; it should then slowly pan from left to right, with xOffset incremented by xVel on each timer step.
On the Nano, I often have to reduce the resolution from 4K to 2K. If you want to switch back and forth between 2K and 4K for testing, you can change THETAUVC_MODE_UHD_2997 to THETAUVC_MODE_FHD_2997.
Example:
if (argc > 1 && strcmp("--format", argv[1]) == 0) {
    if (argc > 2 && strcmp("4K", argv[2]) == 0) {
        printf("THETA live video is 4K\n");
        res = thetauvc_get_stream_ctrl_format_size(devh,
                THETAUVC_MODE_UHD_2997, &ctrl);
    } else if (argc > 2 && strcmp("2K", argv[2]) == 0) {
        printf("THETA live video is 2K\n");
        res = thetauvc_get_stream_ctrl_format_size(devh,
                THETAUVC_MODE_FHD_2997, &ctrl);
    } else {
        printf("specify the video format: --format 4K or --format 2K\n");
        goto exit;
    }
}
The Xavier seems to handle 4K video transformations better, but at a higher cost.
Thanks. I have the feeling that the Nano should be able to handle 4K H.264 hardware-accelerated decoding very well… My suspicion is that to use the NVIDIA decoders properly, the data needs to be in video RAM, similar to … nvvidconv with the caps video/x-raw(memory:NVMM). Here we would need to see how to get the data from the app source there … just a guess, will try it out.
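As an untested sketch of that idea: the sample feeds H.264 frames in through an appsrc, so the generic decodebin could be swapped for the Jetson hardware elements to keep the buffers in NVMM memory as long as possible, something like

```
appsrc name=ap ! queue ! h264parse ! queue \
  ! nvv4l2decoder ! nvvidconv \
  ! video/x-raw ! queue ! autovideosink sync=false
```

The element names (nvv4l2decoder, nvvidconv) come from the stock Jetson GStreamer plugins; whether the chosen sink can take NVMM buffers directly, so the conversion to plain video/x-raw can be dropped, would need testing on the device.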
Oh, I think you’re right. Now that I think about it, the problem with the Jetson Nano at 4K is in the OpenCV processing or the DetectNet AI, not with the video frames themselves.
I am trying to build a wheeled robot with a Ricoh Theta V installed on it to detect objects. Is it possible to turn on the camera automatically when the computer (NUC x86 Ubuntu 18.04) is turned on?
Yes, plug the camera into the USB port of the NUC. The camera must be in a power off state, not “sleep”. If it is powered off, the camera will turn on when you turn on the NUC.
You can test this behavior quickly by unplugging and plugging the USB cable back in when the camera is in a “power off” state.
We reported the bug to RICOH a few weeks ago. I have no further information.
Some things to consider:
- I do not work for RICOH and do not have information on their development plans
- this site does receive sponsorship from RICOH and we do report community feedback back to management
My personal opinion is that I do not think there will be a fix to the V firmware for the USB API unless it is easy to fix. The reason I suspect this is that the THETA V is no longer in production and the Z1 works with wake from sleep from the USB cable.
I realize that this is unfortunate as the Z1 is considerably more expensive than the V. Again, I do not have any special information on RICOH plans. This is just my opinion.
Great to hear. Please post some pictures of your wheeled robot with the NUC. Feel free to start a new topic for what you’re doing with 360 vision on your robot. There’s some information on the robot hugues built here. He may have a more advanced version now.
Feel free to post a new topic as a build log of what you’re doing with the robot and detection algorithms. I’m curious whether you’re using the live feed, video-to-file analysis, or still images. Also, there’s the question of whether the camera works with AI models trained on normal images or whether you need a fisheye image database.
In order to experiment with the problems and solutions of robots, I built a toy robot myself and mounted the camera on it with a 1/4" by 20tpi bolt.
In its current state, the robot has a Raspberry Pi 4 for controlling the motors and lights and a separate Jetson Nano for managing the camera. The main reason for two controllers is that the robot toy kit had nice instructions to get it running with the RPi4. I then simply attached the Nano to the back of the chassis with zip ties and duct tape. My robot is currently unstable because I’ve maxed out the current delivery of the rechargeable battery power supply. However, it is still fun to use for testing, as I get a small taste of the excitement of other robot builders.