Hello. Is it possible to shoot dual-fisheye image by USB API?
It seems there’s a way to do this with v2.1 Wi-Fi API, but I couldn’t find one from USB API.
hi @Siwook_Choi ,
may I ask why you would like to use the USB API instead of a Wi-Fi connection?
Thanks!
I don’t think the USB API has that feature. I’m going to ask my contact at RICOH about this. Can you advise why you need the dual-fisheye? Is it for distance estimation when converting to a point cloud?
I’m trying to make a handheld indoor scanner using the THETA Z1.
I have the following requirements for this purpose.
@biviel I’d prefer the USB API due to the low reliability of wireless communications (e.g., signal interference).
@craig The reason I need dual-fisheye is low-interval shooting.
I tested the minimum interval with equirectangular images, but I could not achieve better than 1 image every 4 seconds due to the internal stitching.
Did you see that the THETA X can shoot video at 5.7K? What image resolution do you need?
Can you take video and then extract the frames?
https://devs.theta360.guide/ricoh-launches-new-theta-x-360-camera/
The THETA V used to have a feature for Google Street View: 5.4K video at 5 fps. The frames were automatically extracted and sent to GSV.
If this type of feature existed for the Z1, would it meet your requirement? If so, do you need the file to be transferred by the USB API?
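If you go the video route, frame extraction on the computer side is straightforward with a tool such as ffmpeg. A minimal sketch, assuming an MP4 pulled off the camera and a target of 1 frame per second (the filename and rate are illustrative):

$ ffmpeg -i R0010001.MP4 -vf fps=1 -q:v 2 frame_%04d.jpg   # extract 1 frame/sec as high-quality JPGs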
hi,
@Siwook_Choi , do you have any requirements for image resolution? Is it required to store images in the camera? Over USB, is it connected to a Raspberry Pi or something else? I guess you need to move around a place to “scan”…
Resolution doesn’t matter, but I’m afraid the images could be blurry if I use video.
Hopefully, I can call the Web API through the USB connection.
The idea is to use adb to forward the device’s web port.
I’m not sure whether it will work, but it seems worth trying.
$ adb forward tcp:8080 tcp:80 # forward host port 8080 to device port 80
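If the forwarding works, the usual Web API v2.1 calls should go through the tunneled port. A quick sketch (untested over adb, but these are the standard endpoints):

$ curl http://localhost:8080/osc/info   # confirm the camera answers through the tunnel
$ curl -X POST http://localhost:8080/osc/commands/execute -H 'Content-Type: application/json' -d '{"name": "camera.takePicture"}'   # trigger a shot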
@Siwook_Choi ,
I’m finalizing a plugin update that provides high-quality live streaming, and I’m trying to cover multiple use cases and requirements where this plugin could be used in theory. This is one test I did through a mobile device network: Theta Z1 - Best live streaming 360 camera? - YouTube . In theory I have full control over FPS, etc.
In your case, is a person moving around with the Z1, or a “robot”, and you need to take shots and save them locally to the internal storage of the Z1, after which those images are processed, etc.? I could make my plugin record video locally at 5 fps, but if you have a local media server to ingest RTMP, RTSP, or low-latency SRT, that will work. I’m not sure what you do further with the input on the PC or Raspberry Pi?
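For example, if a local media server is in the loop, the PC or Raspberry Pi side could pull the stream and dump frames with ffmpeg. A sketch, where the URL is a placeholder for whatever your server exposes:

$ ffmpeg -i rtsp://<media-server>/theta -vf fps=5 -q:v 2 scan_%05d.jpg   # save 5 frames/sec from the ingested stream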
Be careful with the actual deployment workflow. Even if you get it working on your development workstation, note that adb access requires a camera that is unlocked for plug-in development. Think through whether the approach will work for your production use case.
As far as the dual-fisheye JPG image over the USB connection goes, I reviewed your request with a manager at RICOH.
Because the real voice of the community with a use case is very powerful, they are considering implementing this feature. However, it may take a long time. I know of another company that has a similar request for the USB API for similar reasons.
If you are considering a large deployment of Z1 cameras, please contact @jcasman, president of Oppkey and manager of .guide, by DM or email, and we can send the information to RICOH, as it may help to influence their management decisions.
In parallel, I suggest you consider independent community developer @biviel’s plug-in, as he’s a good developer with good knowledge of the Z1 technology and media encoding.
Maybe you and @biviel from FlowTours can talk privately over video conference, community member to community member, to exchange technical ideas?
Hi @biviel, I’ve sent you a DM. Thank you!
Indeed, the forwarding requires unlocked cameras, which cannot be used for the deployment.
What I’m doing now is feasibility testing, and if it looks okay, I will develop a plug-in as you suggested.
You can access the Web API from within the plug-in, either directly or using something like theta4j.
This article is old, but it may still work.
In the current plug-in architecture with the Z1, you cannot access the USB port from the plug-in. You would need to use the camera itself to transfer the images to your cloud, or you can stop the plug-in and then use the USB interface to transfer them to a small computer.
I believe that you can start and stop the plug-in from the USB API, but I have not tried.
Your app can check the camera state to see whether the plug-in is running or not.
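A sketch over the Wi-Fi Web API, assuming the Z1 reports the _pluginRunning flag in its state object (I believe it does, but verify against the current docs):

$ curl -X POST http://192.168.1.1/osc/state   # in AP mode; look for "_pluginRunning": true in the JSON reply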
FYI, this feature was added in the latest Z1 firmware update (ver 2.10.3, released 2022-05-31).
The documentation for this feature has not been updated yet (0xD834 Image Stitching).
Thanks for the report!
@dhp,
are you sure it works for the Z1? In the supported models I see “-” for the Z1:
Did you test this? If yes, how?
thanks!
I believe he indicated that he tested the Z1, but that the API documentation was not yet updated. I haven’t tested it myself yet.
@craig yes, I’ve tested the Z1.
@biviel the following commands (gphoto2 on Ubuntu) will take 2 photos with different stitching modes:
root@teevr-NucBox5:~# gphoto2 --get-config=/main/other/d834
Label: PTP Property 0xd834
Readonly: 0
Type: MENU
Current: 1
Choice: 0 1
Choice: 1 2
Choice: 2 3
Choice: 3 4
Choice: 4 6
Choice: 5 7
END
root@teevr-NucBox5:~# gphoto2 --trigger-capture                 # shoot with the current stitching mode (1)
root@teevr-NucBox5:~# gphoto2 --set-config /main/other/d834=2   # switch the 0xD834 stitching mode
root@teevr-NucBox5:~# gphoto2 --trigger-capture                 # shoot again with the new mode
Thanks for sharing! Where are you doing the stitching, and why do you need dual-fisheye? May I ask about your use case?
Do you know if taking a picture with the USB API without stitching results in a shorter time between images?
I have seen this requirement before for image overlays on sensor data.
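If anyone wants to measure that, a rough check is to time back-to-back captures in each mode, building on dhp’s transcript above (the property values come from that output; actual timings will vary):

$ gphoto2 --set-config /main/other/d834=1   # one stitching mode from the transcript
$ time { gphoto2 --trigger-capture; gphoto2 --trigger-capture; }
$ gphoto2 --set-config /main/other/d834=2   # the other mode dhp tested
$ time { gphoto2 --trigger-capture; gphoto2 --trigger-capture; }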