180 degree panning check using Theta Z1

Hello.
I would like to ask about the official stitching algorithm.
If anyone has good information, please share it with us!

I used a Theta Z1 (R02022).
I tried a “180 degree panning check” as follows.

  1. Shoot one photo. (RAW + JPEG mode)
  2. Pan 180 degrees.
  3. Shoot one photo. (RAW + JPEG mode)

For panning, I used the following tool (a panorama pan base).

After that, I compared the fisheye images generated from the RAW (DNG) data
of the No.1 and No.3 shots, which face the same direction.

The result is as follows.
Please compare the images using a tool such as “WinMerge”.


I can see some small differences (lens distortion?) between the
two fisheye lenses, but the camera direction is almost the same, I think.
From this result, the “panorama pan base” seems to pan the Theta Z1 camera fairly accurately.
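As a crude numeric complement to the visual WinMerge comparison, the two fisheye images could also be compared pixel-wise. The sketch below is hypothetical (the function name and the tiny synthetic "images" are mine, not from any THETA tool); in practice you would load the converted fisheye JPEGs with an image library, but plain lists keep the sketch dependency-free.

```python
# Sketch: mean absolute per-pixel difference between two same-sized
# grayscale images, represented as 2-D lists of intensity values.

def mean_abs_diff(img_a, img_b):
    """Average absolute difference per pixel between two images."""
    total = 0
    count = 0
    for row_a, row_b in zip(img_a, img_b):
        for pa, pb in zip(row_a, row_b):
            total += abs(pa - pb)
            count += 1
    return total / count

# Tiny synthetic "images": shot No.1 vs. shot No.3 after panning.
shot1 = [[10, 10], [10, 10]]
shot3 = [[12, 10], [10, 12]]
print(mean_abs_diff(shot1, shot3))  # → 1.0
```

A low average difference would support the conclusion that the pan base returns the camera to almost the same direction.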

After that, I generated equirectangular images using the
RICOH THETA Stitcher Windows application.

The top/bottom correction setting was set to manual, with Pitch/Roll set to zero.
For Yaw, I used 0 deg for the No.1 shot and 180 deg for the No.3 shot.

The result is as follows.
Please compare using “WinMerge”, for example.


In the equirectangular comparison, the pitch-direction difference is much bigger than in the fisheye comparison.
I’m not sure why…
Does the RICOH THETA Stitcher app add some distortion that is unknown to the user?
If anyone knows the reason, please let me know…
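One sanity check: in an ideal equirectangular projection, a pure yaw rotation is just a circular horizontal pixel shift, with no pitch component at all. So if the stitcher’s Yaw 180 setting were a simple shift, rolling the No.3 image by half its width should line it up with the No.1 image, and any remaining vertical offset must come from something else. A minimal sketch (the function is mine, operating on one image row for simplicity):

```python
def apply_yaw(equirect_row, yaw_deg):
    """Rotate one row of an equirectangular image by `yaw_deg` of yaw.

    In the equirectangular projection, yaw is a circular horizontal
    shift: 360 degrees corresponds to the full row width.
    """
    width = len(equirect_row)
    shift = round(yaw_deg / 360.0 * width) % width
    return equirect_row[shift:] + equirect_row[:shift]

# A toy 8-pixel row: 180 degrees of yaw shifts by half the width.
row = [0, 1, 2, 3, 4, 5, 6, 7]
print(apply_yaw(row, 180))  # → [4, 5, 6, 7, 0, 1, 2, 3]
```

If the stitcher output deviates vertically after such a shift, the deviation cannot be explained by yaw alone.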

I tried the same test in movie mode, too.
The behavior looks very similar to the above.

In fact, I am trying to use the equirectangular images generated by the Theta camera for 3D reconstruction.
(I assume equirectangular images are very useful for robust SfM (Structure from Motion).)
For that purpose, distortion unknown to the user can affect the 3D reconstruction quality.
So I would like to understand it in detail…

Note that I tried this with only our one Z1 camera, so the behavior may differ with other Z1 units.

If anyone has information, please help us!!


I’m not quite sure about the question.

Do any of these topics affect your usage?

auto-levelling topBottomCorrection

https://github.com/ricohapi/theta-api-specs/blob/main/theta-web-api-v2.1/options/_top_bottom_correction.md

visibility reduction

https://github.com/ricohapi/theta-api-specs/blob/main/theta-web-api-v2.1/options/_visibility_reduction.md

static vs dynamic stitching

https://github.com/ricohapi/theta-api-specs/blob/main/theta-web-api-v2.1/options/_image_stitching.md
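For reference, these options can be set over the THETA Web API with `camera.setOptions`. The sketch below only builds the JSON request body (no network call is made); the option values shown (`"Disapply"`, `"static"`) are my reading of the linked specs, so please verify them against your firmware before relying on them.

```python
import json

# Sketch: build a camera.setOptions request body for the THETA Web API.
# The endpoint (not called here) would be
# POST http://192.168.1.1/osc/commands/execute
payload = {
    "name": "camera.setOptions",
    "parameters": {
        "options": {
            # Disable top/bottom (auto-levelling) correction.
            "_topBottomCorrection": "Disapply",
            # Use static stitching so per-shot dynamic alignment is off.
            "_imageStitching": "static",
        }
    },
}
body = json.dumps(payload)
print(body)
```

Fixing both options the same way for every shot removes two sources of shot-to-shot variation from the comparison.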

lens distortion parameters

There is an undisclosed lens distortion property.

People have built stitching apps that allow manual calibration of each camera lens without having the lens distortion property values.


The technique is to manually align objects, such as a tree branch, at the edges of the spheres, then save the lens calibration into a settings file in the stitcher.

The app above is no longer available.


Hi Craig-san,

Thanks a lot for your reply.
To make it easier to understand, I made GIF animations.

In the fisheye images, the pitch (vertical) difference does not look very big.
(GIF animation: fisheyediffS)

But in the generated equirectangular images (top/bottom correction set to 0 in the RICOH THETA Stitcher app), the pitch (vertical) difference becomes large.
(GIF animation: eqdiffS)

From this result, I feel that some “hidden” automatic correction works in the app… (I’m not sure…)
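One way to check whether the pitch difference is really amplified, rather than just looking bigger, is to convert pixel offsets in each projection into degrees and compare like for like. The sketch below assumes an equidistant fisheye model and a roughly 190-degree field of view for the Z1 lens; both are assumptions of mine, since the actual lens model is not disclosed.

```python
def equirect_offset_deg(dy_pixels, image_height):
    """Vertical pixel offset → degrees: the equirectangular projection
    maps 180 degrees of pitch linearly onto the image height."""
    return dy_pixels / image_height * 180.0

def fisheye_offset_deg(dr_pixels, circle_radius, fov_deg=190.0):
    """Radial pixel offset → degrees, assuming an equidistant fisheye
    (radius proportional to angle) whose image circle spans fov_deg."""
    return dr_pixels / circle_radius * (fov_deg / 2.0)

# Example: a 10-pixel offset in a 1800-pixel-tall equirectangular image,
# and a 10-pixel offset at image-circle radius 950 in the fisheye, both
# correspond to about 1 degree under these assumptions.
print(equirect_offset_deg(10, 1800))       # → 1.0
print(fisheye_offset_deg(10, 950, 190.0))  # → 1.0
```

If the measured offsets disagree in degrees, that would point to an extra correction (or a different lens model) inside the stitcher rather than a projection effect.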

In the most recent example, is the output from the DNG/RAW image?

Did you create the animation with two images and the camera rotated 180 degrees on the tripod using the Neewer rotating base?

I’m going to ask other people about this. I’m just trying to clarify the question and test environment first.

Note that the US has a major holiday, Thanksgiving. This normally includes Thursday and Friday.

Is the use case to use the DNG/RAW images from the Z1 for 3D reconstruction?


Steps in Test

  1. Mount Z1 with latest firmware on a tripod on top of the rotating base.
  2. Take first DNG image.
  3. Rotate THETA 180 degrees using the rotating base.
  4. Take second DNG image.
  5. Using the RICOH official desktop stitcher, stitch the first image with Yaw 0.
  6. Using the official desktop stitcher, stitch the second image with Yaw 180.
  7. Compare the first and second images (both produced from DNG).

Hi Craig-san,
Thanks a lot for your reply.

In the most recent example, is the output from the DNG/RAW image?

Yes, correct.
I shot DNG (RAW) files. (I used the “darktable” application to convert them to JPEG.)

Did you create the animation with two images and the camera rotated 180 degrees on the tripod using the neewer rotating base?

Yes, you are correct.
The “Steps in Test” description in your reply is also correct.

Is the use case to use the DNG/RAW images from the Z1 for 3D reconstruction?

Yes, right.
I would like to use equirectangular images for 3D reconstruction.
I also plan to use equirectangular video for 3D reconstruction, because video makes it easy to capture many images over a large area.

Note that the US has a major holiday, Thanksgiving. This normally includes Thursday and Friday.

Understood.
Thanks a lot for your great help!

If you use video, you may want to look at the THETA X as it can take video at 8K 2fps and other framerates. The THETA X 8K video also has IMU data accessible in the metadata in an open specification.

Regarding the still image DNG tests, I will reach out to other people and see if they have more information.

Hi Craig-san,

Thanks a lot for your reply.

If you use video, you may want to look at the THETA X as it can take video at 8K 2fps and other framerates.

Thanks for the information. Unfortunately, we do not have a THETA X now…
I’ll check whether we can get one.

For THETA Z1 video, I ran a check similar to the DNG test, and the situation seems to be similar.

  1. Mount Z1 with latest firmware on a tripod on top of the rotating base.
  2. Start recording a 4K movie.
  3. After several seconds, rotate the THETA 180 degrees using the rotating base.
  4. After several seconds, stop the 4K movie recording.
  5. Generate the equirectangular movie with the official RICOH THETA Windows app (top/bottom correction disabled).
  6. Compare a fisheye movie frame snapshot with an equirectangular movie frame snapshot.
     For the equirectangular frame snapshot after rotation, “Hugin” (https://hugin.sourceforge.io/) can be used to change the Yaw angle.
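For step 6, the frame snapshots can be extracted with ffmpeg (assumed to be installed separately). The sketch below only builds the command line without running it; the file names and the 5-second timestamp are placeholders of mine.

```python
def snapshot_cmd(video_path, out_path, seconds):
    """Build an ffmpeg command that saves a single frame, taken
    `seconds` into the video, as a still image."""
    return [
        "ffmpeg",
        "-ss", str(seconds),   # seek to the timestamp
        "-i", video_path,      # input movie file
        "-frames:v", "1",      # extract exactly one frame
        out_path,
    ]

cmd = snapshot_cmd("theta_4k.mp4", "frame_after_rotation.png", 5)
print(" ".join(cmd))
# To actually run it: subprocess.run(cmd, check=True)
```

Grabbing the before-rotation and after-rotation snapshots from the same recording keeps exposure and stitching settings identical between the two frames.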

(GIF animation: movie_fisheyediff)

(GIF animation: movie_eqdiff)

Regarding the still image DNG tests, I will reach out to other people and see if they have more information.

Thank you so much! It will be very helpful.