I am a real estate photographer who is trying to get into the virtual tour game. I am trying to edit HDR Rendered 360 pictures in Lightroom Classic, but I cannot use the Theta Stitcher plugin because the files are not DNGs. When I upload the pictures to Kuula, there are noticeable, sloppy lines where the picture should be stitched together. Is there any solution to this? I know I can take raw pictures, but the HDR Rendered ones come out so much better. Any input or help would be great!
I do not know if the settings below have any impact on the internal HDR. I do not think the settings below can be adjusted with the official mobile app, but @jcasman and I are looking into building a mobile app to make these adjustments. I've been waiting because I'm not sure what "dynamic" stitching is and what problems "static" versus "dynamic" stitching solves.
I have tried the DNG bracketed photos, but I found that they still do not come out nearly as crisp as the HDR Rendered pictures, especially when it comes to blown-out windows. Here are two pictures that show the stitching lines. They aren't major, but I want to make sure the pictures look as professional as possible before I start to offer them. The Theta Stitcher plugin that I use after editing in Lightroom will not let me upload because the files are JPEGs and not DNG files. I tried to convert and save the pictures as DNGs and then uploaded them to Lightroom again to open them in the plugin, but it says the original needs to be a DNG.
Thank you for posting the picture. That stitch line is more noticeable than you should see with standard HDR (done inside the camera). I don’t have that line on my Z1 HDR (internally created) images. Are you viewing the images with the desktop browser?
Did you just get the camera? If so, have you upgraded the camera to the newest firmware with either the desktop or mobile app?
If you want to pursue stacked DNG for real estate, here’s an example:
If you have Facebook, you can see the 360 image here and assess the color of the window scene.
The THETA Stitcher for Lightroom can only be used if the input is a DNG.
There are a number of options for this type of real estate photography.
Use internal HDR with a tripod. The output is JPG. You can quickly get this up on your tour site. This is the easiest option (see the sketch after this list).
Time Shift plug-in with a tripod. You can quickly take pictures using one lens at a time, and the camera will then merge the pictures.
Single DNG. This provides more light information than a standard JPG and also avoids the ghosting-compensation issues of HDR (which might not be relevant for real estate). You need to stitch the image yourself: the input to the stitcher is a DNG file and the output is a JPG.
DNG HDR for even more light information. You need to merge the files inside the camera into a stacked DNG. This is automatic with a camera plug-in.
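As mentioned in the first option, internal HDR can also be switched on over the camera's Web API. Here is a minimal, untested sketch in Python, assuming the camera is in access-point mode at 192.168.1.1 and the `requests` library is installed:

```python
# Minimal sketch (untested): enable internal HDR over the THETA Web API.
# Assumes access-point mode at 192.168.1.1.
import requests

THETA_URL = "http://192.168.1.1/osc/commands/execute"

def set_internal_hdr():
    """Turn on the camera's internal HDR rendering (_filter = "hdr")."""
    body = {"name": "camera.setOptions",
            "parameters": {"options": {"_filter": "hdr"}}}
    r = requests.post(THETA_URL, json=body, timeout=10)
    r.raise_for_status()
    return r.json()

def take_picture():
    """Trigger a capture; the response contains a command id you can poll for the file."""
    r = requests.post(THETA_URL, json={"name": "camera.takePicture"}, timeout=10)
    r.raise_for_status()
    return r.json()

if __name__ == "__main__":
    print(set_internal_hdr())
    print(take_picture())
```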
I think other people can provide more information on the best technique.
There’s a lot of different options, so I think it can be confusing.
You are welcome! The strange thing is that the line isn't consistently as distinct in every picture. For example, I can't see the line at all in this picture, which I took along with the previous two. No, I use the Kuula program to view all of my pictures and to create all of my virtual tours. Is there a program that works better with the Theta Z1?
I have had the camera for a few months and as far as I know my firmware is up to date. As of yesterday it was update 4.10.1 and the app update was 1.29.0.
I discovered that this line occurs only AFTER I upload and edit the picture in Lightroom. Attached is the same picture twice: the first is without editing in Lightroom and the second is after editing. Any idea as to why?
We want to get static stitching in HDR mode, but in our tests, when HDR mode is on, stitching is auto no matter what parameter we set. If HDR is disabled, stitching works correctly according to the specified parameters. We want to understand why this is happening.
We are using a Ricoh Theta Z1 with firmware 2.11.1. Requests are sent via the WebAPI.
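For illustration, here is a simplified sketch of the kind of setOptions/getOptions requests involved (error handling and state polling omitted; the camera is assumed to be in access-point mode at 192.168.1.1):

```python
# Sketch of the requests used in the test, based on THETA Web API v2.1.
import requests

EXECUTE = "http://192.168.1.1/osc/commands/execute"

def set_options(options):
    body = {"name": "camera.setOptions", "parameters": {"options": options}}
    return requests.post(EXECUTE, json=body, timeout=10).json()

def get_options(names):
    body = {"name": "camera.getOptions", "parameters": {"optionNames": names}}
    return requests.post(EXECUTE, json=body, timeout=10).json()

# Ask for static stitching together with HDR...
set_options({"_filter": "hdr", "_imageStitching": "static"})
# ...then read the options back to see what the camera reports.
print(get_options(["_filter", "_imageStitching"]))

# With HDR disabled, the same stitching setting behaves as expected.
set_options({"_filter": "off", "_imageStitching": "static"})
print(get_options(["_filter", "_imageStitching"]))
```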
@ravil thank you for your tests and the pictures.
Z1 fw 2.11.1, WebAPI. Summary: _imageStitching parameters have no effect when _filter is set to hdr; stitching behaves as if set to auto.
Can you advise on the use case for this?
My idea of the communication process:
@jcasman and I will ask RICOH if this is the “expected” behavior or if it is something that is “unexpected”
If it is “unexpected” and can be corrected by “API configuration”, then we will report back to you here.
If it requires firmware modification, they need to decide if they will fix it and in what timeframe.
If the process reaches the third step, your use case would be valuable information in the decision process. You can also send the use case directly to jcasman@oppkey.com
@ravil I got your emails with more details, thank you. As @craig mentioned, we will report this to RICOH and try to understand if this is expected behavior or not.
Can you upload the original image from the Z1 for two scenarios:
no filter, _imageStitching static
hdr filter, _imageStitching static
You will need to upload to a cloud service such as Google Drive. The file is likely too large to put directly into the forum.
Please either put the links in this forum or send the two links to jcasman@oppkey.com by email or by DM.
Also, have you considered setting stitching to none (dual-fisheye raw) and building the 3D reconstruction directly from the dual-fisheye images? This would avoid the stitching differences.
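As a rough sketch, assuming your firmware accepts "none" as a value for _imageStitching, switching to dual-fisheye output would just be another setOptions call:

```python
# Sketch: request unstitched dual-fisheye output.
# Assumes _imageStitching accepts "none" on this firmware and the camera
# is in access-point mode at 192.168.1.1.
import requests

body = {"name": "camera.setOptions",
        "parameters": {"options": {"_imageStitching": "none"}}}
requests.post("http://192.168.1.1/osc/commands/execute", json=body, timeout=10)
```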
Hi @craig,
I’ve sent you the photos you asked for via e-mail.
Yes, we did consider dual fish-eye in the beginning, but gave up this idea for the following reasons:
The camera (Z1) does not seem to support dual fisheye in HDR mode. When we tried to take an HDR dual-fisheye picture, the camera made a sound and shut itself down. Please do let me know if there are plans to include this option in the future.
Taking 3, 5, or 7 photos with different exposures and downloading them to a mobile device for HDR merging takes too much time for us. The time limit for our application is to return a result within 10-20 seconds.
Performing 3D reconstruction directly from the dual fisheyes would require intrinsic/extrinsic calibration of every lens on every individual camera. We can do this in our lab (I think), but users of our application may not have access to calibration boards, lighting, etc.
My understanding is that the calibration parameters for the lenses are not disclosed. Do let me know if I am wrong; knowledge of those parameters would help me a lot.
May I ask what model you would recommend for the Z1 lenses? Is Scaramuzza's model sufficient for such a wide angle?
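For reference, by Scaramuzza's model I mean the polynomial omnidirectional model that back-projects a pixel to a viewing ray (u, v, f(ρ)). A rough sketch with placeholder coefficients (the real values would come from calibration, e.g. with the OCamCalib toolbox):

```python
# Rough sketch of Scaramuzza's omnidirectional back-projection model.
# The coefficients below are placeholders, not real Z1 values.
import numpy as np

def pixel_to_ray(u, v, cx, cy, poly, stretch=np.eye(2)):
    """Map a pixel (u, v) to a unit 3D viewing ray.

    cx, cy  : distortion centre in pixels
    poly    : [a0, a2, a3, a4] polynomial coefficients (a1 is zero by convention)
    stretch : 2x2 affine matrix modelling sensor misalignment
    """
    # Move to the centred, affinely-corrected sensor plane.
    p = np.linalg.inv(stretch) @ np.array([u - cx, v - cy])
    rho = np.hypot(p[0], p[1])
    a0, a2, a3, a4 = poly
    z = a0 + a2 * rho**2 + a3 * rho**3 + a4 * rho**4
    ray = np.array([p[0], p[1], z])
    return ray / np.linalg.norm(ray)

# Example with placeholder numbers only:
print(pixel_to_ray(1800.0, 1700.0, 1792.0, 1792.0,
                   poly=[-800.0, 0.0005, -1e-7, 1e-11]))
```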
Thank you. We discussed your problem with a RICOH product manager yesterday and agreed that the best course of action is to send the original photos to engineering for advice.
Oh, we forgot about the lack of dual-fisheye in HDR mode. I understand now. Thanks for reminding me.
This is understandable.
This is understandable.
The calibration parameters are not disclosed. It’s not possible to get these parameters.
We can informally ask some of our contacts at RICOH.