I have a Z1 which I am using with stella_vslam with some success. The video settings on the Z1 seem very lacking. I was outside on a cloudy day with some blue sky, walking through the trees, and the trees are so dark you cannot make out any color. In the video you can see the color of the vehicles around, the color of the sky, and a blue roof nearby, but the trees are so dark. I would have thought a 1" sensor would have done a better job, or am I missing something?
See attached pic of the video.
Hi,
Could you please share a short video, either uploaded to YouTube or by sharing the file directly?
I’m a developer, and in my plugin I let users preset exposure compensation before starting to record video. I didn’t notice such an option in the Theta app…
How does stabilization work for you while walking?
Thanks!
Have you adjusted the exposure compensation (EV) and white balance of the video?
To set exposureCompensation for video, I think the camera needs to be in video mode. I believe the image and video settings are separate.
White balance can also be adjusted in video mode.
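For example, a minimal sketch in Python using the THETA Web API over WiFi (camera in AP mode at 192.168.1.1; the EV and white balance values are just illustrations):

```python
import requests

URL = "http://192.168.1.1/osc/commands/execute"

def set_options(options):
    # camera.setOptions is the standard OSC/THETA Web API command
    r = requests.post(URL, json={"name": "camera.setOptions",
                                 "parameters": {"options": options}},
                      timeout=5)
    r.raise_for_status()
    return r.json()

# Switch to video mode first, since image and video settings are separate.
set_options({"captureMode": "video"})

# Then adjust EV (valid values step by 0.3, e.g. -2.0 .. 2.0) and white balance.
set_options({"exposureCompensation": 0.7, "whiteBalance": "daylight"})
```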
The plug-in from @biviel can also do HDR video.
I tried to shorten the video I shot yesterday so I could upload it, but I do not have any software that supports that. Is there some open-source software that allows editing of 360 videos? I used OpenShot, but afterwards I can no longer pan in the video when it is played back. There is freezing rain going on right now, so I can’t go shoot a shorter one at the moment.
The 4K video is not all that good for quality and stability. It’s muddy, over-smoothed, and not sharp at all. Once I run it through the Windows Theta app to convert it into an equirectangular video and play it back in VLC, it looks pretty bad; slightly better in the Theta app. The 4K video coming out of my old Mavic Pro with a smaller sensor looks so much better in quality.
I thought the larger sensor in the Z1 would have had better quality than this.
I did find the EV setting and had not played with it. As soon as I start recording, the video preview blanks off the phone screen, so I cannot tell whether the EV needs changing. It looked OK when I started away from the trees. I had it set on auto and assumed it had more dynamic range than it apparently does.
Does the HDR video plugin work on the Z1? It seems most stuff is for the X. I have not located the plugin yet. Could you provide a link?
You can do the following:
- Upload it as-is to YouTube as a private video, use the online studio there to cut the video, and make the short version available to check.
- My plugin has a lot of options and it’s still in beta, but functional. It’s mainly for live streaming, but it can record video too. Someone streamed 10-11 hours with a Z1 using my plugin a few days back.
- You need to register on flow.tours; after you log in, enter the camera configuration, where you can set the “Plugin State” value to “Record” only. It will only record to local storage, but for now, to use my plugin, you have to connect the Z1 to a WiFi network before starting it. When started, my plugin connects to the flow.tours website, which provides the UI for users to control the plugin settings more easily… but if there is no internet connection at the moment recording starts, it will not work for now. Follow the instructions; there is an older video with instructions too, but at that time “Recording” wasn’t available. The plugin will record video to the internal storage of the Z1.
theta-plugins/plugins/tours.flow.hdrstreaming at main · ricohapi/theta-plugins · GitHub
Some important notes on differences compared to the in-camera recording feature:
a) You can use different video modes, like Equirectangular, HDR Equirectangular, Dual Fisheye, and HDR Dual Fisheye. At the moment, if dual fisheye is used, there are challenges stitching the video efficiently with third-party tools, as the Theta Desktop app is not compatible with it for some reason. HDR Equirectangular provides the best image quality, BUT stitching is custom; as you may know, every device has slight differences in alignment, so I’m working on a process so that every user would be able to adjust his/her own stitching for HDR video mode, but this feature doesn’t work yet in the plugin.
b) You can use H.264 but H.265 encoding too; the latter looks much better.
c) Recording time isn’t restricted; depending on the ambient temperature, it can record for an hour or even several hours, if there is enough storage.
d) The highest usable FPS is 25 for now.
- The Mavic Pro isn’t a 360 cam, so I’m not sure how you are trying to compare: you cut a portion of the screen and compare it with a single 360 view, which is only about ~720p of resolution?
If you try my plugin you can also preset exposure compensation. Feel free to try it, or reach out to me if you have any issues!
I don’t have high-speed internet here, so I was trying to avoid uploading a 2 GB file, as it would take a day and probably fail.
Thank you for the info on how to try out the plugin. Most of the areas where I want to use this will have no internet, not even on a phone: deep in the mountains and also in the jungles of the Amazon. Do you have a time frame for a non-internet-required option?
My goal is to record HDR equirectangular video that is as sharp and well aligned as possible for use with stella_vslam or something better in the future. You mention HDR equirectangular stitching is custom and you are working on an option; I assume there is a default it falls back to for stitching. In the future, could OpenCV and a checkerboard be used to gather the information needed for a config file the plugin uses, to allow for more accurate stitching? Something along the lines of the sketch below is what I have in mind.
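A rough sketch of the idea (hypothetical filenames, and I realize the cv2.fisheye model can’t cover the Z1’s full >180° field of view, so this would only calibrate the central region):

```python
import glob

import cv2
import numpy as np

PATTERN = (9, 6)  # inner-corner count of the printed checkerboard

# Checkerboard corner positions in board coordinates (Z = 0 plane)
objp = np.zeros((1, PATTERN[0] * PATTERN[1], 3), np.float32)
objp[0, :, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2)

obj_points, img_points = [], []
for path in glob.glob("lens_front_*.png"):
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, PATTERN)
    if found:
        obj_points.append(objp)
        img_points.append(corners)

# Solve for this lens's intrinsics (K) and distortion (D); these numbers
# could then go into a per-device stitching config.
K, D = np.zeros((3, 3)), np.zeros((4, 1))
rms, K, D, _, _ = cv2.fisheye.calibrate(
    obj_points, img_points, gray.shape[::-1], K, D,
    flags=cv2.fisheye.CALIB_RECOMPUTE_EXTRINSIC | cv2.fisheye.CALIB_FIX_SKEW)
print("RMS reprojection error:", rms)
```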
And yes, I know the Mavic is not a 360 camera. I just meant that, for a cheap small 4K sensor in a turbulent environment, it appears to have higher dynamic range than the 1" sensors of the Z1.
Thank you for taking the time to answer my questions. I look forward to trying out your plugin. I did not look at it before, since it said live streaming and I did not realize it recorded as well.
thank you
hi!
Welcome! I think I better understand your use case now, thanks for explaining! I suggest you try the plugin first to see if it gives you any improvement vs. recording with the built-in process. I would suggest doing a short test video at a place where you still have internet on your mobile phone, which you can share to the Z1, and making sure it’s connected when you start recording with my plugin.
I’m thinking about implementing an option to be able to start recording video at least for some time after the last configuration was pulled from flow.tours, like 2 weeks or so. For this I would also have to implement at least some exposure adjustments via the Z1 buttons, to increase or decrease exposure compensation. FPS may not be so important to adjust, but EV yes, for sure. I will think this through.
I’m thinking of providing a process and UI on flow.tours where users could make some adjustments themselves and generate a custom stitching configuration that would be saved on flow.tours and directly in the camera, so the next time the user is live streaming or recording, this custom stitching parameter would be applied. I hope I’m clear on this… The issue is that when HDR Equirectangular mode is used, stitching is totally misaligned by the built-in stitching mechanism on the Z1, so I’m trying to correct it a little by modifying the camera texture on the fly while rendering it into the video object. Another option is to make some adjustments on my own so this stitching correction is applied automatically when HDR Equirectangular mode is used, without manual intervention.
For now I would suggest you try my plugin and record in both Equirectangular and HDR Equirectangular modes, but using H.265 encoding. I think even simple equirectangular may look better and sharper when my plugin is used; in this case Ricoh’s regular built-in stitching is used as-is.
Clearly, if you can do the stitching yourself in OpenCV or another tool, then recording in HDR Dual Fisheye could be superior in quality.
Please try it and let me know whether it works for you. Thanks!
I was able to upload the shortened clip of the dark video to YouTube. OpenShot was able to edit the clip and save it, but for some reason it stripped the metadata. I used the Spatial Media Metadata Injector that I found on GitHub to put it back.
Video here:
https://youtu.be/n-D89bipCW4
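In case it helps anyone else, the same trim-and-reinject workflow can be scripted; a sketch assuming ffmpeg and the google/spatial-media tool are installed (filenames are just examples):

```python
import subprocess

# Cut 30 s starting at 1:00 without re-encoding; stream copy keeps quality,
# but the spherical (360) metadata is usually lost in the process.
subprocess.run([
    "ffmpeg", "-ss", "00:01:00", "-t", "30",
    "-i", "full_video.mp4", "-c", "copy", "trimmed.mp4",
], check=True)

# Re-inject the spherical metadata so players and YouTube pan correctly.
# Assumes the google/spatial-media repo is cloned into the working directory.
subprocess.run([
    "python", "spatialmedia", "-i", "trimmed.mp4", "trimmed_360.mp4",
], check=True)
```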
I have the plugin installed and created an account on your site. As soon as I figure out how to connect the camera to the router here in the house, I will give both modes you suggest a try and see which one the SLAM program likes best.
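For the SLAM side, stella_vslam takes an equirectangular camera model in its YAML config; a minimal sketch along the lines of its published THETA examples (the resolution and FPS values below are my assumptions, to be adjusted to the actual file):

```yaml
# Minimal stella_vslam-style camera config sketch
Camera:
  name: "RICOH THETA Z1"
  setup: "monocular"
  model: "equirectangular"
  fps: 25.0
  cols: 3840   # assumed 2:1 equirectangular output
  rows: 1920
  color_order: "RGB"
```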
thanks,
Thanks for sharing! Really, the colors and sharpness don’t look as I would expect…
I forgot to highlight that you can also set the bitrate on flow.tours; if recording only, try 30-40 Mbps in H.265 for better quality. Also, you need to insert spatial metadata if you want to upload to YouTube. 30 FPS is experimental only: I think the Z1 is not able to provide stable 30 FPS video recorded in HDR or non-HDR equirectangular mode; only the dual fisheye modes may work well. In theory, if audio is off, it may work, but I’m not sure yet.
Connecting the Z1 to a WiFi router, a mobile hotspot, or your mobile phone’s internet shared as a hotspot may take some time and effort. Here is a tutorial for the Theta V, but it’s very similar on the Z1 even now. You need to connect it to your phone first and set up the access point name and password, etc. After you are done, you switch the Z1’s WiFi to CL mode by pressing the WiFi button again. Be aware that the Z1 can be set to use either 2.4 GHz or 5 GHz WiFi, so if you only have a 2.4 GHz router, it will clearly not be able to connect while the Z1 is set to 5 GHz.
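Once the Z1 is in CL mode on your LAN, the Web API stays reachable, just with digest authentication; a quick Python sketch (the IP and serial number are placeholders, and the initial password is documented as the digits of the serial number):

```python
import requests
from requests.auth import HTTPDigestAuth

CAMERA_IP = "192.168.1.50"      # the address your router gave the Z1
SERIAL = "THETAYL01234567"      # digest-auth username in client (CL) mode
PASSWORD = "01234567"           # initially the digits of the serial number

# Read back the current video-related options to confirm the connection works.
resp = requests.post(
    f"http://{CAMERA_IP}/osc/commands/execute",
    json={"name": "camera.getOptions",
          "parameters": {"optionNames": ["exposureCompensation", "whiteBalance"]}},
    auth=HTTPDigestAuth(SERIAL, PASSWORD),
    timeout=5,
)
print(resp.json())
```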
However, there is one advantage in this video: while you were walking, it’s stabilized… Once you recorded it, you dragged and dropped it into the Theta Desktop app, so it was stabilized there, right?
For some reason, Ricoh doesn’t expose their stabilization to 3rd-party developers, so if I record a video with my own plugin and try to drag and drop it into the Ricoh desktop app, it won’t work; the app is not able to process it. Probably it’s missing some metadata. Ricoh hasn’t published any information about how this could work, i.e. how a video recorded by a plugin could be processed by the Ricoh desktop app.
I wonder if it’s possible to get the live preview from the Z1 while video is being recorded? It seems like the live preview works during recording in the app I tested in the video below. I only tested the X, but I’m wondering if it works with the Z1. I didn’t realize that the official mobile app screen still went blank while recording video.
Have you compared the metadata of the dual-fisheye from your plug-in to the metadata of the dual-fisheye video from the camera directly? It’s possible that it will stitch with just the metadata injection. I have not tried this.
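One quick way to compare, assuming ffprobe is installed: dump both files’ container and stream metadata and diff them (filenames are hypothetical; proprietary atoms may not show up here, exiftool digs deeper):

```python
import json
import subprocess

def probe(path):
    """Return ffprobe's view of a file's format and stream metadata."""
    out = subprocess.run(
        ["ffprobe", "-v", "quiet", "-print_format", "json",
         "-show_format", "-show_streams", path],
        capture_output=True, text=True, check=True,
    ).stdout
    return json.loads(out)

plugin_meta = probe("plugin_dualfisheye.mp4")
camera_meta = probe("camera_dualfisheye.mp4")

# Start by eyeballing the container-level tags side by side.
print(json.dumps(plugin_meta["format"].get("tags", {}), indent=2))
print(json.dumps(camera_meta["format"].get("tags", {}), indent=2))
```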
Oh, that’s right. The format of the video data for topbottomcorrection is not disclosed. That might be why the video can’t be stitched…
Yes, so metadata attached to the end or beginning of the video file may not be enough here, as it would probably also need to store gyro/accelerometer data somehow. Clearly, if Ricoh were generous on this, they could specify a format: a simple file with timed gyro/accelerometer data saved during video capture, so it could be provided next to the video file, and the desktop app could use that file’s data instead of the same data being captured alongside the video directly.
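Purely as an illustration, such a sidecar could be one JSON line per IMU sample, timestamped against the first video frame (my invented format, not anything Ricoh has specified):

```python
import json

# One sample of the imagined "video_0001.imu.jsonl" sidecar file
sample = {
    "t_ms": 16,                       # milliseconds since the first frame
    "gyro": [0.001, -0.004, 0.002],   # rad/s, camera body frame
    "accel": [0.02, 9.79, 0.11],      # m/s^2, camera body frame
}
with open("video_0001.imu.jsonl", "a") as f:
    f.write(json.dumps(sample) + "\n")
```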
@craig , I think this could make Theta devices superior in video capture too, because:
- my plugin could save in HDR dual fisheye mode + optionally save gyro/accelerometer data in a separate file, in a format agreed with Ricoh or as they provide it.
- the Ricoh desktop app could use the video file and stitch it + optionally stabilize the video when the file with the gyro and accelerometer data is present next to the mp4 file.
what is your opinion?
I was able to get it connected to the house router. Setting up the config for the app is pretty straightforward; I like how you did that. One question about it: what is Worldlock Stabilization?
H.265 does make for much smaller file sizes, which is always nice.
I have not noticed where you turn off audio yet. For my application it’s not needed.
Yes, it does look like the Theta app does the stabilization. I do not have the camera do any correction of the dual fisheye to equirectangular; I tried it once, but there was a pretty large error where the two lenses meet. It’s a bit better using their Theta app.
I notice your output video is 3840x2160 with a banner at the bottom. Have you tried outputting the video with the same bitrate, encoding, and size that normally come out of the camera, to see if the Theta app would read it? I assume that since you cannot get the IMU data, all it would do is correct a dual fisheye into a properly formatted equirectangular video.
I tried a couple of videos in your app with different settings here in the house, since the weather is still too wet to venture out in. The HDR is really nice; I am shocked the camera does not have this by default. However, the lack of stabilization leaves a lot to be desired. I will run them through the SLAM program and see if it cares about stabilization. I know having IMU data with my homemade lidar scanner sure helps with SLAM.
A couple of other nice things to have in the flow.tours app would be a start timer, so you could press start and then move the camera into position, and a beep or chime to let you know it has actually started recording.
The offline function would be nice. If you ever get hit by a bus or no longer support the app, we lose the use of it. Maybe make an offline option for recording only, since if you’re streaming you need a network anyway.
I’m glad it worked for you. Try turning on Worldlock Stabilization, but for now it works in non-HDR mode only, so pick simple Equirectangular. I process the gyro data on the fly and built a custom algorithm to try to eliminate the bigger rotational shakes while walking and also to keep the orientation of the viewer “locked”. I think this feature was called Worldlock by another 360 camera brand too.
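To give an idea of the principle, here is a minimal sketch of the general approach (a simplified illustration, not the plugin’s actual code): integrate the gyro into a camera orientation, then remap each equirectangular frame by the inverse rotation so the world stays fixed:

```python
import cv2
import numpy as np
from scipy.spatial.transform import Rotation

def remap_equirect(frame, R):
    """Rotate an equirectangular frame by 3x3 rotation matrix R."""
    h, w = frame.shape[:2]
    lon, lat = np.meshgrid((np.arange(w) / w - 0.5) * 2 * np.pi,
                           (0.5 - np.arange(h) / h) * np.pi)
    # Unit ray for every output pixel, rotated into the source frame
    xyz = np.stack([np.cos(lat) * np.sin(lon),
                    np.sin(lat),
                    np.cos(lat) * np.cos(lon)], axis=-1) @ R.T
    src_lon = np.arctan2(xyz[..., 0], xyz[..., 2])
    src_lat = np.arcsin(np.clip(xyz[..., 1], -1, 1))
    map_x = ((src_lon / (2 * np.pi) + 0.5) * w).astype(np.float32)
    map_y = ((0.5 - src_lat / np.pi) * h).astype(np.float32)
    return cv2.remap(frame, map_x, map_y, cv2.INTER_LINEAR,
                     borderMode=cv2.BORDER_WRAP)

R_cam = Rotation.identity()

def on_gyro(omega_rad_s, dt):
    """Accumulate camera orientation from one gyro sample."""
    global R_cam
    R_cam = R_cam * Rotation.from_rotvec(np.asarray(omega_rad_s) * dt)

# Per video frame: undo the accumulated camera rotation.
# stabilized = remap_equirect(frame, R_cam.inv().as_matrix())
```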
It seems not to be related to bitrate, but probably to metadata, or both. I haven’t found a tool yet that could easily copy the whole metadata block from one video to another; honestly, it could simply be shared by Ricoh. @craig , could you please ask if they could provide some information about what the desktop app requires? I have tried several times already, and spent 1-2 days on it, but couldn’t find a solution yet.
@3dmapmaker , I can get IMU data; I’m using it for the Worldlock feature too. But saving it directly into the mp4 in a way that would be recognized by the Desktop app is clearly impossible, as that format again is not exposed by Ricoh, and they may have used a custom format for it. That’s why I suggest the external-file approach, so that material like videos taken by plugins would also be able to leverage desktop app features (like stitching and stabilization). In that case the Desktop app should be easy to modify to read IMU (gyro, accelerometer, and potentially magnetic) data from an external file too. The Z1 would rock for sure. I could make sure that stitching in HDR gets much better soon, or it could be stitched by the desktop app directly too.
Yes, I agree. HDR looks pretty good… especially when recorded with H.265 encoding. I spent 10-12 months of effort on this plugin over the last 3 years altogether; try to estimate the engineering effort and cost at, say, 7-8000 USD/month… At the moment I’m pushing my 360 streaming platform instead and only minimally maintaining the plugin, but I’m planning an update soon. The lack of stabilization is an issue for sure. I use a DJI Osmo to attach my Z1; it works quite well, but it’s clearly a bit visible when the viewer looks at the bottom.
I’m not sure I understand the timer part; please explain a bit more.
I’m building a streaming platform, so the Z1 will be supported, as it’s very special to me. It’s the only 360 camera that can live stream using the SRT protocol, now that I have made it work. This means that someone in the US should be able to stream directly to a server in Germany without losing much latency along the way, and bandwidth is also fine from preliminary tests. Live streaming through mobile data is much more efficient using SRT vs. RTMP; UDP vs. TCP transfer makes a huge difference in this case.
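As an illustration of the transport difference, this is roughly what an SRT caller looks like with ffmpeg (the endpoint is a placeholder, and ffmpeg must be built with libsrt):

```python
import subprocess

# Push a test file over SRT in caller mode. SRT rides on UDP and only
# retransmits what is actually lost, which is why it beats RTMP/TCP
# on lossy mobile links.
subprocess.run([
    "ffmpeg", "-re", "-i", "test_360.mp4",
    "-c", "copy", "-f", "mpegts",
    "srt://stream.example.com:9000?mode=caller&latency=200000",  # latency in microseconds
], check=True)
```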
Yes, some kind of offline recording should be allowed, I understand, but I need to check the effort. At this point I will never be able to charge for the work I have invested anyway, but I enjoyed working on it, with the help of @craig and @jessearmandse and this forum. I would buy a Theta Z1 again; there is still no camera like it at the moment. The Insta360 Pro 2 may provide similar quality during a live stream, BUT it’s 10 times heavier and it’s BIG.
I’m not sure of the exact use case of @3dmapmaker , but I’ve heard this request before. People want to get out of the shot to reduce editing. If they are in the video or not in position, it makes it more difficult to do photogrammetry. Editing the file may also mess up the metadata.
Regarding the use of the official RICOH desktop app to stitch video from a plugin, I’ll ask if people have advice.
Thanks, Craig! Also, if you could ask about the gyro/accelerometer (IMU) data, whether it could easily work from an external file, that would be great.
OK. I want to set your expectations: I do not think it is possible to use the desktop app with an external data file for IMU data.