HDR Wireless Live Streaming (Flow Tours) - 4K resolution in YouTube Live

@biviel, a community member is asking about shooting HDR video with the Z1. I pointed them to FlowTours. You may want to provide them with additional information.

Set RICOH THETA to Stitch Video Inside the Camera - YouTube

Another person is asking about wireless live streaming to OBS. Maybe they can use your FlowTours plug-in instead?

RICOH THETA Live Streaming Tutorial with OBS - YouTube


Thanks, I added comments there!

  1. Recording video

HDR video can be recorded by my plugin in both equirectangular and dual-fisheye modes; both H.265 and H.264 encoding work. For now there is a stitching issue that I may come back to and fine-tune at a later point. When saving in dual-fisheye mode, the video can be stitched afterwards with a Mistika VR/PTGui combination; both are needed the first time, and Mistika VR can also stabilize the video. I'm planning to demonstrate this workflow once I get there. :slight_smile: It really looks amazing, but the Mistika VR price is quite high. There is a monthly subscription option to try it.

  2. Live streaming to OBS - this is not possible; also, the Theta X is not yet supported by my plugin (work in progress). The issue here is that we can publish to an RTMP server, but OBS does not run an RTMP server. Actually, when using my plugin there is no need for OBS: you can stream directly to any platform that supports RTMP.
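For readers who still want the feed inside OBS, one common workaround (my suggestion, not a feature of the plugin) is to run a small local RTMP server, point the plugin at it, and have OBS read the stream back with a Media Source. A minimal config fragment, assuming the nginx-rtmp-module is installed:

```nginx
rtmp {
    server {
        listen 1935;          # standard RTMP port
        application live {
            live on;          # accept live publishing
        }
    }
}
# Plugin publishes to:   rtmp://<pc-ip>:1935/live/<key>
# OBS Media Source URL:  rtmp://localhost:1935/live/<key>
```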

How does Mistika VR work with PTGui on the videos? I thought PTGui was only for still images.

Also, doesn’t your plug-in save the live stream to the camera in equirectangular video format?

Or, are you saying that the HDR video has an imperfect stitch because it is a stream and the in-camera stitcher has some problems with it? Are you saying that you want to improve the stitching to be comparable to what the RICOH desktop app can do with dual-fisheye video?

Hi,
PTGui can be used to create a correct, close-to-perfect stitch. Saving that project file and importing it into Mistika VR is the key here.

Mistika VR is able to stitch the video that I record with my plugin in dual-fisheye mode well, AND it also adds stabilization; it also supports the H.265 MP4 that comes from my plugin.

My plugin can save at 24 fps in equirectangular mode, and the video looks exactly like it does during a live stream. Of course, if you set the video bitrate to 50 Mbps in H.265, it looks quite good. BUT stitching in the camera has issues. I use RICOH's algorithm when rendering to equirectangular, but in HDR mode the stitching issue was bigger, so I did some OpenGL operations to modify each frame a bit; it's still not so good. I was thinking of implementing a custom stitching mechanism, but 24 fps may still not be enough for some users.

Another option could be to get some details from RICOH about what needs to be stored in the MP4 so that the Theta desktop app can stitch it directly - I mean, to stitch the MP4 that is recorded by my plugin.

I did some investigation, and in theory I could store metadata on the fly, but I'm not sure how that would affect the FPS. I could even save gyroscope and accelerometer data, but I'm not sure about the format used by the Theta desktop app. By the way, does it use gyro and accelerometer data? I think so, since it does some kind of stabilization, doesn't it?

Clearly, with my plugin I'm able to record in HDR mode, and RICOH's internal recording can't do that. It's a huge quality improvement, especially indoors, but also outdoors when it's not a very sunny day. I'm also able to record for an hour or even more, especially if H.265 is used for encoding.

The problem with Mistika VR is that it costs 60-80 USD per month, and the process is also a bit complex the first time. I tried PTGui and was able to get quite good results.

The format is not disclosed. It does not appear to be CAMM format.

The desktop app has a feature called Top/Bottom correction which is video stabilization. The data it uses to do this is not disclosed.

Yes, I thought so… It's a bit sad, as it could really be the best camera even for recorded videos. I also recently found a potential option to record high-resolution videos; going beyond 4K may also work, clearly at some lower FPS. BUT if recording in dual fisheye, it could work even at 25 or 30 fps in 5.7K. Once I have some time I will give it a try.


Hi @craig,
Good news! I was able to optimize my own broadcasting system behind flow.tours. With the Theta Z1 I can deliver live streaming with ~1 second latency while simultaneously transcoding for multiple different devices: ~1 second latency on desktop AND ~1 second latency on mobiles at the same time. I still need to render it to the camera, as it's 2D for now, but that will not affect latency.

I measured this latency when streaming to a destination server in Germany, ~700 miles from the camera location, while watching the live stream delivered to Hungary. So the "stream" travelled ~1400 miles end to end.

The current HDR Wireless Live Streaming plugin version in the store can already do this with my platform behind flow.tours. I'm now building two-way communication on top of this. And yes, it's important to highlight that the quality is still high, even higher than what I streamed to YouTube, but instead of 25-30 seconds of latency it's down to 1 second now. Of course, viewing requires a faster, stable network; I'm still optimizing for watching on mobile devices via 4G or 5G networks, so in some cases this 1 second can be 2 seconds.

I'm really pushing this so that EVERYONE will be able to try it ASAP.

Laszlo



This is great news. I’ve only used HDR Wireless Live Streaming with YouTube.

Is FlowTours back-end system open for testing?

I can grant you access to try it and see its real quality. There are multiple workflows and optimizations.

Use case 1: stream to flow.tours and deliver to a huge number of viewers via the AWS CloudFront CDN; this could start testing now. I can grant you access if you would like to test. Here latency is 10 seconds, but quality on both desktop and mobile is better than YouTube, since it can consume the H.265 stream directly. By the way, this can ingest AND deliver 8K directly too. Clearly 8K is not for the Z1, but viewing an 8K 360° stream is a must-see thing, and it works on the majority of desktops when the Chrome browser is used.

Use case 2: stream to flow.tours with low latency and a limited number of viewers (10-50, not sure yet), but latency is 1-2 seconds depending on the devices used to watch. Work is still in progress.

For now my media server is in Europe, so latency may be more for you. Clearly, I can deploy media servers all over the world; I will get there too at some point, hopefully.

If I were to make a review video of the low latency, could I have a timer playing on my computer monitor and then see the difference in time in the stream?

I’m not planning to test it right now. I’m wondering about how best to show people the benefits of FlowTours. The low latency might be easy to show.


That's how I measured. Low latency will take some time to open up for people to try; I'm wondering about the latency when someone streams from California…
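The timer-on-monitor method comes down to simple arithmetic: film the live timer and the played-back stream side by side, read both clocks off one frame of the screen recording, and subtract. A minimal sketch; the function name and timestamp format are my own, not part of the plugin:

```python
from datetime import datetime

def glass_to_glass_latency(clock_on_monitor: str, clock_in_stream: str) -> float:
    """End-to-end latency in seconds, given two HH:MM:SS.mmm readings taken
    from the same frame of a screen recording: the live timer on the monitor,
    and the (older) timer visible inside the played-back stream."""
    fmt = "%H:%M:%S.%f"
    live = datetime.strptime(clock_on_monitor, fmt)
    seen = datetime.strptime(clock_in_stream, fmt)
    return (live - seen).total_seconds()

# Example: the monitor shows 12:00:05.000 while the stream still shows
# 12:00:03.900 → roughly 1.1 seconds of glass-to-glass latency.
```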


If we show the low latency between California and Hungary (or Germany), is there a way people can sign up on a waiting list to get more information about your service?

Are you trying to get inquiries on this contact form?

Or, are you focused on the activity on this Facebook Group?


Hi @craig,
I'm actually watching both. I will create another Facebook group that is for the platform rather than the plugin.

Things will move faster now that my plugin is there. Also, when streaming to my platform I will support raw dual-fisheye format too, which means even better quality AND much longer operation time. So in theory, if someone streams from a Z1 to flow.tours, they should be able to hold a conference meeting, and attendees will see a 360° live view with 1-2 second latency if they are on the same continent as the host. Attendees will be able to talk to the host directly via flow.tours. Also, something could occasionally be delivered back to the camera; we'll see about this. And there is e-commerce behind it, so in theory the host could even "sell" things that visitors can buy, ask questions, etc. Clearly the basic workflow is a guided tour in a museum, but since the beginning we have also been thinking about use cases such as car dealerships, among others. Regarding surveillance-like latency of ~1 second or less, there is a third workflow that is not developed yet.
What's interesting is that I'm planning to combine two approaches: 1. low latency (1-2 seconds) and 2. normal latency (~10 seconds). The same camera source would be used, and by default the majority of people would join the normal-latency view, BUT if they get interested or want to interact, they can switch to the low-latency stream on the fly based on some rules. So there would be a 7-8 second "time travel" during the event.
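The default-to-normal, switch-on-interaction routing described above can be sketched as a tiny selection rule. All names here are invented for illustration; the real rules and the exact viewer cap are still undecided per the post:

```python
def pick_stream(wants_to_interact: bool, low_latency_slots_free: int) -> str:
    """Route a viewer: the low-latency stream is reserved for interactive
    viewers while capacity (e.g. the 10-50 viewer limit) remains; everyone
    else watches the normal ~10-second-latency stream."""
    if wants_to_interact and low_latency_slots_free > 0:
        return "low_latency"   # ~1-2 s behind live
    return "normal"            # ~10 s behind live
```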


I’m first doing a recording test.

If the test goes well, I’m interested in progressing to a live streaming test.

9am PT in California is 6pm GMT +2 in Hungary.

initial test of “record only” worked


works in VLC

I’m uploading it to YouTube now as a test and I am recording another test clip.

The 30-second test clip works, with the camera held in my hand and the camera sideways.

Test with 4GB file.


28 minute clip



Thanks for testing, @craig! Did it shut down because of heat after 28 minutes? Also, it works with H.265, but some users reported a visible black hole, so please skip H.265 for now.

I also made a recording in dual fisheye and tried to stitch it with PTGui and Mistika VR. I'm not too good at stitching, but I will upload it to YouTube and share it here.

The problem with that file is that after I stitch it in Mistika VR, the audio is out of sync. If I watch the original dual-fisheye file, it works perfectly. I will contact Mistika VR support; this is definitely an issue on their side. Anyway, I will share it here soon.

I turned it off at 28 minutes because I was having problems with my USB-C charger; this is unrelated to the plug-in. I'm trying another test now with H.264. It's been recording for 40 minutes so far, I think.

It would be nice if you could put a timer on the OLED screen of the Z1 to show how long the stream or recording has been running. I'm using a stopwatch on my computer to see when to turn off the plug-in.

Is it supposed to record the audio to the video file as well?

I have the Z1 19GB model. The max time per recording is normally 25 minutes, and the total video recording limit is normally 40 minutes. If I record at 24 fps H.264, how long do you think I can record with the video file saved inside the camera? It's looking like it may be 8GB an hour; the 30-minute video was 4GB.
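The 8 GB/hour guess can be checked with quick arithmetic: a 4 GB file over 30 minutes implies roughly 17.8 Mbps average, and dividing the free space by that rate gives a rough recording budget. A sketch using decimal GB/Mb and ignoring container overhead; the function names are mine:

```python
def observed_bitrate_mbps(file_gb: float, minutes: float) -> float:
    """Average bitrate (Mbps) implied by a recorded file's size (decimal GB)."""
    return file_gb * 1e9 * 8 / (minutes * 60) / 1e6

def recording_budget_hours(free_gb: float, bitrate_mbps: float) -> float:
    """Approximate hours of recording that fit in free_gb at a given bitrate."""
    return free_gb * 1e9 / (bitrate_mbps * 1e6 / 8 * 3600)

# 4 GB in 30 minutes ≈ 17.8 Mbps; at that rate ~19 GB of storage holds a
# bit over 2 hours of video (less in practice, since the OS and other
# files occupy part of the 19 GB).
```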


I'm currently waiting for my 28-minute video to process. It's a quick way to show that it's possible to take a video longer than 25 minutes.


The Z1 is drawing more than 500mA, fluctuating between 0.6A and 1.31A. Not all USB ports can supply more than 500mA, so I have the Z1 plugged into a wall charger.

Agreed, and thanks; it makes sense to display some timing data. However, I wouldn't update it too often - probably every minute, counting minutes only, so as not to refresh the "screen" too frequently.
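The once-a-minute refresh idea can be isolated in a small counter so the display code only redraws when the minute value actually changes. A sketch in Python rather than the plugin's actual (Android) code; the class and the injectable clock are illustrative only:

```python
import time

class MinuteCounter:
    """Tracks whole elapsed minutes; the caller redraws the OLED only when
    the value changes, i.e. at most once per minute."""

    def __init__(self, clock=time.monotonic):
        self._clock = clock          # injectable for testing
        self._start = clock()
        self._last_shown = -1

    def elapsed_minutes(self) -> int:
        return int((self._clock() - self._start) // 60)

    def needs_redraw(self) -> bool:
        """True at most once per minute, when the displayed value is stale."""
        now = self.elapsed_minutes()
        if now != self._last_shown:
            self._last_shown = now
            return True
        return False
```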

Yes, audio is also recorded! When recording in equirectangular, 24 fps is the maximum I can get right now with the Z1. In dual fisheye, 30 fps also works fine.

The video is still processing, so the resolution is still very low on YouTube, but you can take a look a bit later: https://www.youtube.com/watch?v=1oJ5ml-LFPs

Hi,
did the processing of that ~30-minute video finish? I think H.265 is the key here: recording with H.265 at 20 Mbps will look superb, and 25 fps was also stable during my internal tests. There is also a cheaper tool that may help with stabilization, CyberLink PowerDirector. I haven't gotten around to uploading recorded files to YouTube yet; I was pushing streaming so far. But since it was easy to make it record to a file, I thought, why not try it and see later how it works? I think I may even be able to inject spatial metadata directly, but that may add 1-2% CPU usage, so 25 fps may not work, only 24 fps. Anyway, for me streaming was the most important thing.

The video you uploaded looks great.

I successfully uploaded the 28 minute video, but it is still processing on YouTube.

I may try and upload it again.

I tried to take a 3-hour 16-minute video, but it put my camera into an unbootable state, showing only a triangle with an exclamation point in the center.

I was able to boot it again with

adb shell
reboot bootloader

I’m not sure what the command actually did, but the Z1 is working again and I’m going to try a 60 minute video recording this time. I’m 50 minutes into the new test.

Craig, it may be that the storage was full; I didn't test that scenario, i.e., the action "stop recording and close the MP4 properly when storage is full". Not sure. A user from Taiwan who recorded almost 3 hours didn't report anything like that. I'm sorry - that file was lost, right? I made the plugin stop recording and close the file when a thermal shutdown is initiated, but I didn't do anything specifically for a storage check. 60 minutes will be fine; even 2 hours should be fine if 30 minutes takes 8-9 GB. But you tried that 3-hour video at a lower bitrate, right? 12 Mbps in H.264?