HDR Wireless Live Streaming (Flow Tours) - 4K resolution in YouTube Live

@biviel Based on the video views, it seems like there is interest in your plug-in. Are you getting analytics from the RICOH Plug-in Store on your plugin downloads?

I have a solution for offline use after an online activation: saving a unique key to the plugin's local system storage, valid for something like 30 days or longer. This isn't in this version yet. I would love to see people start using this plugin, and I will make decisions accordingly.
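The grace-period idea above can be sketched roughly as follows. This is a minimal illustration, not the plugin's actual implementation: the file name, key format, and exact 30-day window are all assumptions.

```python
import json
import time
from pathlib import Path

GRACE_SECONDS = 30 * 24 * 3600          # assumed 30-day offline window
STATE_FILE = Path("activation.json")    # hypothetical local storage location

def save_activation(unique_key, now=None):
    """Persist the key and the time of the last successful online check."""
    now = time.time() if now is None else now
    STATE_FILE.write_text(json.dumps({"key": unique_key, "checked_at": now}))

def is_activated_offline(now=None):
    """Allow offline use only if an online activation happened within the window."""
    if not STATE_FILE.exists():
        return False
    state = json.loads(STATE_FILE.read_text())
    now = time.time() if now is None else now
    return (now - state["checked_at"]) < GRACE_SECONDS
```

On each start the plugin would first try an online check and refresh the stored timestamp; only when that fails would it fall back to `is_activated_offline`.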

Where can I see the number of downloads of my plugin? I wasn't aware of this… thanks!

Did you get an email about the analytics when you joined the partner program?

I just checked it today and it still works.

Hi, everyone.
The official Wireless Live Streaming Plugin just updated to ver.1.2.2.

  • For RICOH THETA X, the new video sizes “4K, 15 fps”, “2K, 15 fps”, and “1K, 15 fps” are now available.
  • Selecting a video size with a lower frame rate will reduce power consumption during video recording.
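For reference, video size and frame rate on a THETA are normally selected through the `camera.setOptions` command of the THETA Web API. Below is a sketch of the request body for the new "4K, 15 fps" mode; the exact `fileFormat` fields a given firmware accepts (in particular `_frameRate`) should be verified against the official API reference.

```python
import json

# Build a camera.setOptions payload selecting 3840x1920 MP4 video at 15 fps.
# When connected to the camera's access point, this would be POSTed to
# http://192.168.1.1/osc/commands/execute (the standard OSC endpoint).
def video_4k_15fps_payload():
    payload = {
        "name": "camera.setOptions",
        "parameters": {
            "options": {
                "fileFormat": {
                    "type": "mp4",
                    "width": 3840,
                    "height": 1920,
                    "_frameRate": 15,  # lower fps -> lower power draw and heat
                }
            }
        },
    }
    return json.dumps(payload)
```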

Notice of RICOH THETA plug-in upgrade

Regards,
Toyo

PS.
The X doesn’t need to reduce power consumption, though. What do you think?

The power consumption management is intended to lower heat. At 4K 15 fps, my first test with the camera lasted 10 minutes before thermal shutdown. I am doing a second test with a small $10 fan powered from an external USB port.

This means “the X works without a battery”.
Why would a device with no battery need power consumption management?

I do not understand it at all.
Toyo

The greater the power consumption, the greater the heat. It’s like a laptop getting hot when you encode a video. If a device is consuming more power, it will get hotter.

I’ve been streaming at 4K 15fps for 90 minutes. I’m going to stop it now as I’m going home for the day.

@biviel A member from the community is asking about shooting HDR video with the Z1. I pointed them to FlowTours. You may want to provide the person with additional information.

Set RICOH THETA to Stitch Video Inside the Camera - YouTube

Another person asking about Wireless Live Streaming to OBS. Maybe he can use your FlowTours plug-in instead?

RICOH THETA Live Streaming Tutorial with OBS - YouTube

Thanks, I added comments there!

  1. Recording video

HDR video can be recorded by my plugin: both H.265 and H.264 encoding work, in equirectangular and dual-fisheye modes. There is a stitching issue at first; I may come back to it and fine-tune it at a later point. When saving in dual-fisheye mode, it can be stitched afterwards with a Mistika VR/PTGui combo (both are needed the first time), and Mistika VR can also stabilize the video. I’m planning to demonstrate this workflow once I get there. :slight_smile: It really looks amazing, but the Mistika VR price is quite high. There is a monthly subscription option to try.

  2. Live streaming to OBS - this is not possible; also, the THETA X is not yet supported by my plugin (work in progress). The issue is that the plugin can publish to an RTMP server, but OBS does not run an RTMP server internally. Actually, when using my plugin there is no need for OBS at all: it can stream directly to any platform that supports RTMP.
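Publishing directly means the camera only needs an ingest address and a stream key. Here is a small sketch of how such a publish URL is typically assembled; the YouTube ingest address is the publicly documented one, and the key shown is a placeholder.

```python
def rtmp_publish_url(server, stream_key):
    """Join an RTMP ingest server address and a stream key into a publish URL."""
    if not server.startswith(("rtmp://", "rtmps://")):
        raise ValueError("not an RTMP endpoint")
    return server.rstrip("/") + "/" + stream_key

# YouTube's primary RTMP ingest; the real key comes from YouTube Studio.
url = rtmp_publish_url("rtmp://a.rtmp.youtube.com/live2", "xxxx-xxxx-xxxx-xxxx")
```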

How does Mistika VR work with PTGui on the videos? I thought PTGui was only for still images.

Also, doesn’t your plug-in save the live stream to the camera in equirectangular video format?

Or, are you saying that the HDR video has an imperfect stitch because it is a stream and the in-camera stitcher has some problems with it? Do you want to improve the stitching to be comparable to what the RICOH desktop app can do with the dual-fisheye video?

hi,
PTGui can be used to create a correct, close-to-perfect stitch. Saving that project file and importing it into Mistika VR is the key here.

Mistika VR can stitch the video that I record with my plugin in dual-fisheye mode well, and it also adds stabilization; it supports the H.265 MP4 that comes from my plugin, too.

My plugin can save at 24 fps in equirectangular mode, and the video looks exactly like it does during a live stream. Of course, with the video bitrate set to 50 Mbps in H.265 it looks quite good. BUT in-camera stitching has issues: I use RICOH’s algorithm when rendering to equirectangular, but in HDR mode the stitching issue was bigger, so I apply some OpenGL operations to adjust every frame slightly, and it’s still not so good. I was thinking of implementing a custom stitching mechanism, but 24 fps may still not be enough for some users.
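For context on what a custom stitcher involves: each equirectangular output pixel corresponds to a direction on the sphere, which is sampled from one of the two fisheye circles. Below is a minimal sketch assuming an ideal equidistant lens model; a real Z1 needs the calibrated distortion, lens-offset, and seam-blending parameters that tools like PTGui solve for, which is exactly the hard part.

```python
import numpy as np

def equirect_to_fisheye_maps(out_w, out_h, fish_size, fov_deg=190.0):
    """Sampling maps from equirectangular pixels into a side-by-side
    dual-fisheye frame, assuming an ideal equidistant fisheye lens."""
    lon = (np.arange(out_w) / out_w) * 2 * np.pi - np.pi      # -pi .. pi
    lat = np.pi / 2 - (np.arange(out_h) / out_h) * np.pi      # pi/2 .. -pi/2
    lon, lat = np.meshgrid(lon, lat)
    # Unit view ray for every output pixel.
    x = np.cos(lat) * np.sin(lon)
    y = np.sin(lat)
    z = np.cos(lat) * np.cos(lon)
    front = z >= 0                                # which lens sees this ray
    # Angle from the lens optical axis (front axis +z, back axis -z).
    theta = np.arccos(np.clip(np.where(front, z, -z), -1.0, 1.0))
    max_theta = np.radians(fov_deg) / 2
    r = (theta / max_theta) * (fish_size / 2)     # equidistant projection
    phi = np.arctan2(y, np.where(front, x, -x))   # mirror x for the back lens
    cx = np.where(front, fish_size / 2, fish_size * 1.5)  # circle centers
    map_x = cx + r * np.cos(phi)
    map_y = fish_size / 2 + r * np.sin(phi)
    return map_x.astype(np.float32), map_y.astype(np.float32)
```

Per frame, the two maps can be fed straight to `cv2.remap`; blending the seam where the two lens images overlap is what this ideal model leaves out.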

Another option could be to get details from RICOH about what must be stored in the MP4 so that it can be stitched directly with the THETA desktop app, I mean the MP4 that is recorded by my plugin.

I did some investigation, and in theory I could store metadata on the fly, but I’m not sure how that would affect the FPS… I could also save gyroscope and accelerometer data, but I’m not sure about the format used by the THETA desktop app. By the way, does it use gyro and accelerometer data? I think it does, since it applies some kind of stabilization, doesn’t it?

Clearly, with my plugin I’m able to record in HDR mode, which RICOH’s internal recording can’t do. It’s a huge quality improvement, especially indoors, but outdoors too when it’s not such a sunny day. I’m also able to record for an hour or even more, especially if H.265 is used as the encoding.

The problem with Mistika VR is that it costs 60-80 USD per month, and the process is also a bit complex the first time. I tried PTGui and was able to get quite good results…

The format is not disclosed. It does not appear to be CAMM format.

The desktop app has a feature called Top/Bottom correction which is video stabilization. The data it uses to do this is not disclosed.
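For comparison, CAMM (the open camera-motion metadata format used by, e.g., Google's spherical-video tooling) stores each sensor reading as a small binary sample in its own MP4 track: a reserved int16, an int16 packet type, then the payload; type 2 is a gyroscope reading in rad/s and type 3 an accelerometer reading in m/s². A sketch of packing one gyro sample (again, whatever RICOH actually stores is undisclosed):

```python
import struct

def camm_gyro_sample(gx, gy, gz):
    """Pack one CAMM type-2 sample: little-endian int16 reserved (0),
    int16 packet type (2 = gyroscope), then three float32 rad/s values."""
    return struct.pack("<hh3f", 0, 2, gx, gy, gz)
```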

Yes, I thought so… It’s a bit sad, as it could really be the best camera even for recorded videos. And I just recently found a potential option to record high-resolution videos; going beyond 4K may also work, clearly with some lower FPS, BUT if recording in dual fisheye, it could work even at 25 or 30 fps in 5.7K. Once I have some time, I will give it a try.

hi @craig,
good news! I was able to optimize my own broadcasting system behind flow.tours… With the THETA Z1 I can deliver live streaming with ~1 second latency while transcoding at the same time for multiple different viewing devices: ~1 second latency on desktop and ~1 second latency on mobile simultaneously. I still need to render it back to the camera, as it’s 2D for now, but that will not affect latency.

I measured this latency when I streamed to a destination server in Germany, ~700 miles from the camera location, while watching the live stream delivered to Hungary, so the stream traveled ~1400 miles end to end.

The current HDR Wireless Live Streaming plugin version in the store can do this with my platform behind flow.tours. I’m now building two-way communication on top of this. And yes, it’s important to highlight that the quality is still high, even higher than what I streamed to YouTube… but instead of 25-30 seconds of latency, it’s down to 1 second now. Of course, viewers need a faster, stable network; I’m still optimizing for watching on mobile devices via 4G or 5G networks, so in some cases this 1 second can be 2 seconds…

I’m really pushing this so that EVERYONE will be able to try it ASAP.

Laszlo


This is great news. I’ve only used HDR Wireless Live Streaming with YouTube.

Is FlowTours back-end system open for testing?

I can grant you access to try it and see its real quality. There are multiple workflows and optimizations.

Use case 1: stream to flow.tours and deliver to a huge number of viewers via the AWS CloudFront CDN. This could start testing now; I can grant you access if you would like to test. Here the latency is 10 seconds, but the quality on both desktop and mobile is better than YouTube’s, since the H.265 stream can be consumed directly. By the way, this can ingest AND deliver 8K directly too. Clearly 8K is not for the Z1, but viewing an 8K 360° stream is a must-see thing, and it works on the majority of desktops when the Chrome browser is used.

Use case 2: stream to flow.tours with low latency for a limited number of viewers (10-50, not sure yet), where the latency is 1-2 seconds depending on the devices used to watch… work is still in progress.

For now my media server is in Europe, so latency may be higher for you at the moment… Clearly, I can deploy media servers all over the world, and hopefully I will get there eventually.

If I were to make a review video of the low latency, could I have a timer playing on my computer monitor and then see the difference in time in the stream?

I’m not planning to test it right now. I’m wondering about how best to show people the benefits of FlowTours. The low latency might be easy to show.
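The arithmetic behind the timer method is just a subtraction: read the timer as it appears live on the monitor and, at the same instant, the (older) timer value visible inside the played-back stream. A trivial helper, with illustrative numbers:

```python
def glass_to_glass_latency(live_timer_s, stream_timer_s):
    """End-to-end latency: how far the timer visible inside the stream
    lags behind the live on-screen timer, both in seconds."""
    lag = live_timer_s - stream_timer_s
    if lag < 0:
        raise ValueError("the stream cannot be ahead of the live clock")
    return lag

# e.g. the monitor shows 125.0 s while the stream still shows 123.9 s
latency = glass_to_glass_latency(125.0, 123.9)   # ~1.1 s
```

A screen recording that captures both the monitor and the stream player in one frame makes the two readings easy to take.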

That’s how I measured it. Low latency will take some time to open up for people to try; I’m wondering about the latency when someone streams from California…

If we show the low latency between California and Hungary (or Germany), is there a way people can sign up on a waiting list to get more information about your service?

Are you trying to get inquiries on this contact form?

Or, are you focused on the activity on this Facebook Group?

hi @craig,
I’m actually watching both. I will create another Facebook group that is for the platform and not for the plugin…

Things will move faster now that my plugin is there. When streaming to my platform I will also support raw dual-fisheye format, which means even better quality AND much longer operation time. So in theory, if someone streams from a Z1 to flow.tours, they should be able to hold a conference meeting where attendees see the 360° live view with 1-2 seconds of latency if they are on the same continent as the host. Attendees will be able to talk to the host directly via flow.tours. Occasionally, some camera feed could also be delivered back; we will see. And there is e-commerce behind it, so in theory the host could even “sell” things that visitors can buy, ask questions about, etc. Clearly the basic workflow is a guided tour in a museum, but since the beginning we have also been thinking about use cases like car dealerships and others. For surveillance-like scenarios with ~1 second latency or less, there is a third workflow that is not developed yet.
What is interesting is that I’m planning to combine two approaches: 1. low latency (1-2 seconds) and 2. normal latency (~10 seconds). The same camera source would be used, and by default the majority of people would join the normal-latency view, BUT if they get interested or want to interact, they can switch to the low-latency stream on the fly based on some rules, so there would be a 7-8 second time jump during the event.
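The on-the-fly switch described above boils down to moving a viewer between two renditions of the same source and accepting a forward jump roughly equal to the latency gap. A sketch of that bookkeeping; the stream names, latencies, and the resulting 8-second jump are illustrative only.

```python
from dataclasses import dataclass

NORMAL_LATENCY_S = 10.0   # CDN-delivered stream, unlimited viewers
LOW_LATENCY_S = 2.0       # limited-capacity interactive stream

@dataclass
class Viewer:
    stream: str = "normal"
    position_s: float = 0.0   # event time currently on the viewer's screen

def switch_to_low_latency(viewer):
    """Move a viewer to the low-latency stream; returns the forward jump."""
    jump = NORMAL_LATENCY_S - LOW_LATENCY_S
    viewer.stream = "low"
    viewer.position_s += jump   # viewer skips ahead by the latency difference
    return jump
```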
