Join us on November 16, 2pm PT for a live streaming event in 360! We will use Wi-Fi streaming from the RICOH THETA Z1 and the FlowTours HDR Wireless Live Streaming Plug-in. Our special guest Laszlo Vargas, CEO of FlowTours, will answer questions live via chat.
Is it more popular to use humans or robots to manage the live stream?
How can I apply AI processing to the live stream?
Can I use Tensorflow?
Can I use OpenCV?
Can I use VSLAM on the live stream?
What’s the power requirement for long-term streaming?
What’s the meaning of the thermometer warning icon on the OLED of the Z1?
Can I stabilize the live stream when I’m walking?
Can I access the orientation sensor data of the camera while I am live streaming?
Can I adjust exposure compensation during the live stream when I am moving between light and dark areas such as a bright music stage to the audience?
Can I use H.265 compression for streaming?
Does YouTube support H.265?
Is there a noticeable benefit in using H.265 versus H.264?
What bitrate should I use in different streaming conditions?
What framerate should I use? Is there a difference between 24fps and 30fps from an artistic perspective?
Will lowering the framerate to 24fps actually reduce stuck frames or improve image quality in practice?
What are keyframes in the video live stream?
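A quick note on the keyframe question ahead of the event: a keyframe (I-frame) is a complete, self-contained image, while the frames between keyframes encode only differences, so the keyframe interval determines how quickly a player can recover from lost data or a mid-stream join. Encoders usually express it as a GOP size in frames; the 2-second interval below is a common choice for live platforms, not a FlowTours-specific setting.

```python
# Sketch: converting a keyframe interval in seconds to the GOP size
# (frames between keyframes) that encoders actually configure.
def gop_size(fps: int, keyframe_interval_s: float) -> int:
    """Frames between keyframes for a given framerate and interval."""
    return round(fps * keyframe_interval_s)

# A 2-second keyframe interval is a common choice for live platforms.
print(gop_size(30, 2))  # 60
print(gop_size(24, 2))  # 48
```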
Can I stream in dual-fisheye?
Can I connect the RICOH THETA to a Raspberry Pi and then use the Raspberry Pi to relay the stream over the network?
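On the Raspberry Pi relay question, one common approach is to read the THETA as a UVC device on the Pi and push the stream out with ffmpeg. The sketch below only builds the command line; the device path, ingest URL, stream key, and encoder settings are placeholder assumptions, not tested values.

```python
# Sketch: using a Raspberry Pi as a relay by reading the THETA as a UVC
# device and pushing RTMP with ffmpeg. Device path, stream key, and
# encoder settings below are placeholder assumptions.
import shlex

def build_relay_cmd(device: str, rtmp_url: str, bitrate_kbps: int = 16000):
    return [
        "ffmpeg",
        "-f", "v4l2", "-i", device,   # read the UVC webcam device
        "-c:v", "libx264",            # re-encode for RTMP delivery
        "-b:v", f"{bitrate_kbps}k",
        "-preset", "veryfast",        # favor speed on a small board
        "-f", "flv", rtmp_url,        # RTMP ingest expects FLV muxing
    ]

cmd = build_relay_cmd("/dev/video0",
                      "rtmp://a.rtmp.youtube.com/live2/STREAM_KEY")
print(shlex.join(cmd))
```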
Can I power the RICOH THETA Z1 indefinitely from a Jetson Nano or Raspberry Pi?
This live event is broadcast to YouTube using FlowTours and their HDR Wireless Live Streaming Plug-in. The RICOH THETA Z1 will be connected to the Internet over Wi-Fi in client mode. We will stream in HDR equirectangular at 24fps with a 16Mbps bitrate.
Hi @craig,
Yes, I checked your configuration on flow.tours and adjusted/reverted it. It would be good if you could do a test stream at the exact location with the actual settings. I set the video bitrate to 16Mbps; we need to see whether the Wi-Fi is good there. Also, if the Z1 is set to 2.4GHz Wi-Fi mode, the signal is available at longer distances, so it may be stronger there. 5GHz Wi-Fi is faster, but its range is shorter than 2.4GHz Wi-Fi networks; that's my conclusion from the tests I did. But I'm not sure how the network is set up there for you.
Yes, the same for me when I stream from here to YouTube. I'm considering asking you to stream to my platform instead, since the incoming stream on flow.tours was nice and stable, and it would reduce latency. I did this before, and when my platform was used as a relay, the latency toward YouTube was reduced… strange.
@biviel, the flow.tours platform is really nice. However, I'm still thinking we're better off with YouTube for the event despite the technical limitations of YouTube Live. The latency and the lack of 2-way video conferencing on YouTube Live are a problem, but YouTube provides archival and promotion features for live events. I believe the live event archive is included in the YouTube recommendation engine.
BTW, you mentioned that your system can do 8K with a non-RICOH camera. Are you compressing the 8K video in H.265 inside the camera? I thought there were some problems with H.264 and H.265 hardware compression above 4K video?
Does your system support spatial audio to allow audio orientation of a scene?
What about stereo audio for a live music event?
Does your HDR Wireless Live Streaming plug-in adjust between light and dark changes? For example, if I walk from an underground parking garage to bright sunlight, will I be able to see good views in both locations?
Yes, I agree to stream to YouTube. I was only suggesting streaming toward flow.tours using H.265 encoding over SRT, and from there restreaming the same feed to YouTube Live. On the YouTube end it would look like you were streaming directly to it, but you could leverage H.265 encoding set to 16Mbps, and flow.tours could stream 20 or 25Mbps toward YouTube. This provided the best possible quality during my tests. But it may not be necessary for you, since you tested and your bandwidth was fine at the location. By default I'm planning to support this "restreaming" toward YouTube from flow.tours, although the latency and quality would be different, and no backward communication/audio is possible. In this scenario flow.tours could even act as a video mixer at the end. Simple restreaming is there already.
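A minimal sketch of the restream step described above: ingest the incoming H.265/SRT feed and transcode it to H.264 for YouTube's RTMP ingest (which, over RTMP, generally expects H.264). The URLs and bitrate below are placeholders, not flow.tours' actual pipeline.

```python
# Sketch: restreaming an incoming SRT/H.265 feed to an RTMP ingest that
# expects H.264. URLs and bitrates here are placeholder assumptions.
def build_restream_cmd(srt_in: str, rtmp_out: str,
                       out_bitrate_kbps: int = 20000):
    return [
        "ffmpeg",
        "-i", srt_in,                 # e.g. srt://0.0.0.0:9000?mode=listener
        "-c:v", "libx264",            # transcode H.265 -> H.264 for RTMP
        "-b:v", f"{out_bitrate_kbps}k",
        "-c:a", "copy",               # pass the audio through untouched
        "-f", "flv", rtmp_out,
    ]

print(" ".join(build_restream_cmd(
    "srt://0.0.0.0:9000?mode=listener",
    "rtmp://a.rtmp.youtube.com/live2/STREAM_KEY")))
```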
Yes, H.265 is used; the 8K video is produced by the camera directly. Both H.264 and H.265 can work beyond 4K; it depends on the hardware and profile levels, as far as I know.
Spatial audio is not supported yet, but stereo is available: simple two-channel audio.
Yes, this was important. During a live stream or recording, it adjusts exposure automatically in such cases. There is also a manual adjustment available in the plugin, using "exposure compensation" (-10…+10) in the flow.tours configuration for the THETA Z1. If you know the environment will be mostly dark or under artificial lighting, it is better to set exposure compensation to -5 or -8. Still, this is only advised if the environment is mostly dark, like when I was talking with you outside, next to our basement.
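For reference, exposure compensation can also be set directly through the camera's own Web API (RICOH THETA API v2.1), which uses EV values in 1/3 steps rather than the plugin's -10…+10 scale. This sketch assumes the camera is in access-point mode at its default address 192.168.1.1.

```python
# Sketch: setting exposure compensation via the RICOH THETA Web API
# (v2.1), camera in access-point mode at 192.168.1.1. The camera API
# uses EV values in 1/3 steps, unlike the plugin's -10..+10 scale.
import json
import urllib.request

ALLOWED_EV = [-2.0, -1.7, -1.3, -1.0, -0.7, -0.3, 0.0,
              0.3, 0.7, 1.0, 1.3, 1.7, 2.0]

def build_payload(ev: float) -> dict:
    """Build the camera.setOptions request body for a given EV."""
    if ev not in ALLOWED_EV:
        raise ValueError(f"unsupported EV value: {ev}")
    return {"name": "camera.setOptions",
            "parameters": {"options": {"exposureCompensation": ev}}}

def set_exposure(ev: float, host: str = "192.168.1.1"):
    req = urllib.request.Request(
        f"http://{host}/osc/commands/execute",
        data=json.dumps(build_payload(ev)).encode(),
        headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# e.g. darken slightly for a bright scene: set_exposure(-0.7)
```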
For high-bandwidth, low-latency requirements, Odience enables carriers with a virtualized service deployable within a carrier's MEC platform, enabling 8K tiled 360-degree live video streams that consume up to 80% less bandwidth compared to standard streams. With the 5G MEC platform operating within the carrier's network, RCS/IMS services (from advanced communication to chatbots) can easily be integrated and overlaid within the MEC-enabled 360 live streams.
The RICOH THETA is limited to 4K live streaming. However, other cameras that are more expensive can stream 8K. This would likely take a lot of bandwidth, perhaps more than the average person has available.
We’ll discuss this future trend in the event.
My personal opinion is that most of the audience is still struggling to get a reasonable stream at 4K. I don't think most of the viewing audience can watch an 8K stream.
Just to clarify: 8K 360 streaming does require high bandwidth upstream from the camera, but Odience reduces the downstream bandwidth requirements to audiences to very reasonable levels by sending only the video data needed for each user's unique field of view and zoom level.
The result is bandwidth levels far lower than YouTube 4K 360, at far lower latency: roughly 8Mbps and approximately 1 second glass-to-glass latency for 8K 360, as opposed to 20-30Mbps and 20-30s latency for 4K 360 streamed via YouTube.
We use this to enable real-time interactive experiences for audiences, e.g. for shopping and entertainment use cases.
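For readers curious how viewport-dependent (tiled) streaming can save that much bandwidth, here is a back-of-envelope sketch: only the tiles inside the viewer's field of view are sent at full quality, with the rest of the sphere at a low-resolution fallback. The tile fractions and bitrates below are illustrative only, not Odience's actual numbers.

```python
# Back-of-envelope sketch of viewport-dependent (tiled) 360 streaming:
# full quality inside the field of view, a low-res fallback elsewhere.
# All fractions and bitrates here are illustrative assumptions.
def viewport_bitrate(full_bitrate_mbps: float, fov_deg: float = 90.0,
                     low_res_fraction: float = 0.1) -> float:
    """Approximate delivered bitrate when FOV tiles are full quality
    and the rest of the sphere is a low-res fallback."""
    visible = fov_deg / 360.0  # rough horizontal coverage of the sphere
    return full_bitrate_mbps * (visible + (1 - visible) * low_res_fraction)

# A hypothetical 40 Mbps full-sphere 8K stream, 90-degree FOV:
print(round(viewport_bitrate(40), 1))  # 13.0
```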
Could you please share which camera was used for those samples available through the Odience app for mobile?
Those samples are still available to restream for the public, right? If someone is interested, they can download the app, register, and watch? I'm asking because when I checked the sample "RUDSAK" a few weeks back, I couldn't watch it for some reason.
We use KanDao Obsidian R and Insta360 Pro 2 cameras for professional 360 content streaming. The Odience app is free on the AppStore and PlayStore and has 6 or 7 publicly available restream events. This content was originally live streamed but is now available as restreams.
Simultaneous viewers of a live or restream event receive the event content at the same time, and they can also chat or video call each other to discuss or watch the content together within the app.