Yes, absolutely! Clearly the ultra-low-latency requirement applies only on the local network for now: controlling a drone may work with ~200 ms latency, but not with ~2 seconds. 4K resolution is still a limit in this case, both for 360 cams and for delivery to viewers. At ~4 seconds of latency the image quality is better, and some interactivity with the presenter can still be achieved.
There are some limitations in how 360 video is rendered inside these VR devices, so some tweaking will be required there too, as I did with the Z1. I will hopefully get there this year.
It will be weeks, and once the plugin reaches the plugin store it may take me more time to settle all those options; I'm not planning to expose all functionality at once. I have control over that: my website generates a JSON configuration data feed, and not all options are exposed to the public via the web UI yet, going step by step. There are a lot of parameters to adjust and no graphical UI for each of them yet, so it will take more time.
Sure, I'm glad to share it with you directly first if it helps your project.
On Meta Quest I'm testing my streams with the built-in browser, watching my stream directly on the flow.tours website. So not directly: on Meta Quest I'm not aware of any app that could ingest and play an incoming RTSP or SRT stream directly. I was thinking of creating one, but decided to try browser support first.
The project used Motion JPEG and the browser inside the VR headset.
I would recommend writing your own Theta V plugin that establishes a WebRTC connection with the base station. WebRTC is a P2P media-streaming standard designed for real-time communication, and it uses H.264 (which the Snapdragon 625 has hardware encoding for) for video. It is well documented and is your best bet for low-latency video streaming at high resolution and frame rate. It is somewhat complex and will take some time, but it would be well worth it. Here are some examples of WebRTC.
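To make the WebRTC flow concrete: the standard deliberately leaves signaling undefined, so before any media flows, the plugin and the base station have to exchange an SDP offer/answer pair over some channel of your choosing (most apps use JSON over a WebSocket). A minimal sketch of that envelope, using only common conventions and hypothetical names rather than any specific library's API:

```python
import json

def make_signaling_message(kind: str, sdp: str) -> str:
    """Wrap an SDP blob in a minimal JSON signaling envelope.

    WebRTC does not standardize signaling; this "type"/"sdp" shape is
    just the common convention. 'kind' is "offer" or "answer".
    """
    if kind not in ("offer", "answer"):
        raise ValueError("kind must be 'offer' or 'answer'")
    return json.dumps({"type": kind, "sdp": sdp})

# The camera-side plugin would send its offer to the base station,
# which replies with an answer built the same way:
offer = make_signaling_message("offer", "v=0\r\no=- 0 0 IN IP4 127.0.0.1\r\n")
print(json.loads(offer)["type"])  # offer
```

Once the offer/answer (plus ICE candidates) are exchanged, the actual H.264 media flows peer-to-peer; an actual implementation would hand these SDP strings to a WebRTC stack rather than build them by hand.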
It was difficult to keep high quality when I last looked into it several years back. I tried using native C++ libraries and functions for best performance, but that may not be the only way.
Thanks for sharing, @craig! Sending local WebRTC from the Z1 and previewing it in the browser of the Meta Quest and other devices would work in theory, in 4K and H.264 only, without audio, because no proper audio encoder is available for WebRTC on Theta devices. But I'm busy with other things now, so I'm not sure when I'll be able to get there. And for the quality I'm aiming for it would take more effort, and I'm not sure about it at all.
With my new plugin I'm already able to stream to desktop VLC directly, but instead of RTSP I used SRT. I'm not aware of the differences between vlc-unity and VLC, but I assume it should work in Unity too, with H.265 encoding, wirelessly.
And I can connect to my module directly and pull and watch the stream in H.265, H.264, HDR mode, etc. with ultra-low latency at full 4K quality. I wasn't able to make VLC play this in 360 mode, though; I may need a custom build or an extension, and then we're done here.
Now in VLC Unity, I assume that if the srt_input access module is there, it should work the same way.
When my HDR Wireless Live Streaming plugin is turned on and running on the Z1, connected via Wi-Fi to the same network as the device running VLC, I start the stream like this in VLC:
where IP is the IP of the Z1 and port is the port set on the Z1 for SRT. A stream ID or a passphrase may be added there too if needed.
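As a sketch of what such a URL looks like: VLC accepts SRT sources in the `srt://host:port` form, and stream ID and passphrase can be appended as query parameters following the libsrt convention (the host and port below are placeholders, not values from the Z1 plugin):

```python
from typing import Optional
from urllib.parse import urlencode

def build_srt_url(host: str, port: int,
                  streamid: Optional[str] = None,
                  passphrase: Optional[str] = None) -> str:
    """Assemble an srt:// URL of the kind VLC accepts
    via Media > Open Network Stream."""
    params = {}
    if streamid:
        params["streamid"] = streamid
    if passphrase:
        params["passphrase"] = passphrase
    query = "?" + urlencode(params) if params else ""
    return f"srt://{host}:{port}{query}"

# e.g. a Z1 at 192.168.0.10 with the plugin listening on port 8554:
print(build_srt_url("192.168.0.10", 8554))  # srt://192.168.0.10:8554
```

In VLC the resulting URL goes into Media > Open Network Stream; receive-side buffering can then be adjusted with VLC's caching settings to trade latency against robustness.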
This is the preferred way of streaming, as I was able to get maximum quality and the lowest latency at the same time, and I can fine-tune the latency in VLC and in the camera too.
This feature isn't in the current plugin version in the store yet, but I can provide you an APK to test if you'd like. I finished this recently, am doing some more tests, and will then send it to Ricoh as an update. Btw, I couldn't find the web form for submitting a plugin update; could you help please?