As far as I have dug into it, you can livestream video AND audio via:
- USB streaming to OBS (and from there to wherever you please, but this is not wireless)
- USB streaming inside Unity (with webcam and microphone devices)
- Wireless Live Streaming Plug-In (directly to YouTube, Facebook)
- THETA RTSP Streaming Plug-In (catching the stream in VLC, GoPro VR Player, PotPlayer)
- Device WebAPI Plug-In (with the provided RTSP URL, again catching the stream as above)
- Infomorph Live (with the WebRTC protocol, visiting your THETA's specific HTTPS stream URL)
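For the RTSP options above, the stream can also be caught in code rather than in VLC, which is handy for quick sanity checks. A minimal sketch with OpenCV; the host, port and path below are placeholder assumptions, not values the plug-ins guarantee, so use whatever URL your plug-in actually advertises:

```python
def rtsp_url(host, port=8554, path="live"):
    """Build an RTSP URL; defaults are placeholders, check your plug-in's settings."""
    return f"rtsp://{host}:{port}/{path}"

def show_stream(url):
    """Display an RTSP video stream frame by frame (video only, no audio)."""
    import cv2  # deferred import: pip install opencv-python

    cap = cv2.VideoCapture(url)
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break  # stream ended or dropped
        cv2.imshow("THETA RTSP", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
    cv2.destroyAllWindows()

if __name__ == "__main__":
    # 192.168.1.1 is the THETA's default address in access-point mode.
    show_stream(rtsp_url("192.168.1.1"))
```

Note that OpenCV's `VideoCapture` decodes only the video track, so even an RTSP stream that carries audio would need a separate player (or ffmpeg) to hear it.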
I have not tested the RTSP and WebRTC solutions above myself, because of some errors and problems that I will describe in a future post.
Brief description of my project:
My colleague and I are working on telepresence with a Ricoh THETA V and real-time control of a UGV. He handles the UGV part, so I cannot answer questions about that. My part is the real-time video and audio streaming from the camera, projected inside a VR headset (HTC Vive).
My research so far has led me to solutions such as this drone project with its amazing web VR viewer by @Jake_Kenin, this amazing project for Raspberry Pi and this project using Python, both by @Hugues, and this Unity project updated by @KEI.
All of the above, except the first project by @Hugues (which also has no audio), use the THETA Web API's camera._getLivePreview, which delivers a stream of JPEG frames, so if I am not mistaken, no audio is included in the stream.
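To make the JPEG-frames point concrete: the preview response is one long byte stream containing consecutive JPEG images and nothing else, which is why there is no audio track to extract. A hedged sketch of splitting that stream into frames; the command name is the one used above, the 192.168.1.1 endpoint is the camera's default access-point address, and the naive marker scan is a simplification, not production code:

```python
import json
import urllib.request

JPEG_SOI = b"\xff\xd8"  # JPEG start-of-image marker
JPEG_EOI = b"\xff\xd9"  # JPEG end-of-image marker

def extract_jpeg_frames(buf):
    """Split a motion-JPEG byte buffer into complete JPEG frames.

    Returns (frames, remainder): each frame is one full JPEG image;
    the remainder is a trailing partial frame to prepend to the next
    chunk. Naive marker scan -- good enough for preview frames.
    """
    frames = []
    while True:
        start = buf.find(JPEG_SOI)
        if start < 0:
            return frames, b""
        end = buf.find(JPEG_EOI, start + 2)
        if end < 0:
            return frames, buf[start:]
        frames.append(buf[start:end + 2])
        buf = buf[end + 2:]

def live_preview(host="192.168.1.1"):
    """Yield JPEG frames from the live preview command (video only, no audio)."""
    req = urllib.request.Request(
        f"http://{host}/osc/commands/execute",
        data=json.dumps({"name": "camera._getLivePreview"}).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        pending = b""
        while True:
            chunk = resp.read(4096)
            if not chunk:
                return
            frames, pending = extract_jpeg_frames(pending + chunk)
            yield from frames
```

Each yielded frame is a standalone image with no embedded sound, so any audio would have to arrive over a separate channel and be synchronized by hand.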
So, considering all the APIs Ricoh provides (Camera API, Web API, USB API), my final question is:
How do I get audio streaming alongside the camera._getLivePreview method, or via any other low-latency method?
P.S.: Sorry for the long post, but consider it a guide to all the research I have done so far.