I need some guidance with this. My main goal is to connect the Theta S to a Raspberry Pi and stream video to an Android app I'm building myself.
I have already set up two different streaming servers on the RPi:
mjpg-streamer: almost zero delay, but it streams in dual-fisheye mode.
ffserver: streams equirectangular (stitched) video using ffmpeg's remap filter, but the delay is about 40 seconds (a sketch of that pipeline is below).
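For reference, this is roughly what the ffserver pipeline on the RPi looks like; the device node, map file names, and feed URL are just examples, not necessarily what you'd have on your own setup:

```
# Rough sketch of the RPi-side pipeline (example paths/filenames):
# capture the dual-fisheye feed from the Theta S, remap it to
# equirectangular with pre-generated PGM map files, and push the
# result to the local ffserver feed.
ffmpeg -f v4l2 -i /dev/video0 \
       -i xmap_thetas.pgm -i ymap_thetas.pgm \
       -filter_complex "[0:v][1:v][2:v]remap" \
       http://localhost:8090/feed1.ffm
```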
I know there is a UVC Blender driver for Windows that works pretty well when the Theta S is connected over USB, so my question is this:
Can I, on Windows, receive the dual-fisheye video stream from the RPi, stitch it somehow, and re-stream it to my Android app?
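To be concrete, the "receive and re-stream" part I'm imagining on the Windows box would be something like the sketch below (hostnames and ports are made up); the missing piece is the stitching step in between, which is what I'm asking about:

```
# Hypothetical relay on the Windows machine (hostnames/ports are placeholders):
# pull the dual-fisheye MJPEG stream from mjpg-streamer on the RPi, re-encode it,
# and send it on to the Android device. The stitching would have to happen
# somewhere in the middle, and that's the part I don't know how to do.
ffmpeg -i "http://raspberrypi.local:8080/?action=stream" \
       -c:v libx264 -preset ultrafast -tune zerolatency \
       -f mpegts udp://android-device.local:5000
```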
I would really appreciate it if someone could point me in the right direction here. If anyone thinks there is a better option than putting a Windows machine in the middle, I'd be glad to hear it.
I don't really care about video quality loss (at least at this point) as long as it works.
Something I forgot to mention: my RPi won't have internet access.
I had come across that GitHub project, but I discarded it because it needs internet access to reach the Cloud API… Am I right?
Do you know if that project could work completely offline?