RTSP Plugin -- dual video streams?

Has anyone looked into the possibility of exposing separate RTMP streams for each camera, instead of a single stream that stitches both or the dual-fisheye image? It seems like it would be a good way to get the raw data out of the camera at the best possible quality, and then allow either real-time stitching or offline post-processing later.

Just wondering if someone else has looked and found it to be impossible, and/or seen a path to doing it… not worth spinning cycles on it if it’s known to be something that won’t work!


Hi,
The live streaming plugin I develop can stream in dual-fisheye format. At the moment, both fisheye lens images are on the same frame.

It can stream stitched or unstitched. I also built a web application that does the stitching if required. Let me know if you have any more questions. BTW, the plugin can stream in RTMP, RTSP, and SRT formats too.


Can I test the plug-in using RTSP streaming on my local network to a viewer such as VLC, and display it in both equirectangular and dual-fisheye?

What are the configuration steps I need to do for this?

I have limited upstream bandwidth and would like to test it on a local network.

Hi, the issue is that the plugin publishes RTSP to a server URL, so there must be a server capable of ingesting RTSP, just as with RTMP. It’s not a pull by the server, but rather a push, as in the case of RTMP.

A server could be set up as highlighted here: dominoc925: How to quickly publish an RTSP stream from a webcam on Ubuntu. The tool has a Windows distributable too, though I haven’t used it myself. If you run it, you should connect the Z1 and the PC to the same Wi-Fi router and make sure that on Windows (or Ubuntu) the UDP ports are open, or at least the required ones.
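As a concrete sketch of that push-then-pull setup: the IP address, port, and stream path below are made-up examples, assuming rtsp-simple-server’s default RTSP listen port of 8554:

```shell
# Example LAN address of the PC running rtsp-simple-server,
# the server's default RTSP listen port, and an arbitrary stream path.
SERVER_IP=192.168.0.100
RTSP_PORT=8554
STREAM_PATH=z1

# URL the plugin PUSHES to (set this as the publish destination):
PUBLISH_URL="rtsp://${SERVER_IP}:${RTSP_PORT}/${STREAM_PATH}"

# The same URL is what a viewer PULLS once the camera is publishing:
#   ./rtsp-simple-server      # start the server on the PC first
#   vlc "$PUBLISH_URL"        # then open the stream in VLC
echo "$PUBLISH_URL"           # → rtsp://192.168.0.100:8554/z1
```

The key point is that the server does nothing until the camera connects out to it, so the Z1 and the PC must be able to reach each other on the same network.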

If the server is up, you can set the RTMP server and key through flow.tours, using IP numbers instead of localhost, and follow the steps in that article. This is only in theory, of course. Ant Media Server should also work locally; try it with RTMP, I used it for testing in the past. Of course, it’s missing the VR 360 page view for sure…

Regards,
Laszlo

Thanks for the link to that article. I can understand how the RTSP server is set up from a webcam on /dev/video*. However, I’m not sure how the video stream gets onto the Ubuntu box when using the plug-in.

If I specify the IP address on my local private network in the RTMP server configuration on FlowTours, it seems like I need to run Ant Media Server Community Edition on my local network to receive the RTMP stream?

How is the RTSP server (using rtsp-simple-server) running on Ubuntu connected to the Ant Media Server, and why do I need it? Can’t the Ant Media Server function as an RTSP server that I connect to with a client like VLC? Or maybe it can’t.

BTW, I met with another company yesterday that was interested in live streaming from a robot for industrial purposes.

I was thinking that your plug-in might be usable over Ethernet, as there may be Wi-Fi reflection problems where the robot will be used. I’m assuming that we’ll be able to get Ethernet working again in the future.

If you proceed with Ant Media Server you don’t need anything else; that’s probably the easiest way to go. But you will have to use RTMP to publish to it, which works just fine. On flow.tours nothing happens: you just define the destination server and key to stream to, which can be any URL.

In this case, if you set up Ant Media locally on a PC with IP 192.168.0.100, you push RTMP to rtmp://192.168.0.100:1935, and the key should be set according to the Ant Media Server; probably any stream key is allowed, so “mystream” works too. So the full URL you would need to set on flow.tours would be rtmp://192.168.0.100:1935/mystream. Of course, for this to work on the local Wi-Fi network, you have to make sure that the Z1 and the PC running Ant Media are connected to the same Wi-Fi network… I think Ant Media provides a built-in HTML player too; of course it’s not 360, but it should work for testing.
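To make the pieces explicit, here is a minimal sketch of how the publish URL is assembled, using the example values from the description above. The note about an application segment is my own caveat about typical Ant Media installs, not something stated in this thread:

```shell
# Made-up example values for a local-network test.
ANT_IP=192.168.0.100   # LAN IP of the PC running Ant Media Server
RTMP_PORT=1935         # Ant Media's default RTMP ingest port
STREAM_KEY=mystream    # any stream key the server will accept

# Full publish URL to enter on flow.tours. Caveat: some Ant Media
# setups expect an application segment in the path as well
# (e.g. rtmp://HOST:1935/LiveApp/KEY) — check the server's config.
RTMP_URL="rtmp://${ANT_IP}:${RTMP_PORT}/${STREAM_KEY}"
echo "$RTMP_URL"       # → rtmp://192.168.0.100:1935/mystream
```

Once the camera is pushing, the stream can be checked with Ant Media’s built-in web player, or pulled with any RTMP-capable client pointed at the server.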

I can make it work over Ethernet, but the device needs to be in developer mode, and through Vysor you have to manually grant permission to allow the VPN connection when connecting for the first time. It’s only necessary once. Not sure if that will work for them? Still, Wi-Fi is really simple, and you could use a Raspberry Pi as a gateway, acting as a wireless router close to the Z1. Not sure what you mean by Wi-Fi reflection?


I wonder how this process will work on the THETA X, which has a screen. Is it possible that if the plugin asks for permission, the user would be able to hit the ALLOW button on that screen, or will the screen not be able to show system-level notifications? Have you put the THETA X into developer mode already?
Do you know when Ricoh will update their camera API and plugin SDK to cover the THETA X?

Thanks!