I’ve been using the OSC API v2.1 to get a live-preview stream using Python on OS X. That works. Now I want to display it via OpenCV, also using Python on OS X. In particular, I’d like to use the cv2.VideoCapture(url) call. However, this seems to require a URL that supports HTTP GET, whereas the OSC camera.getLivePreview command requires HTTP POST.
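One way around the GET/POST mismatch is to skip cv2.VideoCapture entirely: make the POST yourself and decode the MotionJPEG frames by hand. A minimal sketch, assuming the camera's default access-point address of 192.168.1.1 (adjust for your setup) and that the `requests` and `opencv-python` packages are installed:

```python
def jpeg_frames(byte_stream):
    """Yield complete JPEG blobs from an iterable of byte chunks,
    e.g. the multipart MotionJPEG body of camera.getLivePreview."""
    buf = b""
    for chunk in byte_stream:
        buf += chunk
        while True:
            start = buf.find(b"\xff\xd8")            # JPEG start-of-image marker
            end = buf.find(b"\xff\xd9", start + 2)   # JPEG end-of-image marker
            if start == -1 or end == -1:
                break
            yield buf[start:end + 2]
            buf = buf[end + 2:]

def preview_loop():
    # Third-party imports kept local so jpeg_frames() stays stdlib-only.
    import cv2
    import numpy as np
    import requests

    # Default OSC endpoint when connected to the THETA in AP mode.
    resp = requests.post(
        "http://192.168.1.1/osc/commands/execute",
        json={"name": "camera.getLivePreview"},
        stream=True,
    )
    for jpg in jpeg_frames(resp.iter_content(chunk_size=4096)):
        frame = cv2.imdecode(np.frombuffer(jpg, np.uint8), cv2.IMREAD_COLOR)
        if frame is None:
            continue
        cv2.imshow("preview", frame)
        if cv2.waitKey(1) == 27:  # Esc to quit
            break
```

From there each decoded frame is an ordinary NumPy array, so anything OpenCV can do to a still image it can do per frame.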
Has anyone got the cv2.VideoCapture api to work with OSC with the Theta V?
Another article on this forum indicates that “THETA UVC Blender” is required. Can someone explain: a) whether this is relevant to using VideoCapture, b) exactly what “THETA UVC Blender” is and why it is required, and c) what is used on non-OS X platforms?
Note that in both examples, there is some type of relay to make the MotionJPEG stream from camera.getLivePreview usable by a web browser.
You might be able to point cv2.VideoCapture(url) at the FOX SEWER ROVER stream as a starting point for a test. If it works, you can then try and use an internal server on the THETA using plug-in technology.
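To make the idea of a relay concrete, here is a minimal sketch of one in Python: it re-serves JPEG frames as a multipart/x-mixed-replace stream over plain HTTP GET, which both a browser and cv2.VideoCapture can consume. `frame_source()` is a placeholder; you would wire it to camera.getLivePreview (or any other JPEG source) yourself.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

def multipart_chunk(jpg):
    """Wrap one JPEG blob as a multipart/x-mixed-replace part."""
    return (b"--frame\r\n"
            b"Content-Type: image/jpeg\r\n"
            b"Content-Length: " + str(len(jpg)).encode() + b"\r\n\r\n"
            + jpg + b"\r\n")

def frame_source():
    # Placeholder generator: replace with real JPEG frames from the
    # camera (e.g. parsed out of the camera.getLivePreview response).
    while True:
        yield b"\xff\xd8...fake...\xff\xd9"

class MJPEGRelay(BaseHTTPRequestHandler):
    def do_GET(self):
        # Any GET gets an endless MotionJPEG stream.
        self.send_response(200)
        self.send_header("Content-Type",
                         "multipart/x-mixed-replace; boundary=frame")
        self.end_headers()
        try:
            for jpg in frame_source():
                self.wfile.write(multipart_chunk(jpg))
        except BrokenPipeError:
            pass  # client disconnected

if __name__ == "__main__":
    HTTPServer(("", 8080), MJPEGRelay).serve_forever()
```

With this running, cv2.VideoCapture("http://localhost:8080/") should open the stream the same way it would any MJPEG URL.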
THETA UVC Blender
This is a driver for Windows. You only need it if the THETA V is connected to Windows with a USB cable. If you’re trying to eliminate a Windows machine, you don’t need this.
Can you use the THETA with a USB cable connected to the machine that your OpenCV application is running on?
My goal is to stream video wirelessly from the THETA V for an art installation. Using the tips above and from elsewhere, I was able to write a Python web server that streams video from the THETA to the HTML5 <video> tag. The code is here: https://github.com/rwoodley/OSCVideoStreamExample
I do not have the bandwidth on my Internet feed to test it. It may be easier with 2K.
This is using RTMP.
There is latency if you stream it from YouTube.
I’ve seen the plug-in below demoed, and the resolution and uptime were good. I don’t actually know what service from HoloBuilder is needed to use it. You should try contacting them. Maybe they’ll just let you use it for the art exhibit.
Thank you. Re WebRTC, I presume you’re referring to your article on video conferencing where you have a THETA connected via USB to one laptop, then broadcast to another laptop over WebRTC. Is that what you had in mind? I guess that is feasible, but it requires an extra computer, and since I want to use OpenCV, I’d have to parse the WebRTC stream in Python or C++.
I did install the plug-in and was able to connect to my Theta V over my router, which is nice. But the streaming seems hard-wired to use YouTube/Facebook etc. I want to capture the 30FPS stream myself, wirelessly. Is there any alternative to writing my own plug-in at this point? Has anyone cracked that nut already?
Is there a way to do this with ffmpeg? Can it capture the THETA plug-in’s wireless stream?
At the current time, no one has published a project on this site that provides the functionality you want. The current plug-in is hardwired to use a server such as YouTube/Facebook. The video will need to go out on the Internet.
I don’t think there’s an easy fix right now.
It might be possible to find an existing Android app that can stream the phone camera as MotionJPEG. The camera on the THETA appears like a phone camera to the internal Android OS.
Hi @jcasman -
The art exhibit went well. I had to use a USB connection from the camera to a laptop driving a projector to get the interactivity I wanted, which meant I couldn’t just leave it running when I wasn’t there.
So I’m still looking for 2 things:

1. a wireless camera stream from the Ricoh with no need for the Internet.
2. a Raspberry Pi substitute that can display the feed (which means support for WebGL).
I think I have found 2) with the new NVIDIA Jetson Nano.
So 1) is the hard one. I have downloaded the plug-in code and signed up for a developer account, but need to find time to work on this.
I meant to respond more quickly. This is really great looking! I see pretty clearly what I think you’re doing in the first two pictures (live streaming from the middle of the room and projecting it on the wall in equirectangular format) but what is the third picture showing? Are you taking a live stream and filtering it so it outputs as almost a line drawing?
How was the reception to it? Did it do what you wanted? Are you planning more?
Is OpenCV applied to RicMoviePreview? Does it then take a picture, or does it extract each frame from the movie preview? I don’t understand how OpenCV works with a video feed: whether it processes each frame as an image, and whether it can do that processing at 30 fps.