OpenCV VideoCapture

I’ve been using the OSC API 2.1 to get a live-preview stream using Python on macOS. That works. Now I want to display the stream with OpenCV, also using Python on macOS. In particular, I’d like to use the cv2.VideoCapture(url) call. However, that call seems to require a URL that supports HTTP GET, whereas the OSC live preview requires HTTP POST.

Has anyone gotten the cv2.VideoCapture API to work with OSC on the THETA V?

Another article on this forum indicates that “THETA UVC Blender” is required. Can someone explain (a) whether this is relevant to using VideoCapture, (b) exactly what “THETA UVC Blender” is and why it is required, and (c) what is used on non-macOS platforms?

Thank you!


Thank you for reporting on this interesting project.

The livePreview stream is Motion JPEG (MJPEG).
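A rough sketch of consuming that stream directly (untested against a real camera): issue the camera.getLivePreview command as an OSC HTTP POST and split the returned Motion JPEG bytes into individual frames. The 192.168.1.1 address is the THETA’s usual access-point-mode default; verify it for your setup, and note the marker scan below is naive (production code should use the multipart boundary headers instead).

```python
import json
import urllib.request

# Default THETA address in access-point mode; an assumption, check your setup.
OSC_EXECUTE = "http://192.168.1.1/osc/commands/execute"

def find_frame(buf):
    """Return (jpeg_bytes, remaining_buf); jpeg_bytes is None if no full frame yet.

    A JPEG frame starts with the SOI marker FF D8 and ends with the EOI
    marker FF D9. This is a naive scan, good enough for a first test.
    """
    start = buf.find(b"\xff\xd8")
    if start == -1:
        return None, buf
    end = buf.find(b"\xff\xd9", start + 2)
    if end == -1:
        return None, buf[start:]
    return buf[start:end + 2], buf[end + 2:]

def live_preview():
    """Yield JPEG frames from the camera's livePreview stream (POST, not GET)."""
    req = urllib.request.Request(
        OSC_EXECUTE,
        data=json.dumps({"name": "camera.getLivePreview"}).encode(),
        headers={"Content-Type": "application/json"},
    )
    resp = urllib.request.urlopen(req)
    buf = b""
    while True:
        chunk = resp.read(4096)
        if not chunk:
            break
        buf += chunk
        frame, buf = find_frame(buf)
        if frame is not None:
            yield frame
```

Each yielded frame is a complete JPEG that can be decoded or re-served however you like.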

This is a working example of a plug-in to do the livePreview from Line 534.

As the web server in the plug-in provides the livePreview, you might be able to use the plug-in (it’s in the store), then inspect the HTML to see how to embed the stream in a web page.

The FOX SEWER rover code might also be a good starting point.

Note that in both examples, there is some type of relay to make the MotionJPEG stream from camera.getLivePreview usable by a web browser.
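The relay idea can be sketched with the stdlib http.server: re-serve JPEG frames over plain HTTP GET using the multipart/x-mixed-replace content type that browsers (and MJPEG clients) expect. This is a minimal illustration, not either project’s actual code; frame_source is a placeholder for whatever yields JPEG bytes.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

BOUNDARY = b"frame"

def mjpeg_part(jpeg, boundary=BOUNDARY):
    """Wrap one JPEG frame in a multipart/x-mixed-replace part."""
    return (b"--" + boundary + b"\r\n"
            b"Content-Type: image/jpeg\r\n"
            b"Content-Length: " + str(len(jpeg)).encode() + b"\r\n\r\n"
            + jpeg + b"\r\n")

class StreamHandler(BaseHTTPRequestHandler):
    # Set this to a callable returning an iterable of JPEG byte strings,
    # e.g. a generator that reads camera.getLivePreview.
    frame_source = None

    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type",
                         "multipart/x-mixed-replace; boundary=frame")
        self.end_headers()
        for jpeg in self.frame_source():
            self.wfile.write(mjpeg_part(jpeg))

# Usage (hypothetical):
#   StreamHandler.frame_source = staticmethod(my_jpeg_generator)
#   HTTPServer(("", 8080), StreamHandler).serve_forever()
```

Once a relay like this is running, any GET-based client can consume the stream.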

You might be able to point cv2.VideoCapture(url) at the FOX SEWER rover stream as a starting point for a test. If that works, you can then try running an internal server on the THETA using the plug-in technology.
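That test might look something like the sketch below. The host, port, and path are placeholders for whatever relay you end up pointing at; it assumes opencv-python is installed.

```python
def mjpeg_url(host, port, path="/stream.mjpg"):
    """Build an MJPEG-over-HTTP URL (placeholder host/port/path)."""
    return "http://{}:{}{}".format(host, port, path)

def show_stream(url):
    import cv2  # assumption: opencv-python is installed

    cap = cv2.VideoCapture(url)  # works with MJPEG-over-HTTP GET URLs
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        cv2.imshow("theta", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
    cv2.destroyAllWindows()

if __name__ == "__main__":
    show_stream(mjpeg_url("192.168.1.5", 8080))  # hypothetical relay address
```

If VideoCapture can open the URL, each read() returns one decoded frame.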


This is a driver for Windows. You only need it if the THETA V is connected to a Windows machine with a USB cable. If you’re trying to eliminate the Windows machine, you don’t need it.

Can you use the THETA with a USB cable connected to the machine that your OpenCV application is running on?

Or, do you need the connection to be WiFi?

Hi again. Thanks for those links. Very helpful.

My goal is to stream video wirelessly from the THETA V for an art installation. Using the tips above and from elsewhere, I was able to write a Python web server that streams video from the THETA to the HTML5 <video> tag. The code is here:

However using livePreview limits you to 8 fps, at least if you want decent resolution:

I guess the next step is to use the wireless live streaming plug-in? Can that stream at a higher rate than livePreview? If so, I wonder why livePreview doesn’t support that out of the box.

Again, my goal is to wirelessly stream HD at 30 FPS or better. Has anyone gotten that to work?

So just wanted to check that before moving forward. Thanks!


I think that the plug-in in the store does wireless live streaming at 4K 30FPS (theoretical maximum).

I do not have the bandwidth on my Internet feed to test it. It may be easier with 2K.


This is using RTMP.

There is latency if you stream it from YouTube.

I’ve seen the plug-in below demoed, and the resolution and uptime were good. I don’t actually know what service is needed from HoloBuilder to use it. You should try contacting them. Maybe they’ll just let you use it for the art exhibit.

There is no need to go over the internet I don’t think. I plan to do this all on my LAN.

I believe WebRTC traffic goes peer-to-peer. The solution below requires an external signalling server, but I don’t think the actual video traffic goes over the Internet.

Note: the info below is a completely different strategy and probably not relevant to Bob’s project

About SiteStream

Just in case anyone wants to put it on the Internet.

Thank you. Re WebRTC, I presume you’re referring to your article on video conferencing, where you have a THETA connected via USB to one laptop, broadcasting to another laptop over WebRTC. Is that what you had in mind? I guess that’s feasible, but it requires an extra computer, and since I want to use OpenCV, I’d have to parse the WebRTC stream in Python or C++.

I did install the plug-in and was able to connect to my Theta V over my router, which is nice. But the streaming seems hard-wired to use YouTube/Facebook etc. I want to capture the 30FPS stream myself, wirelessly. Is there any alternative to writing my own plug-in at this point? Has anyone cracked that nut already?

Is there a way to do this with ffmpeg? Can it capture the theta plugin wireless stream?

Thanks again.

At the current time, no one has published a project on this site that provides the functionality you want. The current plug-in is hardwired to use a server such as YouTube/Facebook. The video will need to go out on the Internet.

I don’t think there’s an easy fix right now.

It might be possible to find an existing Android app that can stream the phone camera as Motion JPEG. The camera on the THETA appears like a phone camera to the internal Android OS.

I have not tested this code below.

This one also looks interesting.


How did the art exhibit go? Were you able to get the local live streaming set up?

Hi @jcasman -
The Art Exhibit went well. I had to connect the camera via USB to a laptop feeding a projector to get the interactivity I wanted, which meant I couldn’t just leave it running when I wasn’t there.

So I’m still looking for 2 things:

  1. a wireless camera stream from the Ricoh with no need for the internet.
  2. a Raspberry Pi substitute that can display the feed (which means support for WebGL).

I think I have found 2) with the new NVIDIA Jetson Nano, so 1) is the hard one. I have downloaded the plug-in code and signed up for a developer account, but need to find time to work on this.

Here are a couple of pics of the gallery:




I meant to respond more quickly. This is really great looking! I see pretty clearly what I think you’re doing in the first two pictures (live streaming from the middle of the room and projecting it on the wall in equirectangular format) but what is the third picture showing? Are you taking a live stream and filtering it so it outputs as almost a line drawing?

How was the reception to it? Did it do what you wanted? Are you planning more?


This article in Japanese and the GitHub repo might be useful to you. @jcasman might also find it useful for his discussions at CVPR


@codetricity Thanks for this suggestion. I’ve actually been looking at this article recently!

Is anyone headed to CVPR (June 18-20) in Long Beach, CA? I’ll be helping at the RICOH booth (#360) doing demos running some TensorFlow functionality inside a THETA V (4 different demos, Detect, Stylize, Speech, Classify) and the new HDR2EXR plug-in created by community member @Kasper.


Could you briefly explain what the plug-in does?

Is OpenCV applied to RicMoviePreview? Does it then take a picture, or extract frames from the movie preview? I don’t understand how OpenCV works with a video feed, and whether it processes each frame as an image and can do that processing at 30 fps.
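For what it’s worth, the usual OpenCV pattern is exactly that: a capture object hands you one decoded frame (a NumPy image array) per read(), and you process each frame individually; whether you sustain 30 fps depends on how heavy the per-frame work is. A hedged sketch, with the cv2 parts and the stream URL as placeholders:

```python
def process_stream(cap, handler, max_frames=None):
    """Read frames from cap (anything with read() -> (ok, frame), such as
    cv2.VideoCapture) and pass each one to handler. Returns the frame count."""
    count = 0
    while max_frames is None or count < max_frames:
        ok, frame = cap.read()
        if not ok:
            break
        handler(frame)
        count += 1
    return count

def demo():
    import cv2  # assumption: opencv-python is installed

    cap = cv2.VideoCapture("http://192.168.1.5:8080/stream.mjpg")  # placeholder URL

    def to_edges(frame):
        # Each frame is just an image array, so any OpenCV call applies to it.
        edges = cv2.Canny(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY), 100, 200)
        cv2.imshow("edges", edges)
        cv2.waitKey(1)

    process_stream(cap, to_edges)
    cap.release()
```

Keeping the loop separate from the cv2 calls, as above, also makes the frame-handling logic easy to test without a camera.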

A compiled APK is available here. Tested on the THETA V with Vysor.

