Detecting the Theta S as a webcam texture on Ubuntu 16.04

Hi all,

My friends and I have been working on a project that involves sending a Ricoh Theta S out into the field physically attached to a Mars rover analogue. The Theta S is connected to a Raspberry Pi on the rover, and the USB serial stream is then relayed over a Wi-Fi connection. Our problem is occurring at the base station. The base station is a Linux laptop running all of the control code in various terminals. We want to attach the Theta S live stream to a WebCamTexture mapped to a sphere in Unity, so the base station has a live stream from the rover that can be viewed in a Unity application. We have demonstrated that the Unity application works as intended on a 64-bit Windows PC, but despite all of our efforts and debugging we simply cannot get Unity to attach the Theta S to a WebCamTexture on Linux. We have verified that both the laptop's built-in webcam and another USB webcam work with our Unity application, so the problem is isolated to the Theta S.

When the Unity application views the stream we don't even get a black screen; we get a pure green screen with some static interference around the edges. I am very confused as to where to go from here.
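
To illustrate, the script we're using is essentially the standard WebCamTexture approach; a minimal sketch of it looks roughly like this (matching the device name on the substring "THETA" is just how we pick the camera, and is an assumption about the name the OS reports):

```csharp
using UnityEngine;

// Minimal sketch: attach the first camera whose name contains "THETA"
// to the renderer of the sphere this script is placed on.
public class ThetaWebcamSphere : MonoBehaviour
{
    WebCamTexture webcamTexture;

    void Start()
    {
        foreach (WebCamDevice device in WebCamTexture.devices)
        {
            Debug.Log("Found camera: " + device.name);
            if (device.name.Contains("THETA"))  // assumed substring; may differ on Linux
            {
                webcamTexture = new WebCamTexture(device.name);
                break;
            }
        }

        if (webcamTexture == null)
        {
            Debug.LogWarning("THETA not found, falling back to the default device");
            webcamTexture = new WebCamTexture();
        }

        GetComponent<Renderer>().material.mainTexture = webcamTexture;
        webcamTexture.Play();
    }
}
```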

Can anybody help?

Hi, I’m trying to understand the application. Do you have it working on Windows 10 64-bit, but not on Linux?

If so, be aware that the Linux kernel does not fully support UVC 1.5. This means that on Linux the stream falls back to UVC 1.1 MotionJPEG at 1280x720 @ 15 fps.
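
If Unity is negotiating the wrong format on Linux, one thing you could try (untested on my side) is requesting that UVC 1.1 mode explicitly when you create the WebCamTexture. The device name below is an assumption, so check what WebCamTexture.devices reports first:

```csharp
using UnityEngine;

// Untested suggestion: explicitly request the UVC 1.1 MotionJPEG mode
// (1280x720 @ 15 fps) instead of letting Unity negotiate a resolution.
public class ThetaUvc11Texture : MonoBehaviour
{
    public string deviceName = "RICOH THETA S";  // assumed name; verify via WebCamTexture.devices

    void Start()
    {
        var texture = new WebCamTexture(deviceName, 1280, 720, 15);
        GetComponent<Renderer>().material.mainTexture = texture;
        texture.Play();
    }
}
```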

With UVC 1.5 support on Windows and firmware 01.82 or higher, you’ll get H.264 at 1920x1080 @ 30 fps.

Can you display the dual-fisheye live stream on Linux?

I did use the stream on Linux a while ago, but not using Unity.

There’s some additional information here.

This is primarily to show you that live streaming does work with Linux. I haven’t tried it with Unity on Linux, which is what you need.

I know that people are using live streaming with Unity on Android, so it’s likely the kernel does support it.