Viewing a Theta livestream in a VR headset locally

I want to be able to put on my Quest 2, move my head, and look around to view, in 360 degrees, the feed from a 360-degree camera in another room.

my current setup is:

Ricoh Theta V camera

usb cable to windows 10 PC

I can add the camera feed as a video capture device in OBS. I can then set the output to a URL, using UDP to stream to a local address, and view the feed in VLC. From there I can use Virtual Desktop on the Quest 2 to view a flat, rectangular version of the camera feed.

this is where I am stuck: I cannot turn the flat image into a sphere to be viewed properly in the headset, and I'm not sure what set of programs would be best for this. I also need to avoid the internet entirely; only local hardware, programs, and networks. No beaming to YouTube or Facebook first.
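For what it's worth, the local leg of that pipeline doesn't strictly need VLC. Here is a minimal Python sketch that builds an ffplay command (ffplay ships with FFmpeg) to watch a local UDP stream; the address and port are my assumptions, so substitute whatever URL you gave OBS:

```python
import shlex

def build_ffplay_cmd(port: int = 1234) -> list[str]:
    """Build an ffplay command for a local UDP MPEG-TS stream.

    The address and port are assumptions: use whatever output URL
    you configured in OBS (e.g. udp://127.0.0.1:1234).
    """
    return [
        "ffplay",
        "-fflags", "nobuffer",  # don't buffer input; lower latency
        "-flags", "low_delay",
        f"udp://127.0.0.1:{port}",
    ]

print(shlex.join(build_ffplay_cmd()))
# -> ffplay -fflags nobuffer -flags low_delay udp://127.0.0.1:1234
```

The `nobuffer`/`low_delay` flags trade playback smoothness for lower latency, which starts to matter once the feed is in a headset rather than a desktop window.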

will my next step be learning Unity? that seems to be the consensus of what I'm finding: use Unity to project the camera feed, as a webcam, onto the inside of a hollow sphere, and place the viewing camera inside that sphere. how do I view the stream in Unity once I get it going?
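For intuition on what that hollow sphere is doing: whether it's a skybox or an inverted sphere, the renderer is ultimately mapping each view direction to a pixel of the equirectangular frame the Theta produces. A stand-alone sketch of that mapping (an illustration of the math, not Unity code):

```python
import math

def equirect_uv(x: float, y: float, z: float) -> tuple[float, float]:
    """Map a unit view direction to (u, v) in an equirectangular frame.

    Yaw (atan2) picks the horizontal position, pitch (asin) the
    vertical one; this is what a skybox or textured sphere does
    implicitly for every pixel you look at.
    """
    u = 0.5 + math.atan2(x, z) / (2.0 * math.pi)
    v = 0.5 + math.asin(max(-1.0, min(1.0, y))) / math.pi
    return u, v

# Looking straight ahead lands in the centre of the frame:
# equirect_uv(0.0, 0.0, 1.0) -> (0.5, 0.5)
```

Turning your head just slides (u, v) across the already-received frame, which is why no extra video work is needed when you look around.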

thanks for any help you can give me, I’ve had a silly idea for weeks that I’m stuck on.

Try the free community version of Unity.

The current technique is to use a Skybox.

My older tutorial used an inverted sphere.

This is 4 years old, so it may no longer work. The flip-normals sphere is also no longer best practice, since the recommended approach is now to use a skybox, as mentioned above. However, if you just want to get things going, the inverted sphere may still work.

The camera should appear as a webcam inside of Unity.

This likely no longer works, but it may help as a reference.

Post again if you get stuck or have more questions.

thank you very much. I am going to mess with it for a few hours; I'll hopefully be back tonight or tomorrow. I am very impressed by the atmosphere here; I feel like big things are happening around me on this forum.


The activity around live streaming is still going strong. Latency may be an issue, depending on your application. You're likely to see around 300ms from the camera to the computer over the USB cable. There is going to be more latency than with a normal webcam, like a Logitech webcam. You also have to account for the latency to your viewer if you want to transmit the stream over the Internet.
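As a rough way to think about it, the end-to-end delay is just the sum of the stages. A toy budget in Python, where only the ~300ms USB figure comes from above and the other numbers are placeholders I made up to show the idea:

```python
def total_latency_ms(stages: dict[str, float]) -> float:
    """Sum per-stage latencies to estimate end-to-end (glass-to-glass) delay."""
    return sum(stages.values())

# Illustrative numbers: only the USB figure is from measurement;
# the rest are placeholder assumptions.
budget = {
    "camera_to_pc_usb": 300.0,
    "encode_and_restream": 100.0,
    "decode_and_render": 50.0,
}
print(total_latency_ms(budget))  # 450.0
```

Any stage you add (a network hop to a viewer, extra buffering in the player) goes straight onto that total, which is why local-only setups like yours have an inherent advantage.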

People are implementing solutions in places where it is dangerous or expensive for humans to go. Training and entertainment applications are also building momentum.

good luck.

This is an example with a plant from 5 years ago.

I am back quicker than expected! I achieved some degree of success already.

this guide/program of yours worked with the Oculus. I followed the guide exactly and had to update this:

but now I do have a live feed of the camera, directly in the headset! Eureka!

however, the feed is incredibly choppy; I estimate it's probably 15 fps. The quality/resolution is perfectly adequate for my use case, though. When I turn my head, I see black for a moment until the feed catches up. It's as if, rather than the entire sphere being rendered, only the part I'm looking at is rendered, so when I turn my head it has to catch up before the feed appears. Is this a limitation I should expect, in your experience? Obviously this is just a small test program you made as an experiment, but do you have any idea how I can improve it?

again, thanks an absolute million. I've spent a decent bit of money already trying to get this to work; I should have come here first!

The Meta Quest 2 was not out when we built the test demo. If you update Unity and SteamVR and rebuild the application, it may just work as expected.

When we tested it with HTC Vive long ago, the feed was smooth at 30fps in the headset. Back then, the headset was connected to the computer with a physical cable.

If updating SteamVR and Unity does not work, you can try replacing the inverted sphere with a skybox.

Unfortunately, I don’t have a Meta Quest 2. However, other people on this forum may have one.

hello again friend. Once again I have found success! My problem was too much CPU usage elsewhere: I had a video game running in the background, eating up my CPU, that I was waiting to get back to, lol. After closing that (plus a few more anti-stuttering fixes unique to the Quest 2), your program gives me a pretty stable feed of around 20 fps, I'd guess. Much better than last night. I have an overclocked i5 processor from maybe 4 years ago, and your program is eating 75% of it! I have a feeling I'd get the full 30 fps with a better PC. This at least lets me prototype the rest of the hardware for my specific use case. I also got the system working with a 100ft USB cable extender, which is very interesting :wink:


Wow. Amazing that it still works. Thanks for reporting back that it works with the Meta Quest 2. Congratulations on getting it working with a 100ft USB cable extender. I wouldn't have thought that was possible.

This post is so great! Congratulations on sticking with it and getting it going! A 100ft USB cable is wild! I actually used it with about a 20' cable, and I thought that was crazy: DeveloperWeek NYC (June 19-20) - #10 by jcasman

Great work!