Here’s a bit more information from interviewing the developer, Megan Zimmerman.
Q: Is it possible to get the SteamVR assets in the Unity app to work with another headset?
A: For this solution, since I was working with the Vive, I included the prefab that works with the Vive. SteamVR works with Oculus as well; there is a little bit of external setup to enable unknown sources for the Oculus. Additionally, you could use the Oculus Utilities for Unity (at this link: https://developer3.oculus.com/downloads/) instead of SteamVR.
Q: Do you have advice for people streaming from HDMI? What’s your setup?
A: I have used both. Currently I am using the HDMI output with a Feelworld HDMI-to-USB 3.0 capture card, which works great.
Q: Do you intend to put your project up on GitHub or other public repo?
A: I plan to publish it through GitHub. I’m in the process of filling out some paperwork that the government agency I work for requires. I’ll let the community know when it goes up.
Provides a path forward for developers who want to get the THETA live stream inside of Unity, since UVC Blender and UVC FullHD Blender do not show up in Unity. (If someone gets it to show up, tell us how.)
Provides a possible solution for developers who want to use the HDMI output (which is in dual-fisheye format)
First open source example of using SteamVR components to stream to headsets, with an HTC Vive example and a path for Oculus
Photosphere was the first solid real-world use of the THETA USB API. Great source code, documentation, and parts list.
tlapser360 took the WiFi and USB API and added external sensors like GPS and light to adjust the camera in addition to taking pictures. Full source code and documentation available.
This is my first time using SteamVR. I’m on Windows 10, Unity 5.5.0f3, 64bit, personal. I do not have a headset. I want to test the tutorial on my desktop.
I get the error below:
VR: OpenVR Error! OpenVR failed initialization with error code VRInitError_Init_PathRegistryNotFound: "Installation path could not be located (110)"!
Blank project looks more promising than my previous attempt this morning. Will leave my previous post up as it may help other people. Going through rest of tutorial now.
The section below was the critical insight for me. Perhaps other people were struggling with this in the previous blog post as well?
You’ll now need to make a script for projecting the camera feed onto the two spheres. This script is based on the script from the second post I mentioned, though rather than projecting an equirectangular image onto one sphere, you need to project each lens’s feed onto its respective sphere (because, as of this post, Unity does not recognize the UVC Blender as a webcam).
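As a rough sketch of that idea (this is an assumption based on the description above, not the author's exact script — the field names like `frontSphere` are placeholders), you can grab the feed with Unity's `WebCamTexture`, assign the same texture to both sphere materials, and use each material's texture scale/offset to crop out the left and right fisheye halves:

```csharp
using UnityEngine;

// Hypothetical sketch: splits the THETA's side-by-side dual-fisheye feed
// across two spheres by cropping each material to one half of the texture.
public class DualFisheyeFeed : MonoBehaviour
{
    public Renderer frontSphere;  // sphere for lens 1 (assign in Inspector)
    public Renderer backSphere;   // sphere for lens 2 (assign in Inspector)

    WebCamTexture camTexture;

    void Start()
    {
        // Pick the first reported webcam; in practice you would match
        // the THETA by name from WebCamTexture.devices.
        if (WebCamTexture.devices.Length == 0) return;
        camTexture = new WebCamTexture(WebCamTexture.devices[0].name);

        // Both spheres share the same live texture...
        frontSphere.material.mainTexture = camTexture;
        backSphere.material.mainTexture = camTexture;

        // ...but each samples only half of it (one fisheye circle each).
        frontSphere.material.mainTextureScale = new Vector2(0.5f, 1f);
        frontSphere.material.mainTextureOffset = Vector2.zero;
        backSphere.material.mainTextureScale = new Vector2(0.5f, 1f);
        backSphere.material.mainTextureOffset = new Vector2(0.5f, 0f);

        camTexture.Play();
    }
}
```

This only approximates the projection with a UV crop; the actual posts use a shader to map each fisheye circle onto its sphere correctly.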
I’m kind of wondering how I missed that from the older post by Hecomi.
Now here’s the question. Do I buy a Vive for $800?
If anyone bites the bullet and buys the Vive, it looks pretty easy to develop basic applications for it. See this tutorial below that covers the controllers that were disabled in the tutorial above.
That guy has a bunch of tutorials, including this one on interacting with objects.
Although it looks cool and fun, I'm not sure I have time to get $800 worth of fun out of it to justify the Vive, especially since I don't play PC games.
The dual projection is implied in Hecomi’s post, since he adds the camera feed script to both spheres for his equirectangular solution. In the Tanyuan solution, you already have an equirectangular image and are projecting one feed onto one sphere; because we are using two spheres, we project onto both.
Personally, when I started working on this solution I got caught up in Hecomi’s solution, because it’s harder to understand what is going on if you don’t work through it yourself step by step. This has me thinking that I should probably make my own shader from scratch: while I get the idea of what the shader is doing, I want to understand how.
Thanks for adding the fix about SteamVR. I forgot to mention exactly how the setup works because my Unity sets itself up for VR automatically. Whoops!
Great news! Hey, I’d be interested in learning what you’re doing with live streaming.
Also, did you know that you can get UVC FullHD Blender working in Unity as well with the hack below? People are doing some cool things with Unity and the THETA. Always exciting to learn about new projects.
However, now I have a new problem: the THETA V does not display anything. I know the code works because I can use the computer’s webcam; however, whenever I use the THETA V, I just see the default scene.
On another note, I made a script that prints the device names in the console, and the THETA V shows up three times as different webcams: (1) Theta V, (2) Theta V FullHd, and (3) Theta V 4K. Is this normal?
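For anyone who wants to reproduce that check, a minimal version of such a device-listing script (a sketch of the idea, not the poster's exact code) looks like this:

```csharp
using UnityEngine;

// Prints every webcam device Unity can see to the console,
// so you can find out what name the THETA is reported under.
public class ListWebcams : MonoBehaviour
{
    void Start()
    {
        foreach (WebCamDevice device in WebCamTexture.devices)
        {
            Debug.Log("Webcam found: " + device.name);
        }
    }
}
```

Attach it to any GameObject in the scene and check the Console window after pressing Play.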
Hello everyone! I have recently switched from a THETA V to a THETA S, and now it works! I was wondering, does the WiFi feature on the THETA S only work with the app? Or can I use the WiFi camera preview with the Unity app that we created above?