How I achieved live-feed visualization from the Theta Z1 in Unity using two different methods:
First Method - USB:
- Installed the UVC Driver.
- Connected the camera to the PC via USB.
- Set up the camera source in OBS as Video Capture Device.
- To pass the live feed to Unity, I utilized the virtual camera feature in OBS, as described in my initial post.
- Finally, the image is projected onto a sphere with inverted normals (I used an Unlit texture shader to get the appropriate coloration and brightness). A rough sketch of the Unity side is below.
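In case it helps, this is a minimal sketch of the Unity side: a script that grabs the OBS virtual camera as a WebCamTexture and assigns it to the sphere's material. The device name "OBS Virtual Camera" and the 1920x960 size are assumptions from my setup and may need adjusting.

```csharp
using UnityEngine;

// Attach to the inverted-normals sphere. Assumes the sphere's material
// uses an Unlit/Texture shader; the device name is what OBS registers
// on my machine and may differ on yours.
public class ObsVirtualCamFeed : MonoBehaviour
{
    [SerializeField] private string deviceName = "OBS Virtual Camera"; // assumed name
    [SerializeField] private int width = 1920;
    [SerializeField] private int height = 960;

    private WebCamTexture webCamTexture;

    void Start()
    {
        // Fall back to the first available device if the named one is not found.
        string selected = deviceName;
        bool found = false;
        foreach (WebCamDevice device in WebCamTexture.devices)
        {
            if (device.name == deviceName) { found = true; break; }
        }
        if (!found && WebCamTexture.devices.Length > 0)
        {
            selected = WebCamTexture.devices[0].name;
        }

        webCamTexture = new WebCamTexture(selected, width, height);
        GetComponent<Renderer>().material.mainTexture = webCamTexture;
        webCamTexture.Play();
    }

    void OnDisable()
    {
        if (webCamTexture != null) webCamTexture.Stop();
    }
}
```

This assumes the sphere mesh already has its normals inverted (done in a modelling tool or with a small mesh-flipping script) so the feed is visible from inside.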
Second Method - RTSP Connection (implemented this afternoon):
- Installed the Theta RTSP Streaming plug-in on the Z1.
- Connected the Z1 in CL (client) mode (currently tested on 2.4 GHz).
- Identified the camera’s IP.
- In OBS, selected Media Source as the source and entered the following URL: rtsp://user:pass@camIP:8554/live?resolution=1920x960 (a quick reachability check for this address and port is sketched after this list).
(Different resolutions can be used, as stated in the THETA RTSP Streaming documentation.)
- To stream to Unity, I'm still using the OBS virtual camera.
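Before pointing OBS at the camera, it can save time to confirm that the RTSP port is actually reachable from the PC. Below is a minimal sketch of such a check; the IP address is a placeholder for the Z1's IP found on the local network, and 8554 matches the port in the URL above.

```csharp
using System;
using System.Net.Sockets;

// Quick sanity check that the camera's RTSP port answers before
// configuring the OBS Media Source.
class RtspPortCheck
{
    static void Main()
    {
        const string camIp = "192.168.1.10"; // placeholder: replace with the Z1's IP
        const int rtspPort = 8554;           // port used in the RTSP URL

        using (var client = new TcpClient())
        {
            try
            {
                // Connect with a short timeout so a wrong IP fails fast.
                var connectTask = client.ConnectAsync(camIp, rtspPort);
                bool done = connectTask.Wait(TimeSpan.FromSeconds(3));
                Console.WriteLine(done && client.Connected
                    ? "RTSP port is reachable."
                    : "Could not reach the RTSP port (check IP, Wi-Fi band, plug-in).");
            }
            catch (Exception e)
            {
                Console.WriteLine("Connection failed: " + e.Message);
            }
        }
    }
}
```

If this check fails while the camera shows as connected, the camera and PC may simply be on different networks or bands.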
In terms of virtual reality, I'm using the HTC Vive Pro 2 kit in conjunction with SteamVR.
Currently, because of a considerable delay (I haven't measured the exact value yet), I'm attempting to switch to the 5 GHz connection, but I'm not succeeding because the camera is not able to connect to the router. Does anyone have any ideas regarding this aspect?