THETA V "camera.getLivePreview" request times out

Hi guys,

I am trying to access live stream data via the THETA V web API. The camera is on API v2.1 with "firmwareVersion": "3.50.1", in AP mode. Every function I try works fine, both through code and through Talend API Tester. However, when I run the "camera.getLivePreview" command, the request hangs and times out. Am I misunderstanding how to use this function? I scoured the forums and Google, found a GitHub repo of a Unity project and even some forks that updated the code to work on the THETA V, but when I run the project my Unity Editor just hangs until the web request returns the timeout, and I get no preview. Any ideas?
Thanks

Pass it the header 'Content-Type: application/json; charset=utf-8'.

You will get a stream as the response. You must listen to the stream; it is not a single JSON response. Parse the stream to find the beginning and end of each frame (JPEG frames start with the bytes 0xFF 0xD8 and end with 0xFF 0xD9), then display each frame as a JPEG, or use a library that automatically displays the JPEG frames as video.

import 'dart:convert';
import 'dart:io';

void main() async {
  // 192.168.1.1 is the camera's default address in AP mode.
  Uri apiUrl = Uri.parse('http://192.168.1.1/osc/commands/execute');
  var client = HttpClient();
  Map<String, String> body = {'name': 'camera.getLivePreview'};
  var request = await client.postUrl(apiUrl)
    ..headers.contentType = ContentType("application", "json", charset: "utf-8")
    ..write(jsonEncode(body));
  var response = await request.close();
  // The response body never "completes": it is a Stream<List<int>> of MJPEG bytes,
  // so listen for chunks instead of awaiting a single JSON payload.
  response.listen((List<int> data) {
    print(data);
  });
}

https://community.theta360.guide/tag/livepreview

I just tested this with the Z1 and it works great. I’ll test again with the V.

2 Likes

First of all, thank you for your quick response.

It probably will work. I have tested with the THETA app and I can get the live preview, but when I try to access it through API v2.1 using either Unity3D or Talend, the web request times out. I don't know how the THETA app implements the live preview. I tried the solution posted here https://community.theta360.guide/t/theta-s-wifi-streaming-with-unity/262/25 but the behavior is exactly the same… the Unity Editor just hangs.

Regarding keigom's code, my program hangs on the line Stream stream = request.GetResponse().GetResponseStream(); (Line 38 of ThetaVWifiApModeStreaming.cs). Basically, I never receive the response from the web request; it doesn't reach the point where the stream opens. In the code you posted, I believe it would hang on the line var response = await request.close();. I'm not quite sure what language that is; I'll try to figure out how to run the code you posted to test it.

1 Like

It's an HTTP POST request, the same as camera.takePicture. However, unlike camera.takePicture, the response is a stream of data. The JPEG frames are encoded as bytes, so they arrive as an array of integers.

This is documented in detail in these articles:

https://community.theta360.guide/tag/mjpeg

Depending on the HTTP library used, I have seen the response hang. You may need to specify a response type. The API reference indicates that the content type of the response is 'Content-Type: multipart/x-mixed-replace'. In my test app, I just specified 'stream', since the library accepts that parameter, but I am not sure what the actual HTTP headers are.
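To make that concrete, here is a minimal C# console sketch (my own illustration, not from the official docs) using .NET's HttpClient. The key is HttpCompletionOption.ResponseHeadersRead, which hands you the response as soon as the headers arrive instead of trying to buffer the endless body, which is what makes naive requests appear to hang:

using System;
using System.IO;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

class LivePreviewTest
{
    static async Task Main()
    {
        using var client = new HttpClient();
        var request = new HttpRequestMessage( HttpMethod.Post, "http://192.168.1.1/osc/commands/execute" )
        {
            Content = new StringContent( "{\"name\":\"camera.getLivePreview\"}", Encoding.UTF8, "application/json" )
        };

        // ResponseHeadersRead = "give me the stream"; the default option would wait
        // for the entire body before returning, which never happens here.
        using HttpResponseMessage response =
            await client.SendAsync( request, HttpCompletionOption.ResponseHeadersRead );
        using Stream stream = await response.Content.ReadAsStreamAsync();

        // Read raw bytes of the multipart stream; a real viewer would scan for the JPEG
        // SOI (0xFF 0xD8) and EOI (0xFF 0xD9) markers here to cut out individual frames.
        byte[] buffer = new byte[4096];
        int read;
        while( ( read = await stream.ReadAsync( buffer, 0, buffer.Length ) ) > 0 )
        {
            Console.WriteLine( $"received {read} bytes" );
        }
    }
}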


It may depend on how the HTTP library you are using handles the response it sends back to the client.

Try this one; instructions on use are in the video.

The V is going to handle the stream better than the SC2 used in the video.

1 Like

I see! I installed the Dart SDK and ran your code, and it seems to be working… it just continuously prints out a series of integers, which I'm assuming is the intended result. Unlike the Unity/C# code, it doesn't hang, so I think it might be the response type you mentioned. I'll try to figure out how to handle it properly. Thanks a lot for your help!

2 Likes

Great news. Yes, that is the intended result. I can send you the full application, but I believe the simple POST example that yields the stream is easier to understand.

1 Like

Okay, so I still don't have a fully tested and proven solution, but I'm now able to see the stream. For anyone else wondering how to do this in Unity, here are the basics.

I’m using:
Ricoh Theta V with firmware 3.50.1, working in AP mode (default) on API v2.1.
Unity 2019.4

First you will need to set up a custom DownloadHandler to read the data. I created one based on the answer in this StackOverflow post.

Here’s my implementation (there’s probably a better way to write it, I just hastily implemented this to verify if this is indeed the way to go about it):

ThetaStreamRequestHandler.cs

using UnityEngine;
using UnityEngine.Networking;
using System;

public class ThetaStreamRequestHandler : DownloadHandlerScript
{
    public event Action<Texture2D> Ev_FrameCaptured;
    public Texture2D LastFrame { get; private set; }

    // Standard scripted download handler - will allocate memory on each ReceiveData callback
    public ThetaStreamRequestHandler()
        : base()
    {
    }

    // Pre-allocated scripted download handler
    // Will reuse the supplied byte array to deliver data.
    // Eliminates memory allocation.
    public ThetaStreamRequestHandler( byte[] buffer )
        : base( buffer )
    {
    }

    // Required by DownloadHandler base class. Called when you address the 'bytes' property.
    protected override byte[] GetData() { return null; }

    private int counter = 0;
    // Reassembly buffer for the current JPEG frame; 50 MB is far larger than any preview frame.
    private byte[] image = new byte[50000000];

    // Called once per frame when data has been received from the network.
    protected override bool ReceiveData( byte[] byteFromCamera, int dataLength )
    {
        if( byteFromCamera == null || byteFromCamera.Length < 1 )
        {
            //Debug.Log("CustomWebRequest :: ReceiveData - received a null/empty buffer");
            return false;
        }

        // Scan the chunk for JPEG frame boundaries: SOI = 0xFF 0xD8, EOI = 0xFF 0xD9
        for( int i = 0; i < dataLength; i++ )
        {
            // End of image: previous byte was 0xFF and this byte is 0xD9 (JPEG EOI marker).
            if( counter > 2 && byteFromCamera[i] == 0xD9 && image[counter - 1] == 0xFF )
            {
                image[counter] = byteFromCamera[i];
                counter = 0;

                // LoadImage stops parsing at the EOI marker, so the unused tail of the
                // oversized buffer is ignored. Note this allocates a new Texture2D per
                // frame; a production version should reuse or destroy textures.
                LastFrame = new Texture2D( 2, 2 );
                LastFrame.LoadImage( image );
                Ev_FrameCaptured?.Invoke( LastFrame );
            }

            // Start of image (0xFF then 0xD8) or continuation of a frame already in progress.
            if( ( counter == 0 && byteFromCamera[i] == 0xFF ) || ( counter == 1 && byteFromCamera[i] == 0xD8 ) || counter > 1 )
            {
                image[counter] = byteFromCamera[i];
                counter++;
            }
        }

        return true;
    }
}

Then you will need to set up a UnityWebRequest, which I believe should stay “alive” during the streaming duration. Something like this:

ThetaVZ1Streaming.cs

using UnityEngine;
using UnityEngine.Networking;

public class ThetaVZ1Streaming : MonoBehaviour
{
    [SerializeField] private Renderer _output;
    [SerializeField] private string _cameraIP = "192.168.1.1";

    private UnityWebRequest _request;

    [ContextMenu( "Start Streaming" )]
    public void StartStreaming()
    {
        _request = BuildCommandRequest( "{ \"name\": \"camera.getLivePreview\" }" );
        ( _request.downloadHandler as ThetaStreamRequestHandler ).Ev_FrameCaptured += OnFrameCaptured;
        _request.SendWebRequest();
    }

    [ContextMenu( "Stop Streaming")]
    public void StopStreaming()
    {
        _request.Dispose();
        _request = null;
    }

    private void OnDestroy()
    {
        StopStreaming();
    }

    private void OnFrameCaptured( Texture2D frame )
    {
        _output.material.mainTexture = frame;
    }

    private UnityWebRequest BuildCommandRequest( string jsonCommand )
    {
        UnityWebRequest uwr = new UnityWebRequest( $"http://{_cameraIP}/osc/commands/execute", "POST" );

        byte[] jsonToSend = new System.Text.UTF8Encoding().GetBytes( jsonCommand );

        uwr.uploadHandler = new UploadHandlerRaw( jsonToSend );
        uwr.downloadHandler = new ThetaStreamRequestHandler();

        uwr.SetRequestHeader( "Content-Type", "application/json;charset=utf-8" );

        return uwr;
    }
}

I created a Sphere, added ThetaVZ1Streaming to it, and set the renderer to the sphere itself just to test. I believe this should be enough to get anyone on the right track for streaming from a THETA V/Z1 inside Unity3D using the Web API. Sorry if this answer is already posted somewhere; I googled a lot and didn't find any concrete solution.
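If you prefer wiring this up from code instead of the editor, here is a hypothetical bootstrap sketch. It assumes you expose _output publicly or add a setter; the scripts above keep it private and assign it in the Inspector:

using UnityEngine;

public class ThetaStreamTestSetup : MonoBehaviour
{
    void Start()
    {
        // Build the test sphere at runtime instead of in the editor.
        GameObject sphere = GameObject.CreatePrimitive( PrimitiveType.Sphere );
        ThetaVZ1Streaming streaming = sphere.AddComponent<ThetaVZ1Streaming>();

        // Hypothetical: requires exposing _output on ThetaVZ1Streaming, e.g. making it public.
        // streaming._output = sphere.GetComponent<Renderer>();

        streaming.StartStreaming();
    }
}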

Thanks a lot to @craig for your insight, was really helpful!

2 Likes

Thanks for sharing this. I know it will be helpful to many people.

Did you have to add the Accept request header as {"Accept": "multipart/x-mixed-replace"} to get it to return a stream?

Did you also have to set {"X-XSRF-Protected": 1}?

That could be the solution to some problems I had with some libraries. Thanks.

I have not used the "X-XSRF-Protected: 1" header before. Is that supposed to be part of the REQUEST header?

I can’t find it listed on the generic request header listing.

I’m curious as to what error you had before you implemented each line of the request header.

Thanks.

Oh, I don't remember why I added the "X-XSRF-Protected: 1" header; I had it left over from a previous implementation for some reason. I'll test removing the Accept and X-XSRF-Protected headers and see if it still works.

Edit: Just tested and indeed we don’t need those two headers. I updated the solution to reflect this.

1 Like

Thanks. There’s no rush. I was just curious because I’m trying to build some simple utilities to test the livePreview on different camera models and I’m also trying to find out what the RICOH THETA API requires in the header. The documentation on the official RICOH site is a bit sparse.

BTW, I am curious whether you got the headset to control the orientation of the JPEG frame. That might be built into the Unity SDK automatically, or there may be a better API for orienting the 360 stream inside the headset. I have very limited experience with Unity.

I'm not actually implementing a VR viewer, but the way I set up my system it would work basically "plug & play". If I understand your question correctly, here's how it goes:

  1. I used modeling software to create a sphere with inverted normals (see the first sketch after this list). You could also invert the normals of Unity's default sphere, but I find it doesn't project equirectangular photos well (not enough triangles, and the UVs aren't mapped well either). You could also write a method that generates the sphere mesh yourself. Another possibility is mapping the photo/stream as a Skybox, but I don't like that method because you lose control of zooming/scaling.

  2. Place a camera inside the sphere and adjust scales, distances (and FOV if not using VR). For non-VR, I implement a UnityEngine.EventSystems.IDragHandler that rotates the view (see the second sketch below). Either rotate the sphere or the camera; for interoperability with VR it is better to rotate the camera. If rotating the camera, it is best to keep it beneath two other transforms, one for the X axis and one for the Y axis, otherwise the rotations go kind of crazy.
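Here is the first sketch: a minimal runtime version of the normal flipping from step 1, in case you don't want to prepare the sphere in modeling software (the class name is hypothetical, and Unity's default sphere still has the UV/triangle-density issues mentioned above):

using UnityEngine;

// Flips a sphere inside-out so equirectangular video is visible from within it.
[RequireComponent( typeof( MeshFilter ) )]
public class InsideOutSphere : MonoBehaviour
{
    void Awake()
    {
        Mesh mesh = GetComponent<MeshFilter>().mesh;

        // Flip normals so lighting treats the inner surface as the front.
        Vector3[] normals = mesh.normals;
        for( int i = 0; i < normals.Length; i++ )
            normals[i] = -normals[i];
        mesh.normals = normals;

        // Reverse triangle winding so backface culling keeps the inside faces.
        for( int sub = 0; sub < mesh.subMeshCount; sub++ )
        {
            int[] tris = mesh.GetTriangles( sub );
            for( int i = 0; i < tris.Length; i += 3 )
            {
                int tmp = tris[i];
                tris[i] = tris[i + 1];
                tris[i + 1] = tmp;
            }
            mesh.SetTriangles( tris, sub );
        }
    }
}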

If using VR you don't even need a script: just enable the VR checkbox in Unity and the camera should automatically pick up the headset orientation. Since the camera is set up inside the sphere, the view updates with the headset's orientation.
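And the second sketch, the non-VR drag rotation from step 2. It assumes a YawPivot > PitchPivot > Camera hierarchy and a full-screen UI element (with an EventSystem and raycaster in the scene) that receives the drag events; all names are hypothetical:

using UnityEngine;
using UnityEngine.EventSystems;

public class PanoramaDragRotator : MonoBehaviour, IDragHandler
{
    [SerializeField] private Transform _yawPivot;   // outer transform, rotates around world Y
    [SerializeField] private Transform _pitchPivot; // inner transform, rotates around local X
    [SerializeField] private float _speed = 0.2f;

    public void OnDrag( PointerEventData eventData )
    {
        // Splitting yaw and pitch across two pivots keeps roll from creeping in.
        _yawPivot.Rotate( 0f, -eventData.delta.x * _speed, 0f, Space.World );
        _pitchPivot.Rotate( eventData.delta.y * _speed, 0f, 0f, Space.Self );
    }
}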

1 Like

Thanks for this information. There is interest in Unity in this community, but I have very little experience building Unity apps. It's great to see more activity around Unity, as it is such a powerful experience with the headset. 🙂

Just an FYI, if you want to experiment with higher-resolution video from the V, you can also use the USB cable.

For getLivePreview, you can increase the resolution on the V to 1920x960 at 8 fps, though that may cause motion sickness.
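As a sketch of how that might be set from Unity before starting the stream, assuming the previewFormat option from the API v2.1 reference (supported values depend on camera model and firmware):

using UnityEngine;
using UnityEngine.Networking;

// Sets the live preview size/rate via camera.setOptions; call before camera.getLivePreview.
public class SetPreviewFormat : MonoBehaviour
{
    void Start()
    {
        byte[] body = new System.Text.UTF8Encoding().GetBytes(
            "{ \"name\": \"camera.setOptions\", \"parameters\": { \"options\": " +
            "{ \"previewFormat\": { \"width\": 1920, \"height\": 960, \"framerate\": 8 } } } }" );

        UnityWebRequest uwr = new UnityWebRequest( "http://192.168.1.1/osc/commands/execute", "POST" );
        uwr.uploadHandler = new UploadHandlerRaw( body );
        uwr.downloadHandler = new DownloadHandlerBuffer(); // ordinary JSON reply, not a stream
        uwr.SetRequestHeader( "Content-Type", "application/json;charset=utf-8" );
        uwr.SendWebRequest();
    }
}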

1 Like