Live preview in JavaScript

Hi all,

I have an Ionic mobile application that relies heavily on a RICOH THETA S. My problem is that I need to show a live preview of the camera image. I’ve been trying to parse the result of getLivePreview using XMLHttpRequest and its responseText property, but I was only able to show broken images, never an actual one. Are there any known solutions to this?

Best regards

Bertalan


I think someone got it to work, but I can’t find the example.

Here’s an example using Unity.

You may be able to adapt the algorithm below.

List<byte> imageBytes = new List<byte>();
bool isLoadStart = false; // true once the JPEG start-of-image marker has been seen
byte oldByte = 0;         // previous byte, so two-byte markers can be detected
while (true) {
    byte byteData = reader.ReadByte();

    if (!isLoadStart) {
        // Look for the JPEG start-of-image marker, 0xFF 0xD8
        if (oldByte == 0xFF && byteData == 0xD8) {
            imageBytes.Add(0xFF);
            imageBytes.Add(0xD8);

            // Header found; collect bytes until the end marker
            isLoadStart = true;
        }
    } else {
        // Accumulate the frame's bytes
        imageBytes.Add(byteData);

        // The JPEG end-of-image marker is 0xFF 0xD9
        if (oldByte == 0xFF && byteData == 0xD9) {
            // imageBytes now holds one complete JPEG frame:
            // generate the image from this data and update the texture here,
            // then clear imageBytes and go back to scanning for the
            // next frame's start-of-image marker
            imageBytes.Clear();
            isLoadStart = false;
        }
    }
    oldByte = byteData;
}
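
If it helps, here is a rough sketch of the same scanning idea in browser JavaScript, using fetch() and ReadableStream instead of XMLHttpRequest. The endpoint, the sessionId handling, and the <img id="preview"> element are my assumptions, and CORS inside an Ionic WebView may still get in the way, so treat it as a starting point rather than a tested implementation:

// Sketch: read the MJPEG byte stream and display each JPEG frame.
// Assumes an <img id="preview"> element exists in the page.
async function startLivePreview(sessionId) {
    const response = await fetch('http://192.168.1.1/osc/commands/execute', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({
            name: 'camera._getLivePreview',
            parameters: { sessionId: sessionId }
        })
    });

    const reader = response.body.getReader();
    let buffer = [];      // bytes of the frame being assembled
    let inFrame = false;  // true once 0xFF 0xD8 has been seen
    let prev = 0;         // previous byte, for two-byte marker detection

    while (true) {
        const { done, value } = await reader.read();
        if (done) break;

        for (const byte of value) {
            if (!inFrame) {
                if (prev === 0xFF && byte === 0xD8) { // start-of-image marker
                    buffer = [0xFF, 0xD8];
                    inFrame = true;
                }
            } else {
                buffer.push(byte);
                if (prev === 0xFF && byte === 0xD9) { // end-of-image marker
                    const blob = new Blob([new Uint8Array(buffer)],
                                          { type: 'image/jpeg' });
                    const img = document.getElementById('preview');
                    URL.revokeObjectURL(img.src); // free the previous frame
                    img.src = URL.createObjectURL(blob);
                    inFrame = false;
                }
            }
            prev = byte;
        }
    }
}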

This guy was working on a Cordova project. I don’t know if he got Live Preview to work.

This might also be helpful:

The guy below got it to work, but did not fully open-source his code:

From DavidC

The problem is that responseObject is not one image. You receive images continuously while live preview is on. And I was wrong to say that _getLivePreview returns “done”; for this command, the response is not a dictionary.

Our code isn’t open sourced but here’s some of it. I have response serializers for both JSON and JPG/MP4 so I can read images and video data.

AFImageResponseSerializer *imageSerializer = [AFImageResponseSerializer serializer];
imageSerializer.acceptableContentTypes = [imageSerializer.acceptableContentTypes setByAddingObject:@"video/mp4"];

NSArray *responseSerializers = @[ [AFJSONResponseSerializer serializer], imageSerializer ];
manager.responseSerializer = [AFCompoundResponseSerializer compoundSerializerWithResponseSerializers:responseSerializers];

Each response contains one or more images (it depends on the iOS version, I think), so I split the response data by looking for the JPG start and end bytes:

__block NSMutableData *imageData = [[NSMutableData alloc] init];
[manager setDataTaskDidReceiveResponseBlock:^NSURLSessionResponseDisposition(NSURLSession *session, NSURLSessionDataTask *task, NSURLResponse *response) {
    // reset imageData at start of response
    [imageData replaceBytesInRange:NSMakeRange(0, imageData.length) withBytes:NULL length:0];
    return NSURLSessionResponseAllow;
}];

[manager setDataTaskDidReceiveDataBlock:^(NSURLSession *session, NSURLSessionDataTask *task, NSData *response) {
    [imageData appendData:response];

    // JPG starts with 0xFFD8 and ends with 0xFFD9
    // if imageData contains a complete JPG image
    //     - extract this JPG data
    //     - remove JPG data from imageData
    //     - draw it
    //        dispatch_async_main(^{
    //            [self updateLivePreviewImage:[UIImage imageWithData:jpgData]];
    //        });
}];

camera._getLivePreview is POST, like the others. The response is M-JPEG over HTTP. I don’t know if it returns “done” as well - I had to write all this before the Theta S was released. Even now, it’s not available to buy in the UK :o

I haven’t used HTML5, so I don’t know if an ArrayBufferView can be used for a response that isn’t a fixed size. I also saw a comment about using VIDEO tag for multipart/x-mixed-replace.
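
As a rough illustration of that last idea (purely hypothetical; the URL is made up):

// Hypothetical illustration; the URL is invented. Most browsers will
// render a multipart/x-mixed-replace MJPEG stream inside an <img>
// element (a <video> element won't take it), but only over a plain GET.
// camera._getLivePreview requires a POST, so this alone is not enough
// for the THETA S.
document.getElementById('preview').src = 'http://camera.local/stream.mjpeg';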

This code by gavinovsak saves pictures from the stream into the ./video directory:

// theta test

var request = require("request");
var MjpegConsumer = require("mjpeg-consumer");
var FileOnWrite = require("file-on-write");

var writer = new FileOnWrite({
    path: './video',
    ext: '.jpg'
});
var consumer = new MjpegConsumer();

// get session id
var sessionId = '';

request({
    method: 'POST',
    uri: 'http://192.168.1.1:80/osc/commands/execute',
    body: JSON.stringify({
        name: "camera.startSession",
        parameters: {}
    })
}, function (error, response, body) {
    if (!error && response.statusCode === 200) {
        var result = JSON.parse(body);
        sessionId = result.results.sessionId;
        console.log(result, result.results.sessionId);

        console.log('getting the preview');
        request({
            method: 'POST',
            uri: 'http://192.168.1.1:80/osc/commands/execute',
            body: JSON.stringify({
                name: "camera._getLivePreview",
                parameters: {
                    "sessionId": sessionId
                }
            })
        }).pipe(consumer).pipe(writer);

    } else {
        console.log(error, body);
    }
});

Hi!

Thank you for your quick answer. Unfortunately, none of the solutions above seems to be what I am looking for. Although the last one (MjpegConsumer) does work in JavaScript, it only works in server environments; there is no way to make it work in browsers. My plan now is to create a custom Cordova plugin that uses native Android and iOS code to get the frames from the live preview. I’ll let you know if I manage to do it, and I’ll probably publish the plugin once it is done.
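
For illustration only, the JavaScript side of such a plugin could look roughly like this (every name below is invented for the sketch, not the API of an actual plugin):

// Hypothetical sketch of the JS side of a live-preview plugin.
// 'OscLivePreview' and 'startLivePreview' are invented names.
// The idea: the native layer parses the MJPEG stream and delivers
// each JPEG frame to JavaScript as a base64 string; the native side
// must keep the callback alive so it can fire once per frame.
function startPreview(onError) {
    cordova.exec(
        function (frameBase64) {
            // A data URL feeds an <img> element directly
            document.getElementById('preview').src =
                'data:image/jpeg;base64,' + frameBase64;
        },
        onError,
        'OscLivePreview',    // hypothetical native service name
        'startLivePreview',  // hypothetical native action name
        []
    );
}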

Thanks

Bertalan


It would be awesome if you got that to work. I think there’s interest in using Cordova, but I don’t think anyone has released an example of the live preview working. A mobile app would need this feature for many applications.

BTW, what are you using for the UI and the 360 image navigation? I did some tests with jQuery and A-Frame a while ago, but did not get satisfactory results. I think this was about a year ago. At the time, WebVR support on mobile operating systems was still in flux. In the JavaScript space, a year is a long time, so I’m wondering if most mobile phones can support WebVR now.

Using Cordova/PhoneGap would open up a lot of possibilities for many developers.

I am using a package called ricoh-theta-viewer (https://github.com/rajeshpanda/ricoh-theta-viewer) to show the captured images. It was made for an Ionic project, so it works great on smartphones. It is pretty basic, but it does the job for me. I will probably switch to something else later on, especially if I get the live preview working (since this viewer won’t be able to show a video stream).


Thank you. I will check this out.

That link was helpful. :grinning:

Did you find a solution for the live preview of the camera image?

Hi!

I have actually solved the live preview by writing my own plugin, but I only implemented it for Android. Since I decided to rewrite my application natively for both platforms, I probably won’t be finishing the iOS implementation. If you are interested, however, there is a public GitHub repo: https://github.com/bertalan-radostyan/cordova-plugin-osc-livepreview

Maybe you can find something useful there :slight_smile:


This is great. Thanks for sharing the code.

I added a star to your repo! It’s really cool that you published it; much appreciated.