Live preview in JavaScript

I think someone got it to work, but I can’t find the example.

Here’s an example using Unity.

You may be able to adapt the algorithm below.

List<byte> imageBytes = new List<byte>();
bool isLoadStart = false; // flag: have we seen the JPEG start-of-image marker?
byte oldByte = 0;         // stores the previous byte
while (true) {
    byte byteData = reader.ReadByte();

    if (!isLoadStart) {
        if (oldByte == 0xFF && byteData == 0xD8) {
            // 0xFF 0xD8 is the JPEG start-of-image marker
            imageBytes.Add(0xFF);
            imageBytes.Add(0xD8);

            // collect everything from here up to the end marker
            isLoadStart = true;
        }
    } else {
        // accumulate the image bytes
        imageBytes.Add(byteData);

        // 0xFF 0xD9 is the JPEG end-of-image marker
        if (oldByte == 0xFF && byteData == 0xD9) {
            // one complete frame: generate the image from imageBytes here
            // and create the texture from it, then clear imageBytes and
            // loop back to look for the next start-of-image marker
            imageBytes.Clear();
            isLoadStart = false;
        }
    }
    oldByte = byteData;
}
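Since the question is about JavaScript, the same state machine can be ported there as a sketch. This is my own untested adaptation, assuming incoming stream data arrives as `Uint8Array` chunks; `createJpegExtractor` is a hypothetical helper name, not part of any library:

```javascript
// Sketch: the same start/end-marker state machine in JavaScript.
// The returned function scans a chunk of bytes and returns every complete
// JPEG frame it finds, carrying partial-frame state between calls so a
// frame may span several chunks.
function createJpegExtractor() {
  let imageBytes = [];
  let isLoadStart = false;
  let oldByte = 0;

  return function extractJpegFrames(chunk) {
    const frames = [];
    for (const byteData of chunk) {
      if (!isLoadStart) {
        if (oldByte === 0xFF && byteData === 0xD8) {
          // 0xFF 0xD8: JPEG start-of-image marker
          imageBytes = [0xFF, 0xD8];
          isLoadStart = true;
        }
      } else {
        imageBytes.push(byteData);
        if (oldByte === 0xFF && byteData === 0xD9) {
          // 0xFF 0xD9: JPEG end-of-image marker - emit one complete frame
          frames.push(Uint8Array.from(imageBytes));
          imageBytes = [];
          isLoadStart = false;
        }
      }
      oldByte = byteData;
    }
    return frames;
  };
}
```

Each returned `Uint8Array` is one JPEG; in a browser you could wrap it in a `Blob` and point an `img` element at it via an object URL.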

This guy was working on a Cordova project. I don’t know if he got Live Preview to work.

This might also be helpful:

The guy below got it to work, but did not fully open-source his code:

From DavidC

the problem is that responseObject is not one image. You receive images continuously while live preview is on. I was also wrong to say that _getLivePreview returns “done” - for this command, the response is not a dictionary.

Our code isn’t open-sourced, but here’s some of it. I have response serializers for both JSON and JPG/MP4 so I can read both image and video data.

AFImageResponseSerializer *imageSerializer = [AFImageResponseSerializer serializer];
imageSerializer.acceptableContentTypes = [imageSerializer.acceptableContentTypes setByAddingObject:@"video/mp4"];

NSArray *responseSerializers = @[ [AFJSONResponseSerializer serializer], imageSerializer ];
manager.responseSerializer = [AFCompoundResponseSerializer compoundSerializerWithResponseSerializers:responseSerializers];

Each response contains one or more images (it depends on the iOS version, I think), so I split the response data by looking for the JPG start and end bytes.

__block NSMutableData *imageData = [[NSMutableData alloc] init];
[manager setDataTaskDidReceiveResponseBlock:^NSURLSessionResponseDisposition(NSURLSession *session, NSURLSessionDataTask *task, NSURLResponse *response) {
    // reset imageData at start of response
    [imageData replaceBytesInRange:NSMakeRange(0, imageData.length) withBytes:NULL length:0];
    return NSURLSessionResponseAllow;
}];

[manager setDataTaskDidReceiveDataBlock:^(NSURLSession *session, NSURLSessionDataTask *task, NSData *response) {
    [imageData appendData:response];

    // JPG starts with 0xFFD8 and ends with 0xFFD9
    // if imageData contains a complete JPG image
    //     - extract this JPG data
    //     - remove JPG data from imageData
    //     - draw it
    //        dispatch_async_main(^{
    //            [self updateLivePreviewImage:[UIImage imageWithData:jpgData]];
    //        });
}];
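The splitting step that the comments above describe (find a complete JPG in the accumulated buffer, extract it, remove it, draw it) could be sketched in JavaScript roughly like this. The function and helper names (`splitJpegFrames`, `indexOfMarker`) are my own, and the sketch trusts that `0xFF 0xD9` only appears as the end marker, which holds well enough for typical M-JPEG streams:

```javascript
// Buffer-based variant of the splitting described above: given the bytes
// accumulated so far (a Uint8Array), cut out every complete JPEG
// (0xFF 0xD8 ... 0xFF 0xD9) and return the frames plus the leftover bytes.
function splitJpegFrames(buffer) {
  const frames = [];
  while (true) {
    const start = indexOfMarker(buffer, 0xFF, 0xD8, 0);
    if (start < 0) break;               // no start marker yet
    const end = indexOfMarker(buffer, 0xFF, 0xD9, start + 2);
    if (end < 0) break;                 // frame not complete yet
    frames.push(buffer.slice(start, end + 2)); // extract this JPG's data
    buffer = buffer.slice(end + 2);            // remove it from the buffer
  }
  return { frames, rest: buffer };
}

// Find the first position of the two-byte sequence a,b at or after `from`.
function indexOfMarker(buffer, a, b, from) {
  for (let i = from; i + 1 < buffer.length; i++) {
    if (buffer[i] === a && buffer[i + 1] === b) return i;
  }
  return -1;
}
```

You would call `splitJpegFrames` each time new data arrives, draw the returned frames, and keep `rest` as the starting buffer for the next call.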

camera._getLivePreview is POST, like the others. The response is M-JPEG over HTTP. I don’t know if it returns “done” as well - I had to write all this before the Theta S was released. Even now, it’s not available to buy in the UK :o

I haven’t used HTML5, so I don’t know if an ArrayBufferView can be used for a response that isn’t a fixed size. I also saw a comment about using the VIDEO tag for multipart/x-mixed-replace.
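For what it’s worth, browsers that support the Streams API don’t need a fixed-size buffer: `fetch()` exposes `response.body` as a ReadableStream that delivers `Uint8Array` chunks of arbitrary length. A sketch (untested against the camera; `feedToJpegExtractor` is a hypothetical hook for whatever frame-splitting code you use):

```javascript
// Sketch: consuming a response of unknown length chunk by chunk with the
// Streams API, instead of reading into a fixed-size ArrayBufferView.
async function consumeStream(stream, onChunk) {
  const reader = stream.getReader();
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    onChunk(value); // value is a Uint8Array of arbitrary length
  }
}

// In a browser this might be wired up like (untested against the Theta):
// const response = await fetch("http://192.168.1.1/osc/commands/execute", {
//   method: "POST",
//   body: JSON.stringify({ name: "camera._getLivePreview",
//                          parameters: { sessionId: sessionId } })
// });
// await consumeStream(response.body, chunk => feedToJpegExtractor(chunk));
```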

This code by gavinovsak saves pictures from the stream to the /video directory:

// theta test

var request = require("request");
var MjpegConsumer = require("mjpeg-consumer");
var FileOnWrite = require("file-on-write");

var writer = new FileOnWrite({
    path: './video',
    ext: '.jpg'
});
var consumer = new MjpegConsumer();

// get session id
var sessionId = '';

request({
    method: 'POST',
    uri: 'http://192.168.1.1:80/osc/commands/execute',
    body: JSON.stringify({
        name: "camera.startSession",
        parameters: {}
    })
}, function (error, response, body) {
    if (!error && response.statusCode == 200) {
        var result = JSON.parse(body);
        sessionId = result.results.sessionId;
        console.log(result, result.results.sessionId);

        console.log('getting the preview');
        request({
            method: 'POST',
            uri: 'http://192.168.1.1:80/osc/commands/execute',
            body: JSON.stringify({
                name: "camera._getLivePreview",
                parameters: {
                    "sessionId": sessionId
                }
            })
        }).pipe(consumer).pipe(writer);
    } else {
        console.log(error, body);
    }
});