startPreview doesn't seem to work

Hi,

I am trying to develop a plugin that uses the startPreview feature of the Camera API.
For some reason, the onPreviewFrame callback is never called.
I tried deploying the same app on an Android phone and there it works perfectly.

I'm pretty sure that I selected relevant, supported resolutions and formats by using the parameters' getSupportedPreviewFormats and getSupportedPreviewSizes.
I also set the Surface I created with setPreviewDisplay, and used setPreviewCallback to register my callback. Finally, I called the camera's startPreview method, but the callback is still never invoked.

Again, it works perfectly on an Android phone, so the code at least follows standard Android Camera API usage.
I did send the notification to close the main camera before starting all of this.
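
For reference, here is roughly what my setup code looks like (trimmed sketch, not the full plugin; it uses android.hardware.Camera, android.view.SurfaceHolder, android.util.Log and java.io.IOException, and just picks the first supported size/format):

// Sketch of the preview setup described above. The camera is opened after the
// camera-close notification; holder belongs to the Surface mentioned above.
private void startCameraPreview(Camera camera, SurfaceHolder holder) throws IOException {
    Camera.Parameters params = camera.getParameters();

    // Use a size and format that the camera itself reports as supported.
    Camera.Size size = params.getSupportedPreviewSizes().get(0);
    params.setPreviewSize(size.width, size.height);
    params.setPreviewFormat(params.getSupportedPreviewFormats().get(0));
    camera.setParameters(params);

    camera.setPreviewDisplay(holder);
    camera.setPreviewCallback(new Camera.PreviewCallback() {
        @Override
        public void onPreviewFrame(byte[] data, Camera cam) {
            // Never logged on the Z1, although it shows up on the phone.
            Log.d("debug", "onPreviewFrame: " + data.length + " bytes");
        }
    });
    camera.startPreview();
}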

Any idea what would cause it not to work?

I see some errors (not sure if related) in the logcat (there are too many to paste, but here are some):

2019-11-21 13:12:23.446 514-2611/? E/QCamera: <HAL><ERROR> int32_t qcamera::QCameraParameters::getStreamDimension(cam_stream_type_t, cam_dimension_t &): 12941: int32_t qcamera::QCameraParameters::getStreamDimension(cam_stream_type_t, cam_dimension_t &) No apropriate resolution found in list! w=6720 h=3360 stitch=0
2019-11-21 13:12:23.455 514-2624/? E/QCamera: <HAL><ERROR> int32_t qcamera::QCameraParameters::getStreamDimension(cam_stream_type_t, cam_dimension_t &): 12971: int32_t qcamera::QCameraParameters::getStreamDimension(cam_stream_type_t, cam_dimension_t &) Found res w=3648 h=3648 stitch=0
2019-11-21 13:12:23.455 514-2624/? E/QCamera: <HAL><ERROR> int32_t qcamera::QCameraParameters::getStreamDimension(cam_stream_type_t, cam_dimension_t &): 12941: int32_t qcamera::QCameraParameters::getStreamDimension(cam_stream_type_t, cam_dimension_t &) No apropriate resolution found in list! w=6720 h=3360 stitch=0
2019-11-21 13:12:23.455 514-2624/? E/QCamera: <HAL><ERROR> int32_t qcamera::QCameraParameters::getStreamFormat(cam_stream_type_t, cam_format_t &): 12744: Raw stream format 34 bundled with snapshot
2019-11-21 13:12:23.455 514-2624/? E/QCamera: <HAL><ERROR> int32_t qcamera::QCameraParameters::getStreamDimension(cam_stream_type_t, cam_dimension_t &): 12934: int32_t qcamera::QCameraParameters::getStreamDimension(cam_stream_type_t, cam_dimension_t &) No apropriate resolution found in list! w=480 h=640 stitch=0

Thank you!
Roy.

Some examples with Live Preview working

Try and use one of the resolutions listed below.

[screenshot: supported Live Preview resolutions]

Your values look a bit different

[screenshot: the resolution values from your log]
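
You can also dump what the camera actually reports and compare it with the list above, for example (quick sketch, assuming camera is your open Camera instance):

Camera.Parameters p = camera.getParameters();
for (Camera.Size s : p.getSupportedPreviewSizes()) {
    Log.d("debug", "supported preview size: " + s.width + "x" + s.height);
}
for (int format : p.getSupportedPreviewFormats()) {
    // values are android.graphics.ImageFormat constants, e.g. 17 = NV21
    Log.d("debug", "supported preview format: " + format);
}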


Thank you so much for your response!
What I was missing was setting the mode to RicMoviePreview.
After I did that it started working.
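
In case it helps anyone else, the missing piece boils down to one parameter set before startPreview (I am assuming the RIC_SHOOTING_MODE key here, like the other RIC_* parameters; check the plugin Camera API docs for the exact value string for your preview size):

Camera.Parameters p = mCamera.getParameters();
p.set("RIC_SHOOTING_MODE", "RicMoviePreview");  // without this, onPreviewFrame was never called
mCamera.setParameters(p);
mCamera.startPreview();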

However, the preview pictures are still not as well focused as they are when just taking normal still images.

My end goal is to get ~2 fps, but at very high (and well-focused) image quality.

I tried a few options. With normal still-image shooting I can get ~1 image every 2 seconds (I need roughly 4x faster than that) if I disable the equirectangular stitching and keep just the dual fish-eye output.

I was thinking that JPEG compression might be what makes it slow, so I tried shooting RAW images using the "rawsave-mode" parameter, but unfortunately I wasn't able to get it to work without also getting the JPEG in the callback function.
Also, if you query getSupportedPictureFormats, only JPEG is reported, and the RAW callback passed to takePicture is never called.
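
For reference, this is roughly what the RAW attempt looked like (sketch; shutterCallback and jpegCallback are my existing callbacks, and 1 is just my guess for enabling rawsave-mode):

Camera.Parameters p = mCamera.getParameters();
Log.d("debug", "picture formats: " + p.getSupportedPictureFormats());  // only JPEG (256) is listed
p.set("rawsave-mode", 1);
mCamera.setParameters(p);

mCamera.takePicture(
        shutterCallback,
        new Camera.PictureCallback() {  // RAW callback: never invoked
            @Override
            public void onPictureTaken(byte[] data, Camera camera) {
                Log.d("debug", "raw: " + (data == null ? 0 : data.length) + " bytes");
            }
        },
        jpegCallback);                  // the JPEG still arrives here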

Can you think of any way to achieve a good, well-focused, high-quality image every ~0.5 seconds with the THETA Z1?

Thanks!
Roy.


With the V, you can take a picture every second using the Camera API and unstitched images.

I have not measured this with the Z1.

This likely only works with JPG images, not DNG.

The camera stitches the image after each picture is taken, causing a delay.

I tested the Z1 with DNG images using the Wi-Fi API and could not get the interval below 6 seconds with DNG. It may work if you use JPG only.

Bottlenecks

  • Internal stitching time
  • DNG image

If you do additional tests with the Z1, please post your results.

When disabling both DNG, using:
mParameters.set("RIC_DNG_OUTPUT_ENABLED", 0);
mParameters.set("rawsave-mode", 0);

and stitching, using:
mParameters.set("RIC_PROC_STITCHING", "RicNonStitching");

I still get a picture only every ~1.5 s.

Also, I added debug prints in the onShutter callback. It is called twice, so my assumption is that the first call is when the shutter opens and the second is when it closes.
You can see that here:
D/debug: 2019-11-27 10:38:46.982 Shutter is called
D/debug: 2019-11-27 10:38:47.015 Shutter is called
The image-taking process itself takes only ~50 ms; all the remaining time is spent internally on processing the image.

I also added debug prints to check whether saving the image to storage is the bottleneck:
D/debug: 2019-11-27 10:41:14.447 Start saving picture
D/debug: 2019-11-27 10:41:14.462 Done saving picture
Saving the data takes only ~15 ms, so that is definitely not it.

The bottleneck is between the moment the shutter closes and the point where I get the JPEG byte array.
So either the YV12/NV21 -> JPEG conversion or reading the data from the controller is the bottleneck here.
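
For completeness, the debug prints are just log lines inside the standard callbacks, roughly like this (sketch; saveToStorage is my existing save routine, and the timestamps above come from logcat):

private final Camera.ShutterCallback shutterCallback = new Camera.ShutterCallback() {
    @Override
    public void onShutter() {
        Log.d("debug", "Shutter is called");  // fires twice, ~50 ms apart
    }
};

private final Camera.PictureCallback jpegCallback = new Camera.PictureCallback() {
    @Override
    public void onPictureTaken(byte[] data, Camera camera) {
        // The ~1.5 s gap sits between the second onShutter and this call.
        Log.d("debug", "Start saving picture");
        saveToStorage(data);  // ~15 ms
        Log.d("debug", "Done saving picture");
    }
};

// capture is triggered with: mCamera.takePicture(shutterCallback, null, jpegCallback);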

Any suggestions? I wouldn’t mind getting the image as YV12/NV21, but it doesn’t seem like the Z1 has this option when taking still images.

Thanks.
