
I am trying to understand ZSL feature/capability support on Android 5.0, from the camera application, camera framework, and libcameraservice implementation, as well as the camera HAL v3.2 specification.

As far as I understand, ZSL can be implemented in Android in two ways:

  1. Framework-implemented ZSL

  2. Application-implemented ZSL

    • Lollipop introduced the concept of application-implemented ZSL. ZSL is exposed as a capability to the application, per the available documentation: http://androidxref.com/5.0.0_r2/xref/system/media/camera/docs/docs.html

    • Under android.request.availableCapabilities, it says that for ZSL,
      "RAW_OPAQUE is supported as an output/input format" (see the capability-query
      sketch after this list).
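
For reference, a minimal sketch of querying android.request.availableCapabilities through the public camera2 API (assuming a Context and camera permission are already available). On the 5.0 SDK the ZSL value described in docs.html does not appear to have a public constant, so this only logs the raw integer values:

    import android.content.Context;
    import android.hardware.camera2.CameraAccessException;
    import android.hardware.camera2.CameraCharacteristics;
    import android.hardware.camera2.CameraManager;
    import android.util.Log;

    public final class CapabilityDump {
        public static void logCapabilities(Context context) throws CameraAccessException {
            CameraManager manager =
                    (CameraManager) context.getSystemService(Context.CAMERA_SERVICE);
            for (String cameraId : manager.getCameraIdList()) {
                CameraCharacteristics chars = manager.getCameraCharacteristics(cameraId);
                int[] caps = chars.get(CameraCharacteristics.REQUEST_AVAILABLE_CAPABILITIES);
                if (caps == null) continue;
                for (int cap : caps) {
                    // Public values on API 21 are BACKWARD_COMPATIBLE, MANUAL_SENSOR,
                    // MANUAL_POST_PROCESSING and RAW (see CameraMetadata); anything
                    // else shows up only as a number here.
                    Log.d("ZSL", "Camera " + cameraId + " capability value: " + cap);
                }
            }
        }
    }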

In Lollipop, framework-implemented ZSL works the same way as in KitKat, with a Camera1 API application.

However, I could not find anywhere in the Camera2 API application code how to enable application- or framework-implemented ZSL. http://androidxref.com/5.0.0_r2/xref/packages/apps/Camera2/

Hence, the questions:

  1. Is it possible to enable framework-implemented ZSL in Android L, with a Camera2 API application?

  2. Is it possible to enable application-implemented ZSL in Android L, without RAW_OPAQUE support, with a Camera2 API application?

  3. If either 1 or 2 is possible, what is required from the camera HAL to enable ZSL in Android L?

Any help appreciated.

Sumit Agrawal
1 Answer

  1. No, the framework-layer ZSL only works with the old camera API.

  2. No, unless it's sufficient to use the output buffer as-is, without sending it back to the camera device for final processing.

The longer answer is that the ZSL reprocessing APIs had to be cut out of the initial camera2 implementation, so currently there's no way for an application to send buffers back to the camera device, in any format (RAW_OPAQUE or otherwise).
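
To make option 2's "use the output buffer as-is" concrete, here is a rough sketch of an application-side substitute on Android 5.0: keep the newest YUV frame from an ImageReader attached to the repeating request, and JPEG-encode it in the app at shutter time. The class and method names are illustrative only, and width, height, and backgroundHandler are assumed to come from the surrounding capture setup:

    import android.graphics.ImageFormat;
    import android.media.Image;
    import android.media.ImageReader;
    import android.os.Handler;

    public final class AppSideZslBuffer {
        private final Object frameLock = new Object();
        private Image latestFrame;

        public ImageReader createReader(int width, int height, Handler backgroundHandler) {
            ImageReader zslReader =
                    ImageReader.newInstance(width, height, ImageFormat.YUV_420_888, /*maxImages*/ 4);
            zslReader.setOnImageAvailableListener(new ImageReader.OnImageAvailableListener() {
                @Override
                public void onImageAvailable(ImageReader reader) {
                    Image next = reader.acquireLatestImage();
                    if (next == null) return;
                    synchronized (frameLock) {
                        if (latestFrame != null) {
                            latestFrame.close(); // release the older frame the app was holding
                        }
                        latestFrame = next;
                    }
                }
            }, backgroundHandler);
            return zslReader;
        }

        // At shutter press: take ownership of the newest frame and encode it in the
        // app, since there is no way to hand it back to the camera device on 5.0.
        public Image takeLatestFrame() {
            synchronized (frameLock) {
                Image frame = latestFrame;
                latestFrame = null;
                return frame;
            }
        }
    }

The reader's getSurface() would be added as an extra target of the repeating preview request so the buffered frame stays current.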

Some of the documentation in camera3.h is misleading relative to the actual framework implementation, as well - only IMPLEMENTATION_DEFINED BIDIRECTIONAL ZSL is supported by the framework, and RAW_OPAQUE is not used anywhere.

Edit: As of Android 6.0 Marshmallow, reprocessing is available in the camera2 API, on devices that support it (such as Nexus 6P/5X).
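
As a rough sketch of how that reprocessing path is wired up on such a device (assuming it reports the YUV_REPROCESSING capability; zslImage/zslResult are placeholders for an Image and TotalCaptureResult captured earlier, and yuvReader/jpegReader are the app's output ImageReaders):

    import android.graphics.ImageFormat;
    import android.hardware.camera2.CameraAccessException;
    import android.hardware.camera2.CameraCaptureSession;
    import android.hardware.camera2.CameraDevice;
    import android.hardware.camera2.CaptureRequest;
    import android.hardware.camera2.TotalCaptureResult;
    import android.hardware.camera2.params.InputConfiguration;
    import android.media.Image;
    import android.media.ImageReader;
    import android.media.ImageWriter;
    import android.os.Handler;
    import java.util.Arrays;

    public final class ReprocessSketch {
        public static void reprocess(final CameraDevice cameraDevice,
                                     int width, int height,
                                     final ImageReader yuvReader,
                                     final ImageReader jpegReader,
                                     final Image zslImage,
                                     final TotalCaptureResult zslResult,
                                     final Handler backgroundHandler)
                throws CameraAccessException {
            // The input stream is the path the app uses to send a buffer back to
            // the camera device for final processing.
            InputConfiguration inputConfig =
                    new InputConfiguration(width, height, ImageFormat.YUV_420_888);

            cameraDevice.createReprocessableCaptureSession(
                    inputConfig,
                    Arrays.asList(yuvReader.getSurface(), jpegReader.getSurface()),
                    new CameraCaptureSession.StateCallback() {
                        @Override
                        public void onConfigured(CameraCaptureSession session) {
                            try {
                                // ImageWriter queues the saved ZSL frame into the
                                // session's input surface.
                                ImageWriter writer = ImageWriter.newInstance(
                                        session.getInputSurface(), /*maxImages*/ 2);
                                writer.queueInputImage(zslImage);

                                CaptureRequest.Builder reprocess =
                                        cameraDevice.createReprocessCaptureRequest(zslResult);
                                reprocess.addTarget(jpegReader.getSurface());
                                session.capture(reprocess.build(), null, backgroundHandler);
                            } catch (CameraAccessException e) {
                                e.printStackTrace();
                            }
                        }

                        @Override
                        public void onConfigureFailed(CameraCaptureSession session) {
                            // Not handled in this sketch.
                        }
                    },
                    backgroundHandler);
        }
    }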

Eddy Talvala
  • Hi Eddy, thanks for the answer. In the future, when ZSL APIs are added to camera2, will LIMITED-mode camera devices (which cannot handle the RAW_OPAQUE format) be able to support the ZSL feature? In that case, the application could select an appropriate IMPLEMENTATION_DEFINED-format YUV buffer from the ZSL stream and send it back to the camera device for JPEG encoding (no other post-processing). – Sumit Agrawal Feb 11 '15 at 05:02
  • A baseline LIMITED device will not be required to support reprocessing, but it's one of the features a device can support as either a LIMITED or FULL device. IMPLEMENTATION_DEFINED to JPEG is already working in the framework layers, so it's likely to work with future API additions. – Eddy Talvala Feb 17 '15 at 17:33