The app framework issues requests for captured results to the camera subsystem. One request corresponds to one set of results. A request encapsulates all configuration information about the capturing and processing of those results. This includes things such as resolution and pixel format; manual sensor, lens, and flash control; 3A operating modes; RAW-to-YUV processing control; and statistics generation. This allows for much more control over the output and processing of results. Multiple requests can be in flight at once, and submitting requests is non-blocking. Requests are always processed in the order they are received.
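The queueing behavior described above can be modeled with a small sketch. The `CapturePipeline` class and its method names are illustrative stand-ins, not part of the real API; the sketch only demonstrates the non-blocking, bounded, first-in-first-out contract:

```python
from collections import deque

class CapturePipeline:
    """Illustrative model of the request queue: submission is
    non-blocking, several requests may be in flight at once, and
    results come back in submission (FIFO) order."""

    def __init__(self, max_in_flight=4):
        self.max_in_flight = max_in_flight  # pipeline depth
        self.in_flight = deque()

    def submit(self, request):
        # Non-blocking from the app's point of view; throttling only
        # happens once the pipeline is full.
        if len(self.in_flight) >= self.max_in_flight:
            raise BufferError("pipeline full; wait for a result")
        self.in_flight.append(request)

    def retire_next(self):
        # Results are produced strictly in submission order.
        return self.in_flight.popleft()

pipeline = CapturePipeline(max_in_flight=3)
for frame in (1, 2, 3):
    pipeline.submit({"frame": frame, "format": "YUV_420_888"})
results = [pipeline.retire_next()["frame"] for _ in range(3)]
print(results)  # FIFO order: [1, 2, 3]
```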
The camera subsystem includes the implementations for components in the camera pipeline, such as the 3A algorithms and processing controls. The camera HAL provides interfaces for you to implement your versions of these components. To maintain cross-platform compatibility between multiple device manufacturers and Image Signal Processor (ISP, or camera sensor) vendors, the camera pipeline model is virtual and does not directly correspond to any real ISP. However, it is similar enough to real processing pipelines that you can map it to your hardware efficiently. In addition, it is abstract enough to allow for multiple different algorithms and orders of operation without compromising quality, efficiency, or cross-device compatibility.
The camera pipeline also supports triggers that the app framework can initiate to turn on features such as auto-focus, and it sends notifications back to the app framework for events such as an auto-focus lock or an error.
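The trigger-and-notification flow above amounts to a callback mechanism. The sketch below models it with invented names (`NotifyingPipeline`, `trigger_autofocus`) that do not correspond to the real HAL interfaces:

```python
# Illustrative model: the framework fires a trigger (e.g. start
# autofocus) and the pipeline later posts an event back through a
# registered callback. Names here are invented for illustration.

class NotifyingPipeline:
    def __init__(self):
        self._callbacks = []

    def register_callback(self, fn):
        self._callbacks.append(fn)

    def trigger_autofocus(self):
        # A real pipeline would drive the lens asynchronously; here we
        # report a focus lock immediately for demonstration.
        self._notify({"event": "AF_LOCKED"})

    def _notify(self, event):
        for fn in self._callbacks:
            fn(event)

events = []
pipe = NotifyingPipeline()
pipe.register_callback(events.append)
pipe.trigger_autofocus()
print(events)  # [{'event': 'AF_LOCKED'}]
```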
Note that some image processing blocks shown in the diagram above are not well-defined in the initial release. The camera pipeline makes the following assumptions:

- RAW Bayer output undergoes no processing inside the ISP pipeline.
- Statistics are generated based off the raw sensor data.
Summary of API use
This is a brief summary of the steps for using the Android camera API. See the
Startup and expected operation sequence section for a detailed breakdown of
these steps, including API calls.
HAL operation summary
This section contains a detailed explanation of the steps expected when using the camera API. Please see platform/hardware/interfaces/camera/ for HIDL interface definitions.
1. The framework asynchronously listens for camera providers via the ICameraProvider interface. If such a provider or providers are present, the framework tries to establish a connection.
2. The framework asks for a list of camera devices via ICameraProvider::getCameraIdList().
3. The framework instantiates a new ICameraDevice by calling the respective ICameraProvider::getCameraDeviceInterface_VX_X().
4. The framework calls ICameraDevice::open() to create a new active capture session, ICameraDeviceSession.
5. The framework calls ICameraDeviceSession::configureStreams() with a list of input/output streams to the HAL device.
6. The framework requests default settings for some use cases with calls to ICameraDeviceSession::constructDefaultRequestSettings(). This may occur at any time after the ICameraDeviceSession is created by ICameraDevice::open().
7. The framework constructs and sends the first capture request to the HAL with ICameraDeviceSession::processCaptureRequest(). The HAL must block the return of this call until it is ready for the next request to be sent.
8. The framework continues to submit requests, calling ICameraDeviceSession::constructDefaultRequestSettings() to get default settings buffers for other use cases as necessary.
9. When the capture of a request begins, the HAL calls ICameraDeviceCallback::notify() with the SHUTTER message, including the frame number and the timestamp for start of exposure. This notify callback does not have to happen before the first processCaptureResult() call for a request, but no results are delivered to an application for a capture until after notify() for that capture is called.
10. After some pipeline delay, the HAL begins to return completed captures to the framework with ICameraDeviceCallback::processCaptureResult(). These are returned in the same order as the requests were submitted. Multiple requests can be in flight at once, depending on the pipeline depth of the camera HAL device.

After some time, one of the following will occur:

- The framework may stop submitting new requests, wait for the existing captures to complete (all buffers filled, all results returned), and then call ICameraDeviceSession::configureStreams() again. This resets the camera hardware and pipeline for a new set of input/output streams. Some streams may be reused from the previous configuration. The framework then continues from the first capture request to the HAL, if at least one registered output stream remains. (Otherwise, ICameraDeviceSession::configureStreams() is required first.)
- The framework may call ICameraDeviceSession::close() to end the camera session. This may be called at any time when no other calls from the framework are active, although the call may block until all in-flight captures have completed (all results returned, all buffers filled). After the close() call returns, no more calls to ICameraDeviceCallback are allowed from the HAL. Once the close() call is underway, the framework may not call any other HAL device functions.
- In case of an error or other asynchronous event, the HAL must call ICameraDeviceCallback::notify() with the appropriate error/event message. After returning from a fatal device-wide error notification, the HAL should act as if close() had been called on it. However, the HAL must either cancel or complete all outstanding captures before calling notify(), so that once notify() is called with a fatal error, the framework will not receive further callbacks from the device. Methods besides close() should return -ENODEV or NULL after the notify() method returns from a fatal error message.

Camera devices can implement several hardware levels depending on their capabilities. For more information, see supported hardware level.
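The startup-to-shutdown sequence above can be sketched as a small state model. The `HalSessionModel` class below is an illustrative stand-in for the HIDL ICameraDevice/ICameraDeviceSession interfaces, not real HAL code; it only demonstrates the legal call ordering, the notify-before-result rule, and in-order result delivery:

```python
# Minimal sketch of the HAL session lifecycle described above.
# States and method names are illustrative; the real interfaces are
# the HIDL ICameraDevice / ICameraDeviceSession definitions.

class HalSessionModel:
    def __init__(self):
        self.state = "CLOSED"
        self.in_flight = []
        self.results = []

    def open(self):
        assert self.state == "CLOSED"
        self.state = "OPENED"

    def configure_streams(self, streams):
        # Legal only when no captures are in flight.
        assert self.state in ("OPENED", "CONFIGURED") and not self.in_flight
        self.streams = streams
        self.state = "CONFIGURED"

    def process_capture_request(self, frame_number):
        assert self.state == "CONFIGURED"
        self.in_flight.append(frame_number)

    def retire_capture(self):
        # notify(SHUTTER) must precede result delivery to the app;
        # results come back in submission order.
        frame = self.in_flight.pop(0)
        self.results.append(("SHUTTER", frame))
        self.results.append(("RESULT", frame))

    def close(self):
        # close() may block until in-flight captures complete;
        # modeled here by draining them before changing state.
        while self.in_flight:
            self.retire_capture()
        self.state = "CLOSED"

s = HalSessionModel()
s.open()
s.configure_streams(["preview"])
s.process_capture_request(0)
s.process_capture_request(1)
s.close()
print(s.results)
# [('SHUTTER', 0), ('RESULT', 0), ('SHUTTER', 1), ('RESULT', 1)]
```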
Depending on the settings in the 3A control block, the camera pipeline ignores some of the parameters in the application's capture request and uses the values provided by the 3A control routines instead. For example, when auto-exposure is active, the exposure time, frame duration, and sensitivity parameters of the sensor are controlled by the platform 3A algorithm, and any app-specified values are ignored. The values chosen for the frame by the 3A routines must be reported in the output metadata. The following table describes the different modes of the 3A control block and the properties that are controlled by these modes. See the platform/system/media/camera/docs/docs.html file for definitions of these properties.
| Parameter | State | Properties controlled |
|---|---|---|
| android.control.aeMode | OFF | None |
| | ON | android.sensor.exposureTime, android.sensor.frameDuration, android.sensor.sensitivity, android.lens.aperture (if supported), android.lens.filterDensity (if supported) |
| | ON_AUTO_FLASH | Everything in ON, plus android.flash.firingPower, android.flash.firingTime, and android.flash.mode |
| | ON_ALWAYS_FLASH | Same as ON_AUTO_FLASH |
| | ON_AUTO_FLASH_RED_EYE | Same as ON_AUTO_FLASH |
| android.control.awbMode | OFF | None |
| | WHITE_BALANCE_* | android.colorCorrection.transform, plus platform-specific adjustments if android.colorCorrection.mode is FAST or HIGH_QUALITY |
| android.control.afMode | OFF | None |
| | FOCUS_MODE_* | android.lens.focusDistance |
| android.control.videoStabilization | OFF | None |
| | ON | Can adjust android.scaler.cropRegion to implement video stabilization |
| android.control.mode | OFF | AE, AWB, and AF are disabled |
| | AUTO | Individual AE, AWB, and AF settings are used |
| | SCENE_MODE_* | Can override all parameters listed above; individual 3A controls are disabled |
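As a rough illustration of this override behavior, the sketch below shows auto-exposure replacing app-supplied values and reporting the values actually used in the output metadata. The `apply_ae()` helper and the numeric values are invented for the sketch; only the android.* key names come from the metadata definitions:

```python
# Sketch of how an active 3A routine overrides app-supplied settings.
# The request keys mirror android.* property names; apply_ae() and its
# chosen values are illustrative only.

AE_CONTROLLED_KEYS = (
    "android.sensor.exposureTime",
    "android.sensor.frameDuration",
    "android.sensor.sensitivity",
)

def apply_ae(request, ae_algorithm_output):
    """Return (effective_settings, output_metadata) for one frame."""
    effective = dict(request)
    metadata = {}
    if request.get("android.control.aeMode") != "OFF":
        # AE is active: ignore app values for the controlled keys and
        # substitute the 3A routine's choices...
        for key in AE_CONTROLLED_KEYS:
            effective[key] = ae_algorithm_output[key]
        # ...and report the values actually used in the output metadata.
        metadata.update({k: effective[k] for k in AE_CONTROLLED_KEYS})
    return effective, metadata

request = {
    "android.control.aeMode": "ON",
    "android.sensor.exposureTime": 10_000_000,  # app value, ignored
}
ae_out = {
    "android.sensor.exposureTime": 33_333_333,
    "android.sensor.frameDuration": 33_333_333,
    "android.sensor.sensitivity": 400,
}
effective, metadata = apply_ae(request, ae_out)
print(effective["android.sensor.exposureTime"])  # 33333333, not the app value
```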
The controls in the Image Processing block in Figure 2 all operate on a similar principle, and generally each block has three modes:

- OFF: The processing block is disabled.
- FAST: The block operates without slowing the output frame rate relative to OFF.
- HIGH_QUALITY: The block operates at the best quality it can achieve, possibly slowing the output frame rate.
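A per-block mode switch of this kind can be sketched as simple dispatch, assuming the OFF/FAST/HIGH_QUALITY convention used by properties such as android.colorCorrection.mode; the processing functions here are placeholders, not real algorithms:

```python
# Illustrative dispatch for a single image processing block.

def process_block(pixels, mode):
    if mode == "OFF":
        return pixels                     # block disabled: pass-through
    if mode == "FAST":
        return fast_path(pixels)          # must not reduce frame rate
    if mode == "HIGH_QUALITY":
        return high_quality_path(pixels)  # may reduce frame rate
    raise ValueError(f"unknown mode: {mode}")

def fast_path(pixels):
    return list(pixels)  # placeholder for a cheap approximation

def high_quality_path(pixels):
    return list(pixels)  # placeholder for the full algorithm

print(process_block([1, 2, 3], "OFF"))  # [1, 2, 3]
```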
The maximum frame rate that can be supported by a camera subsystem is a function of many factors:

- Requested resolutions of output image streams
- Availability of binning and skipping modes on the imager
- The bandwidth of the imager interface
- The bandwidth of the various ISP processing blocks
Since these factors can vary greatly between different ISPs and sensors, the camera HAL interface tries to abstract the bandwidth restrictions into as simple a model as possible. The model presented has the following characteristics: