authorAndroid Partner Docs <noreply@android.com>2018-05-22 12:01:47 -0700
committerClay Murphy <claym@google.com>2018-05-22 15:00:17 -0700
commit4d251019ee1b9845580c3d916655c93400222be7 (patch)
treee4bd67b07d48efad341ecb4dc83a7223184a2c98 /en/devices
parent3c0356087cc57a0a4090de49b27374655b64fe7a (diff)
downloadsource.android.com-4d251019ee1b9845580c3d916655c93400222be7.tar.gz
Docs: Changes to source.android.com
- 197600621 Devsite localized content from translation request 873033... by Android Partner Docs <noreply@android.com>
- 197596066 Update Support Library -> AndroidX (Support Library) by Christina Nguyen <cqn@google.com>
- 197586819 HIDL: clarify documentation about documentation comments by Android Partner Docs <noreply@android.com>
- 197579595 Devsite localized content from translation request 415584... by Android Partner Docs <noreply@android.com>
- 197576890 Publish April and May Security/ Pixel bulletins by Danielle Roberts <daroberts@google.com>
- 197559095 Update camera doc content by Kenneth Lau <kennethlau@google.com>
- 197559031 Revise content to refer to HIDL interface by Kenneth Lau <kennethlau@google.com>
- 197558968 Update camera images by Kenneth Lau <kennethlau@google.com>
- 197492822 Devsite localized content from translation request 3347f0... by Android Partner Docs <noreply@android.com>
- 197423991 Devsite localized content from translation request 35d612... by Android Partner Docs <noreply@android.com>
- 197423968 Devsite localized content from translation request 845f73... by Android Partner Docs <noreply@android.com>
- 197423953 Devsite localized content from translation request 415584... by Android Partner Docs <noreply@android.com>
- 197423874 Devsite localized content from translation request 0a49fc... by Android Partner Docs <noreply@android.com>
- 197423857 Devsite localized content from translation request c007e9... by Android Partner Docs <noreply@android.com>
- 197423850 Devsite localized content from translation request 99ab33... by Android Partner Docs <noreply@android.com>
- 197198470 Added CVE-2017-0740 to the Acknowledgements page. by Android Partner Docs <noreply@android.com>
- 197180447 Replace KMSG with ramoops by Clay Murphy <claym@google.com>
- 197056364 Improve readability of content by Kenneth Lau <kennethlau@google.com>
- 197029559 Devsite localized content from translation request e1b1c4... by Android Partner Docs <noreply@android.com>
- 197029538 Devsite localized content from translation request 697929... by Android Partner Docs <noreply@android.com>
- 196891796 Add Nokia to the bulletin index page by Danielle Roberts <daroberts@google.com>
- 196847852 Update CTS/CTS-Verifier downloads for CTS-May-2018 Releases by Android Partner Docs <noreply@android.com>
- 196763369 Fix silly image path error by Clay Murphy <claym@google.com>
- 196737640 Add blogs list to community page by Danielle Roberts <daroberts@google.com>
- 196728891 Remove all non-test _freeze.yaml files. These are no long... by Android Partner Docs <noreply@android.com>
- 196702405 Add initial bootloader docs by Clay Murphy <claym@google.com>
- 196680338 Devsite localized content from translation request 54cf9d... by Android Partner Docs <noreply@android.com>

PiperOrigin-RevId: 197600621
Change-Id: Ie86e9b753b488ba9af0f78fd6fcc577425ab3b94
Diffstat (limited to 'en/devices')
-rw-r--r--  en/devices/_toc-interfaces.yaml                    10
-rw-r--r--  en/devices/architecture/hidl/code-style.html       19
-rw-r--r--  en/devices/architecture/hidl/index.html             3
-rw-r--r--  en/devices/bootloader/flashing-updating.html      192
-rw-r--r--  en/devices/bootloader/images/bootloader_slotting.png  bin 0 -> 83894 bytes
-rw-r--r--  en/devices/bootloader/index.html                   46
-rw-r--r--  en/devices/bootloader/partitions-images.html      240
-rw-r--r--  en/devices/bootloader/unlock-trusty.html          188
-rw-r--r--  en/devices/camera/camera3.html                     90
-rw-r--r--  en/devices/camera/camera3_3Amodes.html            773
-rw-r--r--  en/devices/camera/camera3_crop_reprocess.html      29
-rw-r--r--  en/devices/camera/camera3_error_stream.html       154
-rw-r--r--  en/devices/camera/camera3_metadata.html            59
-rw-r--r--  en/devices/camera/camera3_requests_hal.html       419
-rw-r--r--  en/devices/camera/camera3_requests_methods.html   106
-rw-r--r--  en/devices/camera/images/ape_fwk_camera2.png       bin 0 -> 43971 bytes
-rw-r--r--  en/devices/camera/images/camera-hal-overview.png   bin 51184 -> 53489 bytes
-rw-r--r--  en/devices/camera/index.html                      113
-rw-r--r--  en/devices/camera/versioning.html                  76
19 files changed, 1606 insertions, 911 deletions
diff --git a/en/devices/_toc-interfaces.yaml b/en/devices/_toc-interfaces.yaml
index 504b9b72..5f91e99d 100644
--- a/en/devices/_toc-interfaces.yaml
+++ b/en/devices/_toc-interfaces.yaml
@@ -223,6 +223,16 @@ toc:
path: /devices/bluetooth/verifying_debugging
- title: HCI Requirements
path: /devices/bluetooth/hci_requirements
+- title: Bootloader
+ section:
+ - title: Overview
+ path: /devices/bootloader
+ - title: Partitions and Images
+ path: /devices/bootloader/partitions-images
+ - title: Flashing and Updating
+ path: /devices/bootloader/flashing-updating
+ - title: Unlocking and Trusty
+ path: /devices/bootloader/unlock-trusty
- title: Camera
section:
- title: Overview
diff --git a/en/devices/architecture/hidl/code-style.html b/en/devices/architecture/hidl/code-style.html
index 0746ff36..5f787d71 100644
--- a/en/devices/architecture/hidl/code-style.html
+++ b/en/devices/architecture/hidl/code-style.html
@@ -374,14 +374,21 @@ fine.</p>
<li>TODOs</li>
</ul>
</li>
-<li>Use <code>/** */</code> mainly for Function documents/"docstrings" used for
-generated documentation. Example:
+<li>Use <code>/** */</code> for generated documentation. These can be applied
+only to type, method, field, and enum value declarations. Example:
<pre class="prettyprint">
/** Replied status */
-enum FooStatus {
- OK = 0, // no error
- ERR_TRANSPORT = 1, // transport level error
- ERR_ARG = 2 // invalid args
+enum TeleportStatus {
+ /** Object entirely teleported. */
+ OK = 0,
+ /** Methods return this if teleportation is not completed. */
+ ERROR_TELEPORT = 1,
+ /**
+ * Teleportation could not be completed due to an object
+ * obstructing the path.
+ */
+ ERROR_OBJECT = 2,
+ ...
}
</pre>
<li>Multi-line comments should start a new line with <code>/**</code>, use
diff --git a/en/devices/architecture/hidl/index.html b/en/devices/architecture/hidl/index.html
index 5790b78c..35b166dd 100644
--- a/en/devices/architecture/hidl/index.html
+++ b/en/devices/architecture/hidl/index.html
@@ -135,7 +135,8 @@ of <code>=</code> and <code>|</code>) is part of the grammar.</p>
<a href="code-style.html">Code Style Guide</a>.</p>
<ul>
-<li><code>/** */</code> indicates a documentation comment.</li>
+<li><code>/** */</code> indicates a documentation comment. These can be applied
+only to type, method, field, and enum value declarations.</li>
<li><code>/* */</code> indicates a multiline comment.</li>
<li><code>//</code> indicates a comment to end of line. Aside from
<code>//</code>, newlines are the same as any other whitespace. </li>
diff --git a/en/devices/bootloader/flashing-updating.html b/en/devices/bootloader/flashing-updating.html
new file mode 100644
index 00000000..5fdc5aa9
--- /dev/null
+++ b/en/devices/bootloader/flashing-updating.html
@@ -0,0 +1,192 @@
+<html devsite>
+ <head>
+ <title>Flashing, Booting, and Updating</title>
+ <meta name="project_path" value="/_project.yaml" />
+ <meta name="book_path" value="/_book.yaml" />
+ </head>
+ <body>
+ <!--
+ Copyright 2017 The Android Open Source Project
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
+ -->
+ <h2 id="flashing-images">Flashing images</h2>
+<p>
+The <code>flash</code> command should not erase the partition unless the host
+<code>fastboot</code> tool sends an <code>erase</code> command first. This
+allows flashing a very large partition in multiple smaller chunks using multiple
+sparse images that start with a "skip" block to seek over the already-written
+area. Creating these images on the fly is already handled by the
+<code>fastboot</code> host side tool.
+</p>
+<p>
+Sanity checks should be done on radio and bootloader images prior to flashing in
+unlocked mode. For example, compare to <code>android-info.txt</code> created
+from the build and confirm the version matches. Also check bootloader image
+signature at flash time to make sure it will pass validation during boot (which
+may include anti-rollback features).
+</p>
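The version check can be sketched as follows, assuming the conventional `require key=value1|value2` line format of `android-info.txt`; the key and version values here are illustrative, not taken from this document:

```python
# Sketch: validate a device-reported bootloader version against the
# "require key=value1|value2" lines of android-info.txt before flashing.

def parse_android_info(text):
    """Parse 'require key=v1|v2' lines into {key: {allowed values}}."""
    requirements = {}
    for line in text.splitlines():
        line = line.strip()
        if not line.startswith("require "):
            continue
        key, _, values = line[len("require "):].partition("=")
        requirements[key.strip()] = {v.strip() for v in values.split("|")}
    return requirements

def version_allowed(requirements, key, reported):
    """True if the reported version satisfies the requirement (or none exists)."""
    allowed = requirements.get(key)
    return allowed is None or reported in allowed

info = """require board=sailfish
require version-bootloader=1.0.1|1.0.2
"""
reqs = parse_android_info(info)
print(version_allowed(reqs, "version-bootloader", "1.0.2"))  # True
print(version_allowed(reqs, "version-bootloader", "0.9.0"))  # False
```

A real implementation would also verify the bootloader image signature at this point, as noted above.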
+<p>
+On Google-branded devices, flashing to older versions of the bootloader should
+work properly, starting from the first commercially shipped bootloader (and
+ideally from even earlier versions).
+</p>
+<h2 id="booting-kernel-command-line">Booting: kernel command line</h2>
+<p>
+The kernel command line should be concatenated together from the following
+locations:
+</p>
+<ul>
+<li>bootloader command line: set of static and dynamic parameters determined by
+ the bootloader</li>
+<li>device-tree: from the chosen/bootargs node</li>
+<li><code>defconfig</code>: from CONFIG_CMDLINE</li>
+<li><code>boot.img</code>: from cmdline (see
+<a
+ href="https://android.googlesource.com/platform/system/core/+/master/mkbootimg/include/bootimg/bootimg.h"
+ class="external"><code>system/core/mkbootimg/bootimg.h</code></a> for offsets and
+ sizes)</li>
+<li>A canonical reboot or shutdown reason compliant with the <a
+href="/compatibility/cdd">Android Compatibility Definition Document</a> as
+determined via PMIC (power management integrated circuit), other hardware
+resources and reboot magic arguments (<code>LINUX_REBOOT_CMD_RESTART2</code>)
+messaging, to land as:
+<code>androidboot.bootreason=&lt;reason&gt;</code></li>
+
+</ul>
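The concatenation of these fragments, and the extraction of `androidboot.*` properties such as the boot reason, can be sketched as below; the fragment values are illustrative, not from this document:

```python
# Sketch: concatenate kernel command-line fragments from the sources
# listed above and collect androidboot.* properties from the result.

def build_cmdline(*fragments):
    """Join non-empty command-line fragments with single spaces."""
    return " ".join(f.strip() for f in fragments if f and f.strip())

def androidboot_props(cmdline):
    """Collect androidboot.<name>=<value> parameters into a dict."""
    props = {}
    for token in cmdline.split():
        if token.startswith("androidboot.") and "=" in token:
            name, _, value = token.partition("=")
            props[name[len("androidboot."):]] = value
    return props

cmdline = build_cmdline(
    "console=ttyMSM0,115200n8",        # bootloader-provided (illustrative)
    "androidboot.bootreason=reboot",   # derived from PMIC / reboot magic
    "androidboot.slot_suffix=_a",      # A/B slot selection
)
print(androidboot_props(cmdline))
# {'bootreason': 'reboot', 'slot_suffix': '_a'}
```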
+<h2 id="implementing-verified-boot">Implementing verified boot</h2>
+<p>
+Refer to <a
+href="/security/verifiedboot/verified-boot.html">Verifying
+Boot.</a>
+</p>
+<h2 id="supporting-updates">Supporting updates</h2>
+<p>
+To support the Google <a
+href="/devices/tech/ota/">over-the-air</a> (GOTA)
+update process, a recovery RAM disk must be present.
+</p>
+
+<p>
+If the standard AOSP recovery image is being used, during boot the bootloader
+should read the first 32 bytes on the misc partition and boot into the recovery
+image if the data there matches "boot-recovery". This allows any pending
+recovery work (e.g. applying an OTA or performing data removal) to resume
+until it finishes successfully.
+</p>
+<p>
+See <a
+href="https://android.googlesource.com/platform/bootable/recovery/+/master/bootloader_message/include/bootloader_message/bootloader_message.h#64"
+class="external"><code>bootable/recovery/bootloader_message/bootloader_message.h</code></a>
+for details on the contents of the flash block used for communication between
+recovery and the bootloader.
+</p>
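A minimal sketch of that check, assuming only what is stated above (the command field occupies the first 32 bytes of misc and is NUL-padded):

```python
# Sketch: decide whether to boot recovery by inspecting the first
# 32 bytes of the misc partition (the bootloader_message command field).

def should_boot_recovery(misc_bytes):
    """True if the misc command field starts with 'boot-recovery'."""
    command = misc_bytes[:32].split(b"\x00", 1)[0]
    return command == b"boot-recovery"

# The command field is NUL-padded to 32 bytes; the trailing recovery
# arguments shown here are illustrative.
misc = b"boot-recovery".ljust(32, b"\x00") + b"recovery\n--some-arg\n"
print(should_boot_recovery(misc))          # True
print(should_boot_recovery(b"\x00" * 32))  # False
```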
+<h3 id="a-b-updates">A/B updates</h3>
+<p>
+If the OEM chooses to support <a href="/devices/tech/ota/ab/">A/B
+updates</a> for a given device, the bootloader should meet the following
+criteria:
+</p>
+<ul>
+<li>All the partitions that get updated through an OTA should be updatable
+while the main system is booted and not through recovery.
+<li>For A/B updates, the updater will query the <a href="/reference/hidl/android/hardware/boot/1.0/IBootControl">boot control HAL</a>, update the
+boot slot not currently in use, change the active slot through the HAL, and
+reboot into the updated operating system. See <a href="/devices/tech/ota/ab/ab_implement#bootcontrol" class="external">Implementing the boot control HAL</a>.
+<li>All partitions that support A/B will have a suffix appended to their name.
+This differentiates the partitions belonging to a particular slot in the
+bootloader. For each such partition, there is a corresponding variable
+<code>has-slot:&lt;partition base name&gt;</code> with a value of "yes".
+<li>Slots are named alphabetically as a, b, c, etc. corresponding to partitions
+with the suffix _a, _b, _c, etc.
+<li>The bootloader should inform the operating system which slot was booted in
+one of these ways: <ul>
+  <li>DT property:
+<code>/firmware/android/slot_suffix</code>
+  <li>Command line property:
+<code>androidboot.slot_suffix</code>
+</ul>
+<li>The bootloader should support the boot_control HAL
+(<code>hardware/libhardware/include/hardware/boot_control.h</code>).
+<li>To boot <code>/system</code> under A/B, the bootloader should pass <code>ro
+root=/dev/[node] rootwait skip_initramfs init=/init</code> on the kernel command
+line.
+Not passing <code>skip_initramfs</code> boots to recovery.
+<li><code>slot-retry-count</code> is reset to a positive value (usually "3")
+either by the boot control HAL through the <code>setActiveBootSlot</code>
+callback or through the <code>fastboot set_active</code> command.
+<li>When modifying a partition that is part of a slot, the bootloader shall
+clear "successfully booted" and reset the <code>retry_count</code> for the slot
+in question.</li>
+<li>
+The bootloader should also determine which slot to load. See the diagram in
+this section for a depiction of the decision flow; the general steps are:
+</li><ol>
+<li>Determine which slot to attempt. Do not attempt to load a slot marked
+"slot-unbootable". This slot should be consistent with the values returned by
+fastboot, and will be referred to as the current slot from now on.
+<li>If the current slot is not marked <code>slot-successful</code> and
+<code>slot-retry-count</code> is 0, mark the current slot "slot-unbootable" and
+select a different slot that is not marked "unbootable" and is marked
+"slot-successful". This slot is now the selected slot. If no such slot is
+available, boot to recovery or display a meaningful error message to the user.
+<li>Select the appropriate boot.img and include the path to the correct system
+partition on the kernel command line.
+<li>If not booting recovery, add <code>skip_initramfs</code> to the kernel
+command line.
+<li>Populate the DT or command line <code>slot_suffix</code> parameter.
+<li>Boot. If the slot is not marked "slot-successful", decrement
+<code>slot-retry-count</code>.
+<figure id="bootloader-flow">
+ <img src="/devices/bootloader/images/bootloader_slotting.png"
+ width="786"
+ alt="Bootloader slotting flow">
+ <figcaption><b>Figure 1.</b> Bootloader slotting flow</figcaption>
+</figure>
+</li></ol>
+<li>The fastboot utility determines which partition to
+flash when running any of the flashing commands: for example, <code>fastboot
+flash system system.img</code> first queries the
+<code>current-slot</code> variable, then concatenates the result to
+"system" to generate the name of the partition to flash (e.g. system_a
+or system_b).
+<li>When setting the current slot via fastboot <code>set_active</code> or the
+boot control HAL's <code>setActiveBootSlot</code>, the bootloader should update
+the current slot, clear <code>slot-unbootable</code>, clear
+<code>slot-successful</code>, and reset the <code>retry-count</code>. These are
+the only ways to clear <code>slot-unbootable</code>.
+<li>It is the responsibility of the Android framework to call
+<code>markBootSuccessful</code> from the HAL. The bootloader should
+never mark a partition as successfully booted.</li>
+</ul>
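The slot-selection rules above can be sketched with a simplified in-memory slot table; this is an illustration only, not the boot control HAL, and the field names merely mirror the fastboot variables:

```python
# Sketch: simplified slot selection per the flow described above.
# Fields mirror slot-successful, slot-unbootable, slot-retry-count.

class Slot:
    def __init__(self, suffix, successful=False, unbootable=False, retries=3):
        self.suffix = suffix
        self.successful = successful
        self.unbootable = unbootable
        self.retries = retries

def select_slot(slots):
    """Return the slot to boot, or None to fall back to recovery."""
    for slot in slots:
        if slot.unbootable:
            continue                     # never attempt an unbootable slot
        if slot.successful or slot.retries > 0:
            if not slot.successful:
                slot.retries -= 1        # decrement retry count per attempt
            return slot
        slot.unbootable = True           # retries exhausted: mark unbootable
    return None                          # boot recovery / show an error

a = Slot("_a", successful=False, retries=0)  # exhausted its retries
b = Slot("_b", successful=True)
chosen = select_slot([a, b])
print(chosen.suffix)   # _b
print(a.unbootable)    # True
```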
+<h3 id="gota-updates-with-recovery-mode">Non-A/B updates</h3>
+<p><a href="/devices/tech/ota/nonab/">Non-A/B updatable</a> devices should meet
+these criteria to support updates:</p>
+<ul>
+<li>The recovery partition should contain an image that is capable of reading a
+system image from some supported partition (cache, userdata) and writing it
+to the system partition.
+<li>The bootloader should support rebooting directly into recovery mode.
+<li>If radio image updates are supported, the recovery partition should also be
+able to flash the radio. This can be accomplished in one of two ways:
+<ul>
+ <li>The bootloader flashes the radio. In this case, it should be possible to
+reboot from the recovery partition back into the bootloader to complete the
+update.
+ <li>The recovery image flashes the radio. This functionality may be
+provided in the form of a binary library or utility.</li></ul>
+</li>
+</ul>
+</body>
+</html>
diff --git a/en/devices/bootloader/images/bootloader_slotting.png b/en/devices/bootloader/images/bootloader_slotting.png
new file mode 100644
index 00000000..25d4b0f0
--- /dev/null
+++ b/en/devices/bootloader/images/bootloader_slotting.png
Binary files differ
diff --git a/en/devices/bootloader/index.html b/en/devices/bootloader/index.html
new file mode 100644
index 00000000..57705a1b
--- /dev/null
+++ b/en/devices/bootloader/index.html
@@ -0,0 +1,46 @@
+<html devsite>
+ <head>
+ <title>Bootloader</title>
+ <meta name="project_path" value="/_project.yaml" />
+ <meta name="book_path" value="/_book.yaml" />
+ </head>
+ <body>
+ <!--
+ Copyright 2017 The Android Open Source Project
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
+ -->
+
+
+<p>
+A bootloader is a vendor-proprietary image responsible for bringing up the
+kernel on a device. It guards the device state and is responsible for
+initializing the <a href="/security/trusty/">Trusted
+Execution Environment (TEE)</a> and binding its root of trust.
+</p>
+<p>
+The bootloader comprises many components, including the splash screen. To
+start boot, the bootloader may directly flash a new image into an appropriate
+partition or optionally use <code>recovery</code> to start the reflashing
+process, matching how it is done for OTA. Some device manufacturers create
+multi-part bootloaders and then combine them into a single
+<code>bootloader.img</code> file. At flash time, the bootloader extracts the
+individual bootloaders and flashes them all.
+</p>
+<p>
+Most importantly, the bootloader verifies the integrity of the boot and recovery
+partitions before moving execution to the kernel and displays the warnings
+specified in the section <a
+href="/security/verifiedboot/verified-boot#boot_state">Boot state</a>.</p>
+</body>
+</html>
diff --git a/en/devices/bootloader/partitions-images.html b/en/devices/bootloader/partitions-images.html
new file mode 100644
index 00000000..7f171fef
--- /dev/null
+++ b/en/devices/bootloader/partitions-images.html
@@ -0,0 +1,240 @@
+<html devsite>
+ <head>
+ <title>Partitions and Images</title>
+ <meta name="project_path" value="/_project.yaml" />
+ <meta name="book_path" value="/_book.yaml" />
+ </head>
+ <body>
+ <!--
+ Copyright 2017 The Android Open Source Project
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
+ -->
+
+
+<h2 id="partitions">Partitions</h2>
+<p>
+Android devices include several partitions that serve different functions in the
+boot process. To support <a href="/devices/tech/ota/ab/">A/B
+updates</a>, the device will need one slot per partition for
+<code>boot</code>, <code>system</code>, <code>vendor</code>, and
+<code>radio</code>.
+</p>
+<ul>
+<li><strong>boot</strong>: The <code>boot</code> partition contains a kernel
+image and a RAM disk combined via <code>mkbootimg</code>. In order to flash the
+kernel directly without flashing a new boot partition, a virtual partition can
+be used: <ul>
+ <li><strong>kernel</strong>: The virtual <code>kernel</code> partition
+overwrites only the kernel (zImage, zImage-dtb, Image.gz-dtb) by writing the new
+image over the old one. To do this, it determines the start location of the
+existing kernel image in eMMC and copies to that location, keeping in mind that
+the new kernel image may be larger than the existing one. The bootloader can
+either make space by moving any data following it or abandon the operation
+with an error. If the development kernel supplied is incompatible, you may need
+to update the dtb partition if present, or vendor or system partition with
+associated kernel modules.
+ <li><strong>ramdisk</strong>: The virtual <code>ramdisk</code> partition
+overwrites only the RAM disk by writing the new image over the old one. To do
+this, it determines the start location of the existing <code>ramdisk.img</code>
+in eMMC and copies to that location, keeping in mind that the new RAM disk may
+be larger than the existing one. The bootloader can either make space by moving
+any data following it or abandon the operation with an error.</ul>
+<li><strong>system</strong>: The <code>system</code> partition mainly contains
+the Android framework.
+<li><strong>recovery</strong>: The <code>recovery</code> partition stores the
+recovery image, booted during the OTA process. If the device supports <a
+href="/devices/tech/ota/ab/">A/B updates</a>,
+recovery can be a RAM disk contained in the boot image rather than a separate
+image.
+<li><strong>cache</strong>: The <code>cache</code> partition stores temporary
+data and is optional if a device uses A/B updates. The cache partition doesn't
+need to be writable from the bootloader, only erasable. The size depends on the
+device type and the availability of space on userdata; currently, 50MB-100MB
+should be sufficient.
+<li><strong>misc</strong>: The <code>misc</code> partition is used by recovery
+and is 4KB or larger.
+<li><strong>userdata</strong>: The <code>userdata</code> partition contains
+user-installed applications and data, including customization data.
+<li><strong>metadata</strong>: The <code>metadata</code> partition is used when
+the device is encrypted and is 16MB or larger.
+<li><strong>vendor</strong>: The <code>vendor</code> partition contains any
+binary that is not distributable to the Android Open Source Project (AOSP). If
+there is no proprietary information, this partition may be omitted.
+<li><strong>radio</strong>: The <code>radio</code> partition contains the radio
+image. This partition is necessary only for devices that include a radio and
+have radio-specific software in a dedicated partition.
+<li><strong>tos</strong>: The <code>tos</code> partition stores the binary image
+of the Trusty OS and is only used if the device includes Trusty.</li>
+</ul>
+<h2 id="flow">Flow</h2>
+<p>
+Here is how the bootloader operates:
+</p><ol>
+<li>The bootloader gets loaded first.
+<li>The bootloader initializes memory.
+<li>If <a href="/devices/bootloader/flashing-updating#a-b-updates">A/B
+updates</a> are used, determine the current slot to boot.
+<li>Determine whether recovery mode should be booted instead as described in <a
+href="/devices/bootloader/flashing-updating#supporting-updates">Supporting
+updates</a>.
+<li>The bootloader loads the image, which contains the kernel and RAM disk
+(and, in <a href="/devices/architecture/treble">Treble</a>, even more).
+<li>The bootloader starts loading the kernel into memory as a self-executable
+compressed binary.
+<li>The kernel decompresses itself and begins executing in memory.
+<li>From there on, older devices load <code>init</code> from the RAM disk and
+newer devices load it from the <code>/system</code> partition.
+<li>From <code>/system</code>, <code>init</code> launches and starts mounting
+all the other partitions, such as <code>/vendor</code>, <code>/oem</code>, and
+<code>/odm</code>, and then starts executing code to start the device.</li>
+</ol>
+<h2 id="images">Images</h2>
+<p>
+The bootloader relies upon these images.
+</p>
+<h3 id="kernel-images">Kernel images</h3>
+<p>
+Kernel images are created in a standard Linux format, such as zImage, Image, or
+Image.gz. Kernel images can be flashed independently, combined with RAM disk
+images, and flashed to the boot partition or booted from memory. When creating
+kernel images, concatenated device-tree binaries are recommended over using a
+separate partition for the device tree. When using multiple Device Tree Blobs
+(DTBs) for different board revisions, concatenate multiple DTBs in descending
+order of board revision.
+</p>
+<h3 id="ram-disk-images">RAM disk images</h3>
+<p>
+RAM disks should contain a root file system suitable for mounting as a rootfs.
+RAM disk images are combined with kernel images using <code>mkbootimg</code> and then flashed
+into the boot partition.
+</p>
+<h3 id="boot-images">Boot images</h3>
+<p>
+Boot images should contain a kernel and RAM disk combined using an unmodified
+<code>mkbootimg</code>.
+</p>
+<p>
+The <code>mkbootimg</code> implementation can be found at: <code><a
+href="https://android.googlesource.com/platform/system/core/+/master/mkbootimg">system/core/mkbootimg</a></code>
+</p>
+<p>
+The bootloader reads the <code><a
+href="https://android.googlesource.com/platform/system/core/+/master/mkbootimg/bootimg.h">bootimg.h</a></code>
+header file generated by mkbootimg and updates the kernel header to contain the
+correct location and size of the RAM disk in flash, base address of the kernel,
+command line parameters, and more. The bootloader then appends the command line
+specified in the boot image to the end of the bootloader-generated command
+line.
+</p>
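Reading that header can be sketched as below, following the v0 field layout from mkbootimg's `bootimg.h` (magic, eight 32-bit size/address words, two spare words, name, cmdline); the sample values are synthetic, and later header versions add fields after these:

```python
# Sketch: parse the fixed header at the start of boot.img (v0 layout).

import struct

BOOT_MAGIC = b"ANDROID!"
# magic[8], kernel_size, kernel_addr, ramdisk_size, ramdisk_addr,
# second_size, second_addr, tags_addr, page_size, two spare words,
# name[16], cmdline[512]
HEADER = struct.Struct("<8s10I16s512s")

def parse_boot_header(data):
    fields = HEADER.unpack_from(data)
    if fields[0] != BOOT_MAGIC:
        raise ValueError("not a boot image")
    return {
        "kernel_size": fields[1],
        "ramdisk_size": fields[3],
        "page_size": fields[8],
        "cmdline": fields[12].rstrip(b"\x00").decode(),
    }

# Build a minimal synthetic header just to demonstrate parsing.
raw = struct.pack("<8s10I16s512s", BOOT_MAGIC,
                  0x800000, 0x10008000, 0x200000, 0x11000000,
                  0, 0, 0x10000100, 2048, 0, 0,
                  b"", b"console=ttyMSM0")
hdr = parse_boot_header(raw)
print(hdr["page_size"], hdr["cmdline"])   # 2048 console=ttyMSM0
```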
+<h3 id="file-system-images-system-userdata-recovery">File system images (system,
+userdata, recovery)</h3>
+<h4 id="yaffs2-image-format">YAFFS2 image format</h4>
+<p>
+If using raw NAND storage, these images must be YAFFS2, generated by an unmodified mkyaffs2image, as found in the Android Open Source Project (AOSP) at <a href="https://android.googlesource.com/platform/external/yaffs2/+/master/yaffs2/utils/" class="external"><code>external/yaffs2/yaffs2/utils</code></a>. They have the format:</p>
+<pre>
+<code>
+| 2k bytes of data | yaffs extra data | padding  |
+| 0           2048 | 0             64 | variable |
+</code>
+</pre>
+<p>
+The bootloader is responsible for consuming these images and relocating the
+yaffs extra data into the appropriate location in the out-of-band area for the
+given NAND hardware. If software ECC is required, the bootloader should also
+do that computation at this time.
+</p>
+<h4 id="sparse-image-format">Sparse image format</h4>
+<p>
+The sparse image format should be supported. It is described in the document
+"ext4 compressed images" and in <code><a
+href="https://android.googlesource.com/platform/system/core/+/master/libsparse/sparse_format.h">system/core/libsparse/sparse_format.h</a></code>;
+it is implemented in: <code><a
+href="https://android.googlesource.com/platform/system/core/+/master/libsparse/sparse_read.cpp">system/core/libsparse/sparse_read.cpp</a></code>
+</p>
+<p>
+If using a block-based storage device, ext4 or f2fs should be supported. To
+quickly transfer and flash large, empty ext4 file systems (userdata), store the
+image in a sparse format that contains information about which areas of the
+file system can be left unwritten. The file format is written by the
+<code>mke2fs</code> utility (which is also used to create the images) and is
+read and flashed by the bootloader. See the sections below for the format's
+attributes:
+</p>
+
+<h5 id="file-format">File format</h5>
+<ul>
+<li>All fields are unsigned little-endian
+<li>The file contains a file header, followed by a series of chunks
+<li>The file header, chunk header, and chunk data are all multiples of 4 bytes
+long</li></ul>
+<h5 id="header">Header</h5>
+<ul>
+<li>32-bit magic: 0xed26ff3a
+<li>16-bit major version (0x1) - reject images with higher major versions
+<li>16-bit minor version (0x0) - allow images with higher minor versions
+<li>16-bit file header size in bytes (28 in v1.0)
+<li>16-bit chunk header size in bytes (12 in v1.0)
+<li>32-bit block size in bytes, must be multiple of 4
+<li>32-bit total blocks in output file
+<li>32-bit total chunks in input file
+<li>32-bit CRC32 checksum of original data, counting "don't care" as 0
+(standard 802.3 polynomial; use a public domain table implementation)</li></ul>
+<h5 id="chunk">Chunk</h5>
+<ul>
+<li>16-bit chunk type:
+ <ul>
+ <li>0xCAC1 raw
+ <li>0xCAC2 fill
+ <li>0xCAC3 don't care
+ </ul>
+<li>16 bits reserved (write as 0, ignore on read)
+<li>32-bit chunk size in blocks in output image
+<li>32-bit total size in bytes of chunk input file including chunk header and
+data</li></ul>
+<h5 id="data">Data</h5>
+<ul>
+<li>For raw chunks: raw data, (chunk size in blocks) &times; (block size) bytes
+<li>For fill chunks: 4 bytes of fill data</li></ul>
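The header and chunk layout above can be exercised with a small parser sketch; the constants come from `sparse_format.h`, and the one-chunk sample image here is synthetic:

```python
# Sketch: walk a sparse image per the 28-byte file header and 12-byte
# chunk headers described above, validating the block totals.

import struct

SPARSE_MAGIC = 0xed26ff3a
CHUNK_RAW, CHUNK_FILL, CHUNK_DONT_CARE = 0xCAC1, 0xCAC2, 0xCAC3

def walk_sparse(data):
    """Yield (chunk_type, blocks) for each chunk; check block totals."""
    (magic, major, minor, hdr_sz, chunk_hdr_sz,
     block_sz, total_blocks, total_chunks, _crc) = struct.unpack_from(
        "<IHHHHIIII", data, 0)
    assert magic == SPARSE_MAGIC and major == 1   # reject higher majors
    offset, seen_blocks = hdr_sz, 0
    for _ in range(total_chunks):
        ctype, _rsvd, blocks, total_sz = struct.unpack_from(
            "<HHII", data, offset)
        seen_blocks += blocks
        yield ctype, blocks
        offset += total_sz    # total size covers chunk header + data
    assert seen_blocks == total_blocks

# One "don't care" chunk covering 8 blocks of 4096 bytes (no data payload).
header = struct.pack("<IHHHHIIII", SPARSE_MAGIC, 1, 0, 28, 12, 4096, 8, 1, 0)
chunk = struct.pack("<HHII", CHUNK_DONT_CARE, 0, 8, 12)
print(list(walk_sparse(header + chunk)))
```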
+<h5 id="implementing-the-writer">Implementing the writer</h5>
+<p>
+The <code>mke2fs</code> utility already knows what areas of the image need
+to be written, and will encode "don't care" chunks between them. Another tool,
+<code>img2simg</code>, will convert regular (non-sparse) images to sparse
+images. Regular images have no information about "don't care" areas; the best a
+conversion can do is look for blocks of repeated data to reduce the resulting
+image size.
+</p>
+<h5 id="implementing-the-reader">Implementing the reader</h5>
+<p>
+Readers should reject images with unknown major versions and should accept images
+with unknown minor versions. Readers may reject images with chunk sizes they do
+not support.
+</p>
+<p>
+Once the major version is validated, the reader should ignore chunks with unknown
+type fields. It should skip over the chunk in the file using the "chunk size in
+file" and skip "chunk size in blocks" blocks on the output.
+</p>
+<p>
+A cyclic redundancy check (802.3 CRC32) should be calculated for the data that
+will be written to disk. Any area that is not written (don't care, or a skipped
+chunk), should be counted as 0s in the CRC. The total number of blocks written
+or skipped should be compared against the "total blocks" field in the header.
+The tool <code>simg2img</code> will convert the sparse image format to a
+standard image, which will lose the sparse information.
+</p>
+</body>
+</html>
diff --git a/en/devices/bootloader/unlock-trusty.html b/en/devices/bootloader/unlock-trusty.html
new file mode 100644
index 00000000..45d48636
--- /dev/null
+++ b/en/devices/bootloader/unlock-trusty.html
@@ -0,0 +1,188 @@
+<html devsite>
+ <head>
+ <title>Using the Bootloader</title>
+ <meta name="project_path" value="/_project.yaml" />
+ <meta name="book_path" value="/_book.yaml" />
+ </head>
+ <body>
+ <!--
+ Copyright 2017 The Android Open Source Project
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
+ -->
+
+
+<h2 id="unlock">Unlocking and Trusty</h2>
+<h3 id="recommendations">Recommendations</h3>
+<p>
+All Google-branded devices should be made unlockable so that all partitions
+listed above can be reflashed. This unlocked mode is set with <code>fastboot
+flashing unlock</code>; once set, the mode should persist across reboots.
+</p>
+<p>
+Devices should deny the <code>fastboot flashing unlock</code> command
+unless <code>fastboot flashing get_unlock_ability</code> returns "1". If
+<code>get_unlock_ability</code> is "0", the user needs to boot to the home
+screen, go into the <em>Settings > System > <a
+href="https://developer.android.com/studio/debug/dev-options.html">Developer
+options</a></em> menu, and enable the <strong>OEM unlocking</strong> option to
+set <code>unlock_ability</code> to "1". That flag should persist across
+reboots and across factory data resets.
+</p>
+<p>
+When the <code>fastboot flashing unlock</code> command is sent, the device should
+prompt users to warn them that they may encounter problems with unofficial
+images. After the user acknowledges the warning, the device should perform a
+factory data reset to prevent unauthorized data access. The bootloader should
+reset the device even if it cannot reformat it properly. Only after the reset
+can the persistent flag be set so that the device can be reflashed.
+</p>
+<p>
+The <code>fastboot flashing lock</code> command relocks and resets the device so
+that future flash/unlock attempts require another data reset.
+</p>
+<p>
+All RAM not already overwritten should be reset during the <code>fastboot flashing
+unlock</code> process. This measure prevents attacks that read leftover RAM
+contents from the previous boot. Similarly, unlocked devices should clear RAM at
+every boot if this does not create an unacceptable delay, but should leave
+intact the region used for the kernel's <a
+href="https://www.kernel.org/doc/html/v4.12/admin-guide/ramoops.html"
+class="external"><code>ramoops</code></a>.
+</p>
+<p>
+Devices intended for retail should be shipped in the locked state (and with
+<code>get_unlock_ability</code> returning "0"). This ensures that an attacker
+cannot compromise the device by installing a custom system or boot image.
+</p>
+<h3 id="properties">Properties</h3>
+<p>
+The <code>ro.oem_unlock_supported</code> property should be set at build time
+based on whether the device supports flashing unlock.
+<code>ro.oem_unlock_supported</code> should be set to "0" if flashing unlock is
+not supported on the device or "1" if flashing unlock is supported.
+</p>
+<p>
+If the device supports flashing unlock (i.e. <code>ro.oem_unlock_supported =
+1</code>), then the bootloader should indicate the lock status by setting the
+kernel command line variable <code>androidboot.flash.locked</code> (or the
+<code>/firmware/android/flash.locked</code> DT property) to "1" if locked or "0"
+if unlocked.
+</p>
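A bootloader-side sketch of how the lock status might be reported to the kernel follows. The function name and buffer handling are illustrative only, not code from any actual bootloader:

```c
#include <stdio.h>
#include <string.h>

/* Hypothetical sketch: before booting the kernel, the bootloader appends
 * androidboot.flash.locked to the command line based on its lock state. */
static void append_flash_locked(char *cmdline, size_t size, int locked)
{
    size_t len = strlen(cmdline);
    /* snprintf guards against overrunning the command-line buffer. */
    snprintf(cmdline + len, size - len, " androidboot.flash.locked=%d",
             locked ? 1 : 0);
}
```

Devices using device tree would instead populate the <code>/firmware/android/flash.locked</code> property with the same value.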
+<p>
+<strong>Note:</strong> For devices that support <a
+href="/security/verifiedboot/dm-verity">dm-verity</a>,
+you can instead use <code>ro.boot.verifiedbootstate</code> to set the value of
+<code>ro.boot.flash.locked</code>; the value is "0" (unlocked) if the
+verified boot state is orange.
+</p>
+<h3 id="flashing-lock-unlock_critical">Flashing lock/unlock_critical</h3>
+<p>
+The device should support locking and unlocking of critical sections. Those
+critical sections are defined as whatever is needed to boot the device into the
+bootloader. This might include fuses, virtual partitions for a sensor hub,
+first-stage bootloader, and more.
+</p>
+<p>
+Locking of critical sections is defined as preventing any code (kernel, recovery
+image, OTA code, etc.) running on the device from deliberately modifying any
+critical section. This implies that OTAs should fail to update critical sections
+if the device is in lock critical state. Transitioning from locked to unlocked
+state should require a physical interaction with the device.
+</p>
+<p>
+The physical interaction is similar to what <code>fastboot flashing
+unlock</code> would cause: the user would have to press some physical buttons on the
+device. The design should not allow programmatically transitioning from
+<code>lock critical</code> to <code>unlock critical</code> without physical
+interaction. Devices should ship in the <code>unlock critical</code> state.
+</p>
+<h2 id="designation-of-critical-partitions-data">Designation of critical
+partitions/data</h2>
+<p>
+Any partition or data needed for the device to run must be either:
+</p><ul>
+<li>Re-flashable: re-buildable, provided, or extractable via some
+<code>fastboot oem</code> command</li>
+<li>Fully protected (i.e., considered critical per the previous
+section)</li></ul>
+<p>
+This includes per-device factory-specific settings, serial numbers, calibration
+data, etc.
+</p>
+<h2 id="off-mode-charging">Off-mode charging</h2>
+<p>
+If a device supports "off-mode charging" or otherwise autoboots into a
+special mode when power is applied, <code>fastboot oem off-mode-charge 0</code>
+should bypass these special modes and boot as if the user had pressed the power
+button.
+</p>
+<h2 id="bootloader-for-trusty">Bootloader for Trusty</h2>
+<p>
+<a href="/security/trusty/">Trusty</a> is Google's
+implementation of a Trusted Execution Environment (TEE) OS that runs alongside
+Android. This is the specification for devices using ARM
+TrustZone<sup>TM</sup> technology to provide a TEE.
+</p>
+<p>
+If Trusty is used as the secure OS solution on your ARM device, the bootloader
+should be implemented as described within the following sections.
+</p>
+<h3 id="initialization">Initialization</h3>
+<p>
+To load and initialize the Trusty OS (TOS), a bootloader should:
+</p><ul>
+<li>Set up and configure all available RAM
+<li>Initialize at least one serial port
+<li>Verify signature of TOS image
+<li>Load TOS into RAM (execution from flash or TCM is not supported)
+<li>Jump to the first instruction in the TOS image after setting up the state
+and registers as described in the next section</li></ul>
+<h3 id="calling-into-tos-image">Calling into TOS image</h3>
+<p>
+The following state should be configured at entry:
+</p><ul>
+<li>MMU turned off
+<li>Data cache flushed and turned off (instruction cache can be on or off)
+<li>All interrupts (IRQs and FIQs) disabled
+<li>CPU in SVC mode on ARM v7 and EL3 on ARM v8
+<li>Registers in the following state: <ul>
+ <li>r0/x0: size of memory allocated to TOS.
+ <li>r1/x1: physical address of a contiguous block of memory that contains
+platform-specific boot parameters. The layout of this block is platform
+specific.
+ <li>r2/x2: size of the above block of memory.
+ <li>r14/x30: return address to jump to (in non-secure mode) after TOS
+initializes.</li> </ul>
+</li> </ul>
+<p>
+<strong>Note: </strong>r0-r3/x0-x3 also serve as scratch registers to TOS. Do
+not expect their values to be preserved upon return.
+</p>
+<p>
+On a 64-bit platform:
+</p><ul>
+<li>Only w0-w2 are used for parameters, so x0-x2 should contain only 32-bit
+values.
+<li>x30 can contain a 64-bit value.
+<li>The value in x0 when added to the base address of TOS entry-point should
+result in a 32-bit value. The same applies to the size in register x2 when added
+to the address of the boot parameter block in x1.</li></ul>
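A bootloader might validate these constraints before jumping to the TOS entry point. This is a sketch under assumed names; it is not part of any specified interface:

```c
#include <stdint.h>

/* Hypothetical pre-jump check for the 64-bit constraints above: the TOS
 * memory size (x0) added to the entry-point base, and the parameter-block
 * size (x2) added to its address (x1), must each still fit in 32 bits. */
static int tos_args_valid(uint64_t entry_base, uint32_t mem_size,
                          uint64_t params_base, uint32_t params_size)
{
    return (entry_base + (uint64_t)mem_size) <= UINT32_MAX &&
           (params_base + (uint64_t)params_size) <= UINT32_MAX;
}
```

If either sum overflows 32 bits, the bootloader should refuse to jump rather than pass truncated values in w0-w2.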
+<h3 id="return-from-tos">Return from TOS</h3>
+<p>
+TOS will return to the bootloader in non-secure mode (SCR.NS set to "1") when it
+is done initializing so that the bootloader may continue loading the primary
+operating system (e.g., Android).
+</p>
+</body>
+</html>
diff --git a/en/devices/camera/camera3.html b/en/devices/camera/camera3.html
index 7115722e..bb2e6833 100644
--- a/en/devices/camera/camera3.html
+++ b/en/devices/camera/camera3.html
@@ -22,57 +22,18 @@
-->
-
<p>
Android's camera Hardware Abstraction Layer (HAL) connects the higher level
camera framework APIs in
-<a href="http://developer.android.com/reference/android/hardware/Camera.html">android.hardware.Camera</a>
-to your underlying camera driver and hardware. Android 5.0 introduced a new,
-underlying implementation of the camera stack. If you have previously developed
-a camera HAL module and driver for older versions of Android, be aware of
-significant changes in the camera pipeline.</p>
-
-<p class="note"><strong>Note:</strong> The new camera HAL is in active
-development and can change at any time. This document describes the high-level
-design of the camera subsystem; for details, see
-<a href="/devices/camera/versioning.html">Camera Version Support</a>.</p>
-
-<h2 id="overview">Camera HAL1 overview</h2>
-
-<p>Version 1 of the camera subsystem was designed as a black box with high-level
-controls and the following three operating modes:</p>
-
-<ul>
-<li>Preview</li>
-<li>Video Record</li>
-<li>Still Capture</li>
-</ul>
-
-<p>Each mode has slightly different and overlapping capabilities. This made it
-hard to implement new types of features, such as burst mode, since it would fall
-between two of these modes.</p>
-
-<img src="images/camera_block.png" alt="Camera block diagram" id="figure1" />
-<p class="img-caption"><strong>Figure 1.</strong> Camera components</p>
-
-<p>Android 7.0 continues to support camera HAL1 as many devices still rely on
-it. In addition, the Android camera service supports implementing both HALs (1
-and 3), which is useful when you want to support a less-capable front-facing
-camera with camera HAL1 and a more advanced back-facing camera with camera
-HAL3.</p>
-
-<p class="note"><strong>Note:</strong> Camera HAL2 is not supported as it was a
-temporary step on the way to camera HAL3.</p>
-
-<p>There is a single camera HAL <em>module</em> (with its own
-<a href="/devices/camera/versioning.html#module_version">version
-number</a>), which lists multiple independent camera devices that each have
-their own version number. Camera module 2 or newer is required to support
-devices 2 or newer, and such camera modules can have a mix of camera device
-versions (this is what we mean when we say Android supports implementing both
-HALs).</p>
+<a href="https://developer.android.com/reference/android/hardware/camera2/package-summary">android.hardware.camera2</a>
+to your underlying camera driver and hardware. Android 8.0 introduced
+<a href="/devices/architecture/treble">Treble</a>, switching the camera HAL API
+to a stable interface defined by the HAL Interface Description Language (HIDL).
+If you have previously developed a camera
+HAL module and driver for older versions of Android, be aware of significant
+changes in the camera pipeline.</p>
-<h2 id="v3-enhance">Camera HAL3 enhancements</h2>
+<h2 id="v3-enhance">Camera HAL3 features</h2>
<p>The aim of the Android Camera API redesign is to substantially increase the
ability of applications to control the camera subsystem on Android devices while
@@ -125,5 +86,40 @@ have priority over repeating requests.</p>
<img src="images/camera_simple_model.png" alt="Camera data model" id="figure2" />
<p class="img-caption"><strong>Figure 2.</strong> Camera core operation model</p>
+<h2 id="overview">Camera HAL1 overview</h2>
+
+<aside class="note"><strong>Note:</strong> Camera HAL1 has been deprecated. New
+devices should use Camera HAL3.</aside>
+
+<p>Version 1 of the camera subsystem was designed as a black box with high-level
+controls and the following three operating modes:</p>
+
+<ul>
+<li>Preview</li>
+<li>Video Record</li>
+<li>Still Capture</li>
+</ul>
+
+<p>Each mode has slightly different and overlapping capabilities. This made it
+hard to implement new features such as burst mode, which falls between two of
+the operating modes.</p>
+
+<img src="images/camera_block.png" alt="Camera block diagram" id="figure1" />
+<p class="img-caption"><strong>Figure 1.</strong> Camera components</p>
+
+<p>Android 7.0 continues to support camera HAL1 as many devices still rely on
+it. In addition, the Android camera service supports implementing both HALs (1
+and 3), which is useful when you want to support a less-capable front-facing
+camera with camera HAL1 and a more advanced back-facing camera with camera
+HAL3.</p>
+
+<p>There is a single camera HAL <em>module</em> (with its own
+<a href="/devices/camera/versioning.html#module_version">version
+number</a>), which lists multiple independent camera devices that each have
+their own version number. Camera module 2 or newer is required to support
+devices 2 or newer, and such camera modules can have a mix of camera device
+versions (this is what we mean when we say Android supports implementing both
+HALs).</p>
+
</body>
</html>
diff --git a/en/devices/camera/camera3_3Amodes.html b/en/devices/camera/camera3_3Amodes.html
index 7849d398..ab3a5b59 100644
--- a/en/devices/camera/camera3_3Amodes.html
+++ b/en/devices/camera/camera3_3Amodes.html
@@ -24,240 +24,422 @@
<p>
- While the actual 3A algorithms are up to the HAL implementation, a high-level
- state machine description is defined by the HAL interface to allow the HAL
- device and the framework to communicate about the current state of 3A and
+ While the actual 3A algorithms are up to the HAL implementation, a high-level
+ state machine description is defined by the HAL interface to allow the HAL
+ device and the framework to communicate about the current state of 3A and
trigger 3A events.</p>
-<p>When the device is opened, all the individual 3A states must be STATE_INACTIVE.
- Stream configuration does not reset 3A. For example, locked focus must be
- maintained across the configure() call.</p>
-<p>Triggering a 3A action involves simply setting the relevant trigger entry in the
- settings for the next request to indicate start of trigger. For example, the
- trigger for starting an autofocus scan is setting the entry
- ANDROID_CONTROL_AF_TRIGGER to ANDROID_CONTROL_AF_TRIGGER_START for one request;
- and cancelling an autofocus scan is triggered by setting
- ANDROID_CONTROL_AF_TRIGGER to ANDROID_CONTRL_AF_TRIGGER_CANCEL. Otherwise, the
- entry will not exist or be set to ANDROID_CONTROL_AF_TRIGGER_IDLE. Each request
- with a trigger entry set to a non-IDLE value will be treated as an independent
- triggering event.</p>
-<p>At the top level, 3A is controlled by the ANDROID_CONTROL_MODE setting. It
- selects between no 3A (ANDROID_CONTROL_MODE_OFF), normal AUTO mode
- (ANDROID_CONTROL_MODE_AUTO), and using the scene mode setting
+<p>When the device is opened, all the individual 3A states must be
+ STATE_INACTIVE.
+ Stream configuration does not reset 3A. For example, locked focus must be
+ maintained across the <code>configure()</code> call.</p>
+<p>Triggering a 3A action involves simply setting the relevant trigger entry in
+ the settings for the next request to indicate start of trigger. For example,
+ the trigger for starting an autofocus scan is setting the entry
+ ANDROID_CONTROL_AF_TRIGGER to ANDROID_CONTROL_AF_TRIGGER_START for one request;
+ and cancelling an autofocus scan is triggered by setting
+ ANDROID_CONTROL_AF_TRIGGER to ANDROID_CONTROL_AF_TRIGGER_CANCEL. Otherwise, the
+ entry will not exist or be set to ANDROID_CONTROL_AF_TRIGGER_IDLE. Each
+ request with a trigger entry set to a non-IDLE value will be treated as an
+ independent triggering event.</p>
+<p>At the top level, 3A is controlled by the ANDROID_CONTROL_MODE setting. It
+ selects between no 3A (ANDROID_CONTROL_MODE_OFF), normal AUTO mode
+ (ANDROID_CONTROL_MODE_AUTO), and using the scene mode setting
(ANDROID_CONTROL_USE_SCENE_MODE):</p>
<ul>
- <li>In OFF mode, each of the individual Auto-focus(AF), auto-exposure (AE), and
- auto-whitebalance (AWB) modes are effectively OFF, and none of the capture
- controls may be overridden by the 3A routines.</li>
- <li>In AUTO mode, AF, AE, and AWB modes all run their own independent algorithms,
- and have their own mode, state, and trigger metadata entries, as listed in the
- next section.</li>
- <li>In USE_SCENE_MODE, the value of the ANDROID_CONTROL_SCENE_MODE entry must be
- used to determine the behavior of 3A routines. In SCENE_MODEs other than
- FACE_PRIORITY, the HAL must override the values of
- ANDROID_CONTROL_AE/AWB/AF_MODE to be the mode it prefers for the selected
- SCENE_MODE. For example, the HAL may prefer SCENE_MODE_NIGHT to use
- CONTINUOUS_FOCUS AF mode. Any user selection of AE/AWB/AF_MODE when scene must
- be ignored for these scene modes.</li>
- <li>For SCENE_MODE_FACE_PRIORITY, the AE/AWB/AFMODE controls work as in
- ANDROID_CONTROL_MODE_AUTO, but the 3A routines must bias toward metering and
+ <li>In OFF mode, each of the individual auto-focus (AF), auto-exposure (AE),
+ and auto-whitebalance (AWB) modes are effectively OFF, and none of the
+ capture controls may be overridden by the 3A routines.</li>
+ <li>In AUTO mode, AF, AE, and AWB modes all run their own independent
+ algorithms, and have their own mode, state, and trigger metadata entries,
+ as listed in the next section.</li>
+ <li>In USE_SCENE_MODE, the value of the ANDROID_CONTROL_SCENE_MODE entry must
+ be used to determine the behavior of 3A routines. In SCENE_MODEs other than
+ FACE_PRIORITY, the HAL must override the values of
+ ANDROID_CONTROL_AE/AWB/AF_MODE to be the mode it prefers for the selected
+ SCENE_MODE. For example, the HAL may prefer SCENE_MODE_NIGHT to use
+ CONTINUOUS_FOCUS AF mode. Any user selection of AE/AWB/AF_MODE must be
+ ignored for these scene modes.</li>
+ <li>For SCENE_MODE_FACE_PRIORITY, the AE/AWB/AF_MODE controls work as in
+ ANDROID_CONTROL_MODE_AUTO, but the 3A routines must bias toward metering and
focusing on any detected faces in the scene.</li>
</ul>
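The top-level dispatch described by the list above can be sketched as follows. The enum and function names are illustrative, not the actual camera metadata symbols:

```c
/* Illustrative sketch of top-level 3A control; these are not the real
 * ANDROID_CONTROL_* enum values. */
enum control_mode { MODE_OFF, MODE_AUTO, MODE_USE_SCENE_MODE };
enum scene_mode   { SCENE_FACE_PRIORITY, SCENE_NIGHT /* ... */ };

/* Returns 1 when the 3A routines may override capture controls;
 * in OFF mode AF, AE, and AWB are all effectively disabled. */
static int threea_may_override(enum control_mode mode)
{
    return mode != MODE_OFF;
}

/* Returns 1 when the HAL must replace AE/AWB/AF_MODE with its own
 * preferred modes for the selected scene mode (all scene modes
 * except FACE_PRIORITY). */
static int hal_overrides_modes(enum control_mode mode, enum scene_mode scene)
{
    return mode == MODE_USE_SCENE_MODE && scene != SCENE_FACE_PRIORITY;
}
```

FACE_PRIORITY behaves like AUTO mode except that metering and focusing are biased toward detected faces, so user-selected AE/AWB/AF modes remain in effect there.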
+
<h2 id="auto-focus">Auto-focus settings and result entries</h2>
-<p>Main metadata entries:<br/>
- ANDROID_CONTROL_AF_MODE: Control for selecting the current autofocus mode. Set
- by the framework in the request settings.<br/>
- AF_MODE_OFF: AF is disabled; the framework/app directly controls lens position.<br/>
- AF_MODE_AUTO: Single-sweep autofocus. No lens movement unless AF is triggered.<br/>
- AF_MODE_MACRO: Single-sweep up-close autofocus. No lens movement unless AF is
- triggered.<br/>
- AF_MODE_CONTINUOUS_VIDEO: Smooth continuous focusing, for recording video.
- Triggering immediately locks focus in current position. Canceling resumes
- continuous focusing.<br/>
- AF_MODE_CONTINUOUS_PICTURE: Fast continuous focusing, for zero-shutter-lag still
- capture. Triggering locks focus once currently active sweep concludes. Canceling
- resumes continuous focusing.<br/>
- AF_MODE_EDOF: Advanced extended depth of field focusing. There is no autofocus
- scan, so triggering one or canceling one has no effect. Images are focused
- automatically by the HAL.<br/>
- ANDROID_CONTROL_AF_STATE: Dynamic metadata describing the current AF algorithm
- state, reported by the HAL in the result metadata.<br/>
- AF_STATE_INACTIVE: No focusing has been done, or algorithm was reset. Lens is
- not moving. Always the state for MODE_OFF or MODE_EDOF. When the device is
- opened, it must start in this state.<br/>
- AF_STATE_PASSIVE_SCAN: A continuous focus algorithm is currently scanning for
- good focus. The lens is moving.<br/>
- AF_STATE_PASSIVE_FOCUSED: A continuous focus algorithm believes it is well
- focused. The lens is not moving. The HAL may spontaneously leave this state.<br/>
- AF_STATE_PASSIVE_UNFOCUSED: A continuous focus algorithm believes it is not well
- focused. The lens is not moving. The HAL may spontaneously leave this state.<br/>
- AF_STATE_ACTIVE_SCAN: A scan triggered by the user is underway.<br/>
- AF_STATE_FOCUSED_LOCKED: The AF algorithm believes it is focused. The lens is
- not moving.<br/>
- AF_STATE_NOT_FOCUSED_LOCKED: The AF algorithm has been unable to focus. The lens
- is not moving.<br/>
- ANDROID_CONTROL_AFTRIGGER: Control for starting an autofocus scan, the meaning
- of which depends on mode and state. Set by the framework in the request
- settings.<br/>
- AF_TRIGGER_IDLE: No current trigger.<br/>
- AF_TRIGGER_START: Trigger start of AF scan. Effect depends on mode and state.<br/>
- AF_TRIGGER_CANCEL: Cancel current AF scan if any, and reset algorithm to
- default.<br/>
- Additional metadata entries:<br/>
- ANDROID_CONTROL_AF_REGIONS: Control for selecting the regions of the field of
- view (FOV) that should be used to determine good focus. This applies to all AF
- modes that scan for focus. Set by the framework in the request settings.</p>
+
+<table>
+ <tr>
+ <th colspan="2">Main metadata entries</th>
+ </tr>
+ <tr class="alt">
+ <td>ANDROID_CONTROL_AF_MODE</td>
+ <td>Control for selecting the current autofocus mode. Set by the framework
+ in the request settings.</td>
+ </tr>
+ <tr>
+ <td>AF_MODE_OFF</td>
+ <td>AF is disabled; the framework/app directly controls lens position.</td>
+ </tr>
+ <tr>
+ <td>AF_MODE_AUTO</td>
+ <td>Single-sweep autofocus. No lens movement unless AF is triggered.</td>
+ </tr>
+ <tr>
+ <td>AF_MODE_MACRO</td>
+ <td>Single-sweep up-close autofocus. No lens movement unless AF is triggered.</td>
+ </tr>
+ <tr>
+ <td>AF_MODE_CONTINUOUS_VIDEO</td>
+ <td>Smooth continuous focusing, for recording video. Triggering immediately
+ locks focus in current position. Canceling resumes continuous focusing.</td>
+ </tr>
+ <tr>
+ <td>AF_MODE_CONTINUOUS_PICTURE</td>
+ <td>Fast continuous focusing, for zero-shutter-lag still capture. Triggering
+ locks focus once currently active sweep concludes. Canceling resumes
+ continuous focusing.</td>
+ </tr>
+ <tr>
+ <td>AF_MODE_EDOF</td>
+ <td>Advanced extended depth of field focusing. There is no autofocus scan,
+ so triggering one or canceling one has no effect. Images are focused
+ automatically by the HAL.</td>
+ </tr>
+ <tr class="alt">
+ <td>ANDROID_CONTROL_AF_STATE</td>
+ <td>Dynamic metadata describing the current AF algorithm state, reported
+ by the HAL in the result metadata.</td>
+ </tr>
+ <tr>
+ <td>AF_STATE_INACTIVE</td>
+ <td>No focusing has been done, or algorithm was reset. Lens is not moving.
+ Always the state for MODE_OFF or MODE_EDOF. When the device is opened,
+ it must start in this state.</td>
+ </tr>
+ <tr>
+ <td>AF_STATE_PASSIVE_SCAN</td>
+ <td>A continuous focus algorithm is currently scanning for good focus.
+ The lens is moving.</td>
+ </tr>
+ <tr>
+ <td>AF_STATE_PASSIVE_FOCUSED</td>
+ <td>A continuous focus algorithm believes it is well focused. The lens
+ is not moving. The HAL may spontaneously leave this state.</td>
+ </tr>
+ <tr>
+ <td>AF_STATE_PASSIVE_UNFOCUSED</td>
+ <td>A continuous focus algorithm believes it is not well focused. The lens
+ is not moving. The HAL may spontaneously leave this state.</td>
+ </tr>
+ <tr>
+ <td>AF_STATE_ACTIVE_SCAN</td>
+ <td>A scan triggered by the user is underway.</td>
+ </tr>
+ <tr>
+ <td>AF_STATE_FOCUSED_LOCKED</td>
+ <td>The AF algorithm believes it is focused. The lens is not moving.</td>
+ </tr>
+ <tr>
+ <td>AF_STATE_NOT_FOCUSED_LOCKED</td>
+ <td>The AF algorithm has been unable to focus. The lens is not moving.</td>
+ </tr>
+ <tr class="alt">
+ <td>ANDROID_CONTROL_AF_TRIGGER</td>
+ <td>Control for starting an autofocus scan, the meaning of which depends on
+ mode and state. Set by the framework in the request settings.</td>
+ </tr>
+ <tr>
+ <td>AF_TRIGGER_IDLE</td>
+ <td>No current trigger.</td>
+ </tr>
+ <tr>
+ <td>AF_TRIGGER_START</td>
+ <td>Trigger start of AF scan. Effect depends on mode and state.</td>
+ </tr>
+ <tr>
+ <td>AF_TRIGGER_CANCEL</td>
+ <td>Cancel current AF scan if any, and reset algorithm to default.</td>
+ </tr>
+</table>
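A simplified sketch of how these trigger entries drive the AF state, for the single-sweep AF_MODE_AUTO case, is shown below. The enum and function names are illustrative; the full state machines appear in the tables later in this document:

```c
/* Simplified AF_MODE_AUTO trigger handling based on the entries above;
 * the real HAL state machine has additional transitions (e.g. scan
 * completion moving to FOCUSED_LOCKED or NOT_FOCUSED_LOCKED). */
enum af_state   { AF_INACTIVE, AF_ACTIVE_SCAN,
                  AF_FOCUSED_LOCKED, AF_NOT_FOCUSED_LOCKED };
enum af_trigger { TRIGGER_IDLE, TRIGGER_START, TRIGGER_CANCEL };

static enum af_state af_auto_on_trigger(enum af_state s, enum af_trigger t)
{
    switch (t) {
    case TRIGGER_CANCEL:
        return AF_INACTIVE;    /* cancel any scan, reset to default */
    case TRIGGER_START:
        return AF_ACTIVE_SCAN; /* begin a single-sweep scan */
    default:
        return s;              /* IDLE: each request leaves state unchanged */
    }
}
```

Because each request with a non-IDLE trigger is an independent event, the HAL applies this transition once per such request rather than latching the trigger.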
+
+<table>
+ <tr>
+ <th colspan="2">Additional metadata entries</th>
+ </tr>
+ <tr class="alt">
+ <td>ANDROID_CONTROL_AF_REGIONS</td>
+ <td>Control for selecting the regions of the field of view (FOV) that should
+ be used to determine good focus. This applies to all AF
+ modes that scan for focus. Set by the framework in the request settings.</td>
+ </tr>
+</table>
+
<h2 id="auto-exposure">Auto-exposure settings and result entries</h2>
-<p>Main metadata entries:<br/>
- ANDROID_CONTROL_AE_MODE: Control for selecting the current auto-exposure mode.
- Set by the framework in the request settings.<br/>
- AE_MODE_OFF: Autoexposure is disabled; the user controls exposure, gain, frame
- duration, and flash.<br/>
- AE_MODE_ON: Standard autoexposure, with flash control disabled. User may set
- flash to fire or to torch mode.<br/>
- AE_MODE_ON_AUTO_FLASH: Standard autoexposure, with flash on at HAL's discretion
- for precapture and still capture. User control of flash disabled.<br/>
- AE_MODE_ON_ALWAYS_FLASH: Standard autoexposure, with flash always fired for
- capture, and at HAL's discretion for precapture. User control of flash disabled.<br/>
- AE_MODE_ON_AUTO_FLASH_REDEYE: Standard autoexposure, with flash on at HAL's
- discretion for precapture and still capture. Use a flash burst at end of
- precapture sequence to reduce redeye in the final picture. User control of flash
- disabled.<br/>
- ANDROID_CONTROL_AE_STATE: Dynamic metadata describing the current AE algorithm
- state, reported by the HAL in the result metadata.<br/>
- AE_STATE_INACTIVE: Initial AE state after mode switch. When the device is
- opened, it must start in this state.<br/>
- AE_STATE_SEARCHING: AE is not converged to a good value and is adjusting
- exposure parameters.<br/>
- AE_STATE_CONVERGED: AE has found good exposure values for the current scene, and
- the exposure parameters are not changing. HAL may spontaneously leave this state
- to search for a better solution.<br/>
- AE_STATE_LOCKED: AE has been locked with the AE_LOCK control. Exposure values
- are not changing.<br/>
- AE_STATE_FLASH_REQUIRED: The HAL has converged exposure but believes flash is
- required for a sufficiently bright picture. Used for determining if a
- zero-shutter-lag frame can be used.<br/>
- AE_STATE_PRECAPTURE: The HAL is in the middle of a precapture sequence.
- Depending on AE mode, this mode may involve firing the flash for metering or a
- burst of flash pulses for redeye reduction.<br/>
- ANDROID_CONTROL_AE_PRECAPTURE_TRIGGER: Control for starting a metering sequence
- before capturing a high-quality image. Set by the framework in the request
- settings.<br/>
- PRECAPTURE_TRIGGER_IDLE: No current trigger.<br/>
- PRECAPTURE_TRIGGER_START: Start a precapture sequence. The HAL should use the
- subsequent requests to measure good exposure/white balance for an upcoming
- high-resolution capture.<br/>
- Additional metadata entries:<br/>
- ANDROID_CONTROL_AE_LOCK: Control for locking AE controls to their current
- values.<br/>
- ANDROID_CONTROL_AE_EXPOSURE_COMPENSATION: Control for adjusting AE algorithm
- target brightness point.<br/>
- ANDROID_CONTROL_AE_TARGET_FPS_RANGE: Control for selecting the target frame rate
- range for the AE algorithm. The AE routine cannot change the frame rate to be
- outside these bounds.<br/>
- ANDROID_CONTROL_AE_REGIONS: Control for selecting the regions of the FOV that
- should be used to determine good exposure levels. This applies to all AE modes
- besides OFF.</p>
+
+<table>
+ <tr>
+ <th colspan="2">Main metadata entries</th>
+ </tr>
+ <tr class="alt">
+ <td>ANDROID_CONTROL_AE_MODE</td>
+ <td>Control for selecting the current auto-exposure mode. Set by the
+ framework in the request settings.</td>
+ </tr>
+ <tr>
+ <td>AE_MODE_OFF</td>
+ <td>Autoexposure is disabled; the user controls exposure, gain, frame
+ duration, and flash.</td>
+ </tr>
+ <tr>
+ <td>AE_MODE_ON</td>
+ <td>Standard autoexposure, with flash control disabled. User may set flash
+ to fire or to torch mode.</td>
+ </tr>
+ <tr>
+ <td>AE_MODE_ON_AUTO_FLASH</td>
+ <td>Standard autoexposure, with flash on at HAL's discretion for precapture
+ and still capture. User control of flash disabled.</td>
+ </tr>
+ <tr>
+ <td>AE_MODE_ON_ALWAYS_FLASH</td>
+ <td>Standard autoexposure, with flash always fired for capture, and at HAL's
+ discretion for precapture. User control of flash disabled.</td>
+ </tr>
+ <tr>
+ <td>AE_MODE_ON_AUTO_FLASH_REDEYE</td>
+ <td>Standard autoexposure, with flash on at HAL's discretion for precapture
+ and still capture. Use a flash burst at end of precapture sequence to
+ reduce redeye in the final picture. User control of flash disabled.</td>
+ </tr>
+ <tr class="alt">
+ <td>ANDROID_CONTROL_AE_STATE</td>
+ <td>Dynamic metadata describing the current AE algorithm state, reported by
+ the HAL in the result metadata.</td>
+ </tr>
+ <tr>
+ <td>AE_STATE_INACTIVE</td>
+ <td>Initial AE state after mode switch. When the device is opened, it must
+ start in this state.</td>
+ </tr>
+ <tr>
+ <td>AE_STATE_SEARCHING</td>
+ <td>AE is not converged to a good value and is adjusting exposure
+ parameters.</td>
+ </tr>
+ <tr>
+ <td>AE_STATE_CONVERGED</td>
+ <td>AE has found good exposure values for the current scene, and the
+ exposure parameters are not changing. HAL may spontaneously leave this
+ state to search for a better solution.</td>
+ </tr>
+ <tr>
+ <td>AE_STATE_LOCKED</td>
+ <td>AE has been locked with the AE_LOCK control. Exposure values are not
+ changing.</td>
+ </tr>
+ <tr>
+ <td>AE_STATE_FLASH_REQUIRED</td>
+ <td>The HAL has converged exposure but believes flash is required for a
+ sufficiently bright picture. Used for determining if a zero-shutter-lag
+ frame can be used.</td>
+ </tr>
+ <tr>
+ <td>AE_STATE_PRECAPTURE</td>
+ <td>The HAL is in the middle of a precapture sequence. Depending on AE mode,
+ this mode may involve firing the flash for metering or a burst of flash
+ pulses for redeye reduction.</td>
+ </tr>
+ <tr class="alt">
+ <td>ANDROID_CONTROL_AE_PRECAPTURE_TRIGGER</td>
+ <td>Control for starting a metering sequence before capturing a high-quality
+ image. Set by the framework in the request settings.</td>
+ </tr>
+ <tr>
+ <td>PRECAPTURE_TRIGGER_IDLE</td>
+ <td>No current trigger.</td>
+ </tr>
+ <tr>
+ <td>PRECAPTURE_TRIGGER_START</td>
+ <td>Start a precapture sequence. The HAL should use the subsequent requests
+ to measure good exposure/white balance for an upcoming high-resolution
+ capture.</td>
+ </tr>
+</table>
+
+<table>
+ <tr>
+ <th colspan="2">Additional metadata entries</th>
+ </tr>
+ <tr class="alt">
+ <td>ANDROID_CONTROL_AE_LOCK</td>
+ <td>Control for locking AE controls to their current values.</td>
+ </tr>
+ <tr class="alt">
+ <td>ANDROID_CONTROL_AE_EXPOSURE_COMPENSATION</td>
+ <td>Control for adjusting AE algorithm target brightness point.</td>
+ </tr>
+ <tr class="alt">
+ <td>ANDROID_CONTROL_AE_TARGET_FPS_RANGE</td>
+ <td>Control for selecting the target frame rate range for the AE algorithm.
+ The AE routine cannot change the frame rate to be outside these
+ bounds.</td>
+ </tr>
+ <tr class="alt">
+ <td>ANDROID_CONTROL_AE_REGIONS</td>
+ <td>Control for selecting the regions of the FOV that should be used to
+ determine good exposure levels. This applies to all AE modes
+ besides OFF.</td>
+ </tr>
+</table>
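One use of the AE state called out above is deciding whether a zero-shutter-lag frame is usable. The sketch below is illustrative only; the enum values and the inclusion of the LOCKED state are assumptions, not the real HAL definitions:

```c
/* Illustrative sketch: a previously captured no-flash frame is only
 * usable for zero-shutter-lag if exposure has settled without needing
 * flash. These are not the real ANDROID_CONTROL_AE_STATE values. */
enum ae_state { AE_INACTIVE, AE_SEARCHING, AE_CONVERGED, AE_LOCKED,
                AE_FLASH_REQUIRED, AE_PRECAPTURE };

static int ae_zsl_frame_usable(enum ae_state s)
{
    /* FLASH_REQUIRED means exposure converged but flash is still needed,
     * so a buffered no-flash frame is not sufficiently bright. */
    return s == AE_CONVERGED || s == AE_LOCKED;
}
```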
+
<h2 id="auto-wb">Auto-whitebalance settings and result entries</h2>
-<p>Main metadata entries:<br/>
- ANDROID_CONTROL_AWB_MODE: Control for selecting the current white-balance mode.<br/>
- AWB_MODE_OFF: Auto-whitebalance is disabled. User controls color matrix.<br/>
- AWB_MODE_AUTO: Automatic white balance is enabled; 3A controls color transform,
- possibly using more complex transforms than a simple matrix.<br/>
- AWB_MODE_INCANDESCENT: Fixed white balance settings good for indoor incandescent
- (tungsten) lighting, roughly 2700K.<br/>
- AWB_MODE_FLUORESCENT: Fixed white balance settings good for fluorescent
- lighting, roughly 5000K.<br/>
- AWB_MODE_WARM_FLUORESCENT: Fixed white balance settings good for fluorescent
- lighting, roughly 3000K.<br/>
- AWB_MODE_DAYLIGHT: Fixed white balance settings good for daylight, roughly
- 5500K.<br/>
- AWB_MODE_CLOUDY_DAYLIGHT: Fixed white balance settings good for clouded
- daylight, roughly 6500K.<br/>
- AWB_MODE_TWILIGHT: Fixed white balance settings good for near-sunset/sunrise,
- roughly 15000K.<br/>
- AWB_MODE_SHADE: Fixed white balance settings good for areas indirectly lit by
- the sun, roughly 7500K.<br/>
- ANDROID_CONTROL_AWB_STATE: Dynamic metadata describing the current AWB algorithm
- state, reported by the HAL in the result metadata.<br/>
- AWB_STATE_INACTIVE: Initial AWB state after mode switch. When the device is
- opened, it must start in this state.<br/>
- AWB_STATE_SEARCHING: AWB is not converged to a good value and is changing color
- adjustment parameters.<br/>
- AWB_STATE_CONVERGED: AWB has found good color adjustment values for the current
- scene, and the parameters are not changing. HAL may spontaneously leave this
- state to search for a better solution.<br/>
- AWB_STATE_LOCKED: AWB has been locked with the AWB_LOCK control. Color
- adjustment values are not changing.<br/>
- Additional metadata entries:<br/>
- ANDROID_CONTROL_AWB_LOCK: Control for locking AWB color adjustments to their
- current values.<br/>
- ANDROID_CONTROL_AWB_REGIONS: Control for selecting the regions of the FOV that
- should be used to determine good color balance. This applies only to
- auto-whitebalance mode.</p>
-<h2 id="state-transition">General state machine transition notes</h2>
-<p>Switching between AF, AE, or AWB modes always resets the algorithm's state to
- INACTIVE. Similarly, switching between CONTROL_MODE or CONTROL_SCENE_MODE if
- CONTROL_MODE == USE_SCENE_MODE resets all the algorithm states to INACTIVE.<br/>
- The tables below are per-mode.</p>
-<h2 id="af-state">AF state machines</h2>
+
<table>
+ <tr><th colspan="2">Main metadata entries</th></tr>
+ <tr class="alt">
+ <td>ANDROID_CONTROL_AWB_MODE</td>
+ <td>Control for selecting the current white-balance mode.</td>
+ </tr>
<tr>
- <td><strong>mode = AF_MODE_OFF or AF_MODE_EDOF</strong></td>
- <td></td>
- <td></td>
- <td></td>
+ <td>AWB_MODE_OFF</td>
+ <td>Auto-whitebalance is disabled. User controls color matrix.</td>
</tr>
<tr>
- <th>State</th>
- <th>Transformation cause</th>
- <th>New state</th>
- <th>Notes</th>
+ <td>AWB_MODE_AUTO</td>
+ <td>Automatic white balance is enabled; 3A controls color transform,
+ possibly using more complex transforms than a simple matrix.</td>
</tr>
<tr>
- <td>INACTIVE</td>
- <td></td>
- <td></td>
- <td>AF is disabled</td>
+ <td>AWB_MODE_INCANDESCENT</td>
+ <td>Fixed white balance settings good for indoor incandescent (tungsten)
+ lighting, roughly 2700K.</td>
</tr>
<tr>
- <td><strong>mode = AF_MODE_AUTO or AF_MODE_MACRO</strong></td>
- <td></td>
+ <td>AWB_MODE_FLUORESCENT</td>
+ <td>Fixed white balance settings good for fluorescent lighting, roughly
+ 5000K.</td>
+ </tr>
+ <tr>
+ <td>AWB_MODE_WARM_FLUORESCENT</td>
+ <td>Fixed white balance settings good for fluorescent lighting, roughly
+ 3000K.</td>
+ </tr>
+ <tr>
+ <td>AWB_MODE_DAYLIGHT</td>
+ <td>Fixed white balance settings good for daylight, roughly 5500K.</td>
+ </tr>
+ <tr>
+ <td>AWB_MODE_CLOUDY_DAYLIGHT</td>
+ <td>Fixed white balance settings good for clouded daylight, roughly 6500K.</td>
+ </tr>
+ <tr>
+ <td>AWB_MODE_TWILIGHT</td>
+ <td>Fixed white balance settings good for near-sunset/sunrise, roughly
+ 15000K.</td>
+ </tr>
+ <tr>
+ <td>AWB_MODE_SHADE</td>
+ <td>Fixed white balance settings good for areas indirectly lit by the sun,
+ roughly 7500K.</td>
+ </tr>
+ <tr class="alt">
+ <td>ANDROID_CONTROL_AWB_STATE</td>
+ <td>Dynamic metadata describing the current AWB algorithm state, reported by
+ the HAL in the result metadata.</td>
+ </tr>
+ <tr>
+ <td>AWB_STATE_INACTIVE</td>
+ <td>Initial AWB state after mode switch. When the device is opened, it must
+ start in this state.</td>
+ </tr>
+ <tr>
+ <td>AWB_STATE_SEARCHING</td>
+ <td>AWB is not converged to a good value and is changing color adjustment
+ parameters.</td>
+ </tr>
+ <tr>
+ <td>AWB_STATE_CONVERGED</td>
+ <td>AWB has found good color adjustment values for the current scene, and
+ the parameters are not changing. HAL may spontaneously leave this state
+ to search for a better solution.</td>
+ </tr>
+ <tr>
+ <td>AWB_STATE_LOCKED</td>
+ <td>AWB has been locked with the AWB_LOCK control. Color adjustment
+ values are not changing.</td>
+ </tr>
+</table>
+
+<table>
+ <tr><th colspan="2">Additional metadata entries</th></tr>
+ <tr class="alt">
+ <td>ANDROID_CONTROL_AWB_LOCK</td>
+ <td>Control for locking AWB color adjustments to their current values.</td>
+ </tr>
+ <tr class="alt">
+ <td>ANDROID_CONTROL_AWB_REGIONS</td>
+ <td>Control for selecting the regions of the FOV that should be used to
+ determine good color balance. This applies only to auto-whitebalance
+ mode.</td>
+ </tr>
+</table>
+
+<h2 id="state-transition">General state machine transition notes</h2>
+<p>Switching the AF, AE, or AWB mode always resets that algorithm's state to
+ INACTIVE. Similarly, switching CONTROL_MODE, or switching CONTROL_SCENE_MODE
+ when CONTROL_MODE == USE_SCENE_MODE, resets all the algorithm states to
+ INACTIVE.</p>
+<p>The tables below are per-mode.</p>
+
+<h2 id="af-state">AF state machines</h2>
+
+<table>
+ <tr>
+ <th colspan="4">mode = AF_MODE_OFF or AF_MODE_EDOF</th></tr>
+ <tr class="alt">
+ <td>State</td>
+ <td>Transformation cause</td>
+ <td>New state</td>
+ <td>Notes</td>
+ </tr>
+ <tr>
+ <td>INACTIVE</td>
<td></td>
<td></td>
+ <td>AF is disabled</td>
</tr>
+</table>
+
+<table>
<tr>
- <th>State</th>
- <th>Transformation cause</th>
- <th>New state</th>
- <th>Notes</th>
+ <th colspan="4">mode = AF_MODE_AUTO or AF_MODE_MACRO</th></tr>
+ <tr class="alt">
+ <td>State</td>
+ <td>Transformation cause</td>
+ <td>New state</td>
+ <td>Notes</td>
</tr>
<tr>
<td>INACTIVE</td>
<td>AF_TRIGGER</td>
<td>ACTIVE_SCAN</td>
- <td>Start AF sweep
- Lens now moving</td>
+ <td><p>Start AF sweep</p>
+ <p>Lens now moving</p></td>
</tr>
<tr>
<td>ACTIVE_SCAN</td>
<td>AF sweep done</td>
<td>FOCUSED_LOCKED</td>
- <td>If AF successful
- Lens now locked</td>
+ <td><p>If AF successful</p>
+ <p>Lens now locked</p></td>
</tr>
<tr>
<td>ACTIVE_SCAN</td>
<td>AF sweep done</td>
<td>NOT_FOCUSED_LOCKED</td>
- <td>If AF successful
- Lens now locked</td>
+ <td><p>If AF unsuccessful</p>
+ <p>Lens now locked</p></td>
</tr>
<tr>
<td>ACTIVE_SCAN</td>
<td>AF_CANCEL</td>
<td>INACTIVE</td>
- <td>Cancel/reset AF
- Lens now locked</td>
+ <td><p>Cancel/reset AF</p>
+ <p>Lens now locked</p></td>
</tr>
<tr>
<td>FOCUSED_LOCKED</td>
@@ -269,8 +451,8 @@
<td>FOCUSED_LOCKED</td>
<td>AF_TRIGGER</td>
<td>ACTIVE_SCAN</td>
- <td>Start new sweep
- Lens now moving</td>
+ <td><p>Start new sweep</p>
+ <p>Lens now moving</p></td>
</tr>
<tr>
<td>NOT_FOCUSED_LOCKED</td>
@@ -282,85 +464,84 @@
<td>NOT_FOCUSED_LOCKED</td>
<td>AF_TRIGGER</td>
<td>ACTIVE_SCAN</td>
- <td>Start new sweep
- Lens now moving</td>
+ <td><p>Start new sweep</p>
+ <p>Lens now moving</p></td>
</tr>
<tr>
<td>All states</td>
- <td>mode change</td>
+ <td>Mode change</td>
<td>INACTIVE</td>
<td></td>
</tr>
+</table>
+
+<table>
<tr>
- <td><strong>mode = AF_MODE_CONTINUOUS_VIDEO</strong></td>
- <td></td>
- <td></td>
- <td></td>
- </tr>
- <tr>
- <th>State</th>
- <th>Transformation cause</th>
- <th>New state</th>
- <th>Notes</th>
+ <th colspan="4">mode = AF_MODE_CONTINUOUS_VIDEO</th></tr>
+ <tr class="alt">
+ <td>State</td>
+ <td>Transformation cause</td>
+ <td>New state</td>
+ <td>Notes</td>
</tr>
<tr>
<td>INACTIVE</td>
<td>HAL initiates new scan</td>
<td>PASSIVE_SCAN</td>
- <td>Start AF sweep
- Lens now moving</td>
+ <td><p>Start AF sweep</p>
+ <p>Lens now moving</p></td>
</tr>
<tr>
<td>INACTIVE</td>
<td>AF_TRIGGER</td>
<td>NOT_FOCUSED_LOCKED</td>
- <td>AF state query
- Lens now locked</td>
+ <td><p>AF state query</p>
+ <p>Lens now locked</p></td>
</tr>
<tr>
<td>PASSIVE_SCAN</td>
<td>HAL completes current scan</td>
<td>PASSIVE_FOCUSED</td>
- <td>End AF scan
- Lens now locked </td>
+ <td><p>End AF scan</p>
+ <p>Lens now locked</p></td>
</tr>
<tr>
<td>PASSIVE_SCAN</td>
<td>AF_TRIGGER</td>
<td>FOCUSED_LOCKED</td>
- <td>Immediate transformation
- if focus is good
- Lens now locked</td>
+ <td><p>Immediate transformation
+ if focus is good</p>
+ <p>Lens now locked</p></td>
</tr>
<tr>
<td>PASSIVE_SCAN</td>
<td>AF_TRIGGER</td>
<td>NOT_FOCUSED_LOCKED</td>
- <td>Immediate transformation
- if focus is bad
- Lens now locked</td>
+ <td><p>Immediate transformation
+ if focus is bad</p>
+ <p>Lens now locked</p></td>
</tr>
<tr>
<td>PASSIVE_SCAN</td>
<td>AF_CANCEL</td>
<td>INACTIVE</td>
- <td>Reset lens position
- Lens now locked</td>
+ <td><p>Reset lens position</p>
+ <p>Lens now locked</p></td>
</tr>
<tr>
<td>PASSIVE_FOCUSED</td>
<td>HAL initiates new scan</td>
<td>PASSIVE_SCAN</td>
- <td>Start AF scan
- Lens now moving</td>
+ <td><p>Start AF scan</p>
+ <p>Lens now moving</p></td>
</tr>
<tr>
<td>PASSIVE_FOCUSED</td>
<td>AF_TRIGGER</td>
<td>FOCUSED_LOCKED</td>
- <td>Immediate transformation
- if focus is good
- Lens now locked</td>
+ <td><p>Immediate transformation
+ if focus is good</p>
+ <p>Lens now locked</p></td>
</tr>
<tr>
<td>PASSIVE_FOCUSED</td>
@@ -368,7 +549,7 @@
<td>NOT_FOCUSED_LOCKED</td>
- <td>Immediate transformation
- if focus is bad
- Lens now locked</td>
+ <td><p>Immediate transformation
+ if focus is bad</p>
+ <p>Lens now locked</p></td>
</tr>
<tr>
<td>FOCUSED_LOCKED</td>
@@ -394,80 +575,79 @@
<td>INACTIVE</td>
<td>Restart AF scan</td>
</tr>
+</table>
+
+<table>
<tr>
- <td><strong>mode = AF_MODE_CONTINUOUS_PICTURE</strong></td>
- <td></td>
- <td></td>
- <td></td>
- </tr>
- <tr>
- <th>State</th>
- <th>Transformation cause</th>
- <th>New state</th>
- <th>Notes</th>
+ <th colspan="4">mode = AF_MODE_CONTINUOUS_PICTURE</th></tr>
+ <tr class="alt">
+ <td>State</td>
+ <td>Transformation cause</td>
+ <td>New state</td>
+ <td>Notes</td>
</tr>
<tr>
<td>INACTIVE</td>
<td>HAL initiates new scan</td>
<td>PASSIVE_SCAN</td>
- <td>Start AF scan
- Lens now moving</td>
+ <td><p>Start AF scan</p>
+ <p>Lens now moving</p></td>
</tr>
<tr>
<td>INACTIVE</td>
<td>AF_TRIGGER</td>
<td>NOT_FOCUSED_LOCKED</td>
- <td>AF state query
- Lens now locked</td>
+ <td><p>AF state query</p>
+ <p>Lens now locked</p></td>
</tr>
<tr>
<td>PASSIVE_SCAN</td>
<td>HAL completes current scan</td>
<td>PASSIVE_FOCUSED</td>
- <td>End AF scan
- Lens now locked</td>
+ <td><p>End AF scan</p>
+ <p>Lens now locked</p></td>
</tr>
<tr>
<td>PASSIVE_SCAN</td>
<td>AF_TRIGGER</td>
<td>FOCUSED_LOCKED</td>
- <td>Eventual transformation once focus good
- Lens now locked</td>
+ <td><p>Eventual transformation once focus is good</p>
+ <p>Lens now locked</p></td>
</tr>
<tr>
<td>PASSIVE_SCAN</td>
<td>AF_TRIGGER</td>
<td>NOT_FOCUSED_LOCKED</td>
- <td>Eventual transformation if cannot focus
- Lens now locked</td>
+ <td><p>Eventual transformation if focus cannot be achieved</p>
+ <p>Lens now locked</p></td>
</tr>
<tr>
<td>PASSIVE_SCAN</td>
<td>AF_CANCEL</td>
<td>INACTIVE</td>
- <td>Reset lens position
- Lens now locked</td>
+ <td><p>Reset lens position</p>
+ <p>Lens now locked</p></td>
</tr>
<tr>
<td>PASSIVE_FOCUSED</td>
<td>HAL initiates new scan</td>
<td>PASSIVE_SCAN</td>
- <td>Start AF scan
- Lens now moving</td>
+ <td><p>Start AF scan</p>
+ <p>Lens now moving</p></td>
</tr>
<tr>
<td>PASSIVE_FOCUSED</td>
<td>AF_TRIGGER</td>
<td>FOCUSED_LOCKED</td>
- <td>Immediate transformation if focus is good
- Lens now locked</td>
+ <td><p>Immediate transformation if focus is good</p>
+ <p>Lens now locked</p></td>
</tr>
<tr>
<td>PASSIVE_FOCUSED</td>
<td>AF_TRIGGER</td>
<td>NOT_FOCUSED_LOCKED</td>
- <td>Immediate transformation if focus is bad
- Lens now locked</td>
+ <td><p>Immediate transformation if focus is bad</p>
+ <p>Lens now locked</p></td>
</tr>
<tr>
<td>FOCUSED_LOCKED</td>
@@ -494,22 +674,21 @@
<td>Restart AF scan</td>
</tr>
</table>
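The AF tables above can be read as transition functions. The sketch below encodes only the AF_MODE_AUTO / AF_MODE_MACRO table; the event names are invented for illustration, and the real HAL drives these transitions internally rather than exposing such a function:

```cpp
#include <cassert>

// Illustrative model of the AF_MODE_AUTO / AF_MODE_MACRO table, not HAL code.
enum class AfState { INACTIVE, ACTIVE_SCAN, FOCUSED_LOCKED, NOT_FOCUSED_LOCKED };
enum class AfEvent { AF_TRIGGER, SWEEP_DONE_FOCUSED, SWEEP_DONE_NOT_FOCUSED,
                     AF_CANCEL, MODE_CHANGE };

AfState nextAfState(AfState s, AfEvent e) {
    if (e == AfEvent::MODE_CHANGE) return AfState::INACTIVE;  // from all states
    switch (s) {
        case AfState::INACTIVE:
            // A trigger starts an AF sweep; the lens starts moving.
            return e == AfEvent::AF_TRIGGER ? AfState::ACTIVE_SCAN : s;
        case AfState::ACTIVE_SCAN:
            if (e == AfEvent::SWEEP_DONE_FOCUSED)     return AfState::FOCUSED_LOCKED;
            if (e == AfEvent::SWEEP_DONE_NOT_FOCUSED) return AfState::NOT_FOCUSED_LOCKED;
            if (e == AfEvent::AF_CANCEL)              return AfState::INACTIVE;
            return s;
        case AfState::FOCUSED_LOCKED:
        case AfState::NOT_FOCUSED_LOCKED:
            // A new trigger starts a fresh sweep; cancel resets AF.
            if (e == AfEvent::AF_TRIGGER) return AfState::ACTIVE_SCAN;
            if (e == AfEvent::AF_CANCEL)  return AfState::INACTIVE;
            return s;
    }
    return s;
}
```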
+
<h2 id="ae-wb">AE and AWB state machines</h2>
-<p>The AE and AWB state machines are mostly identical. AE has additional
- FLASH_REQUIRED and PRECAPTURE states. So rows below that refer to those two
+<p>The AE and AWB state machines are mostly identical. AE has additional
+ FLASH_REQUIRED and PRECAPTURE states. Rows below that refer to those two
states should be ignored for the AWB state machine.</p>
+
<table>
<tr>
- <td><strong>mode = AE_MODE_OFF / AWB mode not AUTO</strong></td>
- <td></td>
- <td></td>
- <td></td>
- </tr>
- <tr>
- <th>State</th>
- <th>Transformation cause</th>
- <th>New state</th>
- <th>Notes</th>
+ <th colspan="4">mode = AE_MODE_OFF / AWB mode not AUTO</th></tr>
+ <tr class="alt">
+ <td>State</td>
+ <td>Transformation cause</td>
+ <td>New state</td>
+ <td>Notes</td>
</tr>
<tr>
<td>INACTIVE</td>
@@ -517,17 +696,16 @@
<td></td>
<td>AE/AWB disabled</td>
</tr>
+</table>
+
+<table>
<tr>
- <td><strong>mode = AE_MODE_ON_* / AWB_MODE_AUTO</strong></td>
- <td></td>
- <td></td>
- <td></td>
- </tr>
- <tr>
- <th>State</th>
- <th>Transformation cause</th>
- <th>New state</th>
- <th>Notes</th>
+ <th colspan="4">mode = AE_MODE_ON_* / AWB_MODE_AUTO</th></tr>
+ <tr class="alt">
+ <td>State</td>
+ <td>Transformation cause</td>
+ <td>New state</td>
+ <td>Notes</td>
</tr>
<tr>
<td>INACTIVE</td>
@@ -620,16 +798,19 @@
<td>Ready for high-quality capture</td>
</tr>
</table>
+
<h2 id="manual-control">Enabling manual control</h2>
-<p>Several controls are also involved in configuring the device 3A blocks to allow
- for direct application control.</p>
-<p>The HAL model for 3A control is that for each request, the HAL inspects the
- state of the 3A control fields. If any 3A routine is enabled, then that routine
- overrides the control variables that relate to that routine, and these override
- values are then available in the result metadata for that capture. So for
- example, if auto-exposure is enabled in a request, the HAL should overwrite the
- exposure, gain, and frame duration fields (and potentially the flash fields,
- depending on AE mode) of the request. The list of relevant controls is:</p>
+<p>Several controls are also involved in configuring the device 3A blocks to
+ allow for direct application control.</p>
+<p>The HAL model for 3A control is that for each request, the HAL inspects the
+ state of the 3A control fields. If any 3A routine is enabled, then that
+ routine overrides the control variables that relate to that routine, and
+ these override values are then available in the result metadata for that
+ capture. So for example, if auto-exposure is enabled in a request, the HAL
+ should overwrite the exposure, gain, and frame duration fields (and
+ potentially the flash fields, depending on AE mode) of the request. The
+ list of relevant controls is:</p>
+
<table>
<tr>
<th>Control name</th>
@@ -639,24 +820,30 @@
<tr>
<td>android.control.mode</td>
<td>enum: OFF, AUTO, USE_SCENE_MODE</td>
- <td>High-level 3A control. When set to OFF, all 3A control by the HAL is disabled. The application must set the fields for capture parameters itself.
- When set to AUTO, the individual algorithm controls in android.control.* are in effect, such as android.control.afMode.
- When set to USE_SCENE_MODE, the individual controls in android.control.* are mostly disabled, and the HAL implements one of the scene mode settings (such as ACTION, SUNSET, or PARTY) as it wishes.</td>
+ <td>High-level 3A control. When set to OFF, all 3A control by the HAL is
+ disabled. The application must set the fields for capture parameters
+ itself. When set to AUTO, the individual algorithm controls in
+ android.control.* are in effect, such as android.control.afMode.
+ When set to USE_SCENE_MODE, the individual controls in android.control.*
+ are mostly disabled, and the HAL implements one of the scene mode
+ settings (such as ACTION, SUNSET, or PARTY) as it wishes.</td>
</tr>
<tr>
<td>android.control.afMode</td>
<td>enum</td>
- <td>OFF means manual control of lens focusing through android.lens.focusDistance.</td>
+ <td>OFF means manual control of lens
+ focusing through android.lens.focusDistance.</td>
</tr>
<tr>
<td>android.control.aeMode</td>
<td>enum</td>
- <td>OFF means manual control of exposure/gain/frame duration through android.sensor.exposureTime / .sensitivity / .frameDuration</td>
+ <td>OFF means manual control of exposure/gain/frame duration through
+ android.sensor.exposureTime / .sensitivity / .frameDuration.
</tr>
<tr>
<td>android.control.awbMode</td>
<td>enum</td>
- <td>OFF means manual control of white balance. </td>
+ <td>OFF means manual control of white balance.</td>
</tr>
</table>
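The override model described above can be sketched as a single pass over a request: when an auto routine is enabled, it overwrites the related manual fields, and the values actually used flow into the result metadata. The `Request`/`Result` structs and `runAe()` below are hypothetical stand-ins, not real HAL types:

```cpp
#include <cassert>
#include <cstdint>

struct Request {
    bool aeEnabled;          // models android.control.aeMode != OFF
    int64_t exposureTimeNs;  // models android.sensor.exposureTime
    int32_t sensitivity;     // models android.sensor.sensitivity
};

struct Result {
    int64_t exposureTimeNs;
    int32_t sensitivity;
};

// Stand-in AE algorithm: picks its own values regardless of the request.
static void runAe(int64_t* exposureNs, int32_t* sensitivity) {
    *exposureNs = 10000000;  // 10 ms
    *sensitivity = 400;
}

Result process(const Request& req) {
    int64_t exposure = req.exposureTimeNs;
    int32_t sens = req.sensitivity;
    if (req.aeEnabled) {
        runAe(&exposure, &sens);  // the enabled routine overrides manual fields
    }
    // The values actually used are reported back in the result metadata.
    return Result{exposure, sens};
}
```

With AE disabled, the application's manual values pass through untouched; with AE enabled, the result carries the algorithm's values instead.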
diff --git a/en/devices/camera/camera3_crop_reprocess.html b/en/devices/camera/camera3_crop_reprocess.html
index 1e24db3b..2fdb58dc 100644
--- a/en/devices/camera/camera3_crop_reprocess.html
+++ b/en/devices/camera/camera3_crop_reprocess.html
@@ -24,30 +24,21 @@
<h2 id="output-stream">Output streams</h2>
-<p> Unlike the old camera subsystem, which has 3-4 different ways of producing data
- from the camera (ANativeWindow-based preview operations, preview callbacks,
- video callbacks, and takePicture callbacks), the new subsystem operates solely
- on the ANativeWindow-based pipeline for all resolutions and output formats.
- Multiple such streams can be configured at once, to send a single frame to many
- targets such as the GPU, the video encoder, RenderScript, or app-visible buffers
- (RAW Bayer, processed YUV buffers, or JPEG-encoded buffers).</p>
+<p>The camera subsystem operates solely on the ANativeWindow-based pipeline for
+ all resolutions and output formats. Multiple streams can be configured at
+ one time to send a single frame to many targets such as the GPU, the video
+ encoder,
+ <a href="/devices/architecture/vndk/renderscript">RenderScript</a>,
+ or app-visible buffers (RAW Bayer, processed YUV
+ buffers, or JPEG-encoded buffers).</p>
<p>As an optimization, these output streams must be configured ahead of time, and
only a limited number may exist at once. This allows for pre-allocation of
memory buffers and configuration of the camera hardware, so that when requests
are submitted with multiple or varying output pipelines listed, there won't be
delays or latency in fulfilling the request.</p>
-<p>To support backwards compatibility with the current camera API, at least 3
- simultaneous YUV output streams must be supported, plus one JPEG stream. This is
- required for video snapshot support with the application also receiving YUV
- buffers:</p>
-<ul>
- <li>One stream to the GPU/SurfaceView (opaque YUV format) for preview</li>
- <li>One stream to the video encoder (opaque YUV format) for recording</li>
- <li>One stream to the application (known YUV format) for preview frame callbacks</li>
- <li>One stream to the application (JPEG) for video snapshots.</li>
-</ul>
-<p>The exact requirements are still being defined since the corresponding API
-isn't yet finalized.</p>
+<p>For further information about the guaranteed stream output combinations
+ that depend on the supported hardware level, see
+ <code><a href="https://developer.android.com/reference/android/hardware/camera2/CameraDevice#createCaptureSession(java.util.List%3Candroid.view.Surface%3E,%20android.hardware.camera2.CameraCaptureSession.StateCallback,%20android.os.Handler)">createCaptureSession()</a></code>.</p>
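The configure-ahead-of-time rule can be illustrated with a simple membership check: a capture request may only name output targets that belong to the already-configured stream set. The stream names and `checkRequest()` helper below are hypothetical, not part of the camera2 or HAL APIs:

```cpp
#include <cassert>
#include <set>
#include <string>

// Sketch: a request is valid only if it targets at least one stream and
// every target was configured up front (so buffers are pre-allocated).
bool checkRequest(const std::set<std::string>& configured,
                  const std::set<std::string>& requestTargets) {
    for (const std::string& t : requestTargets) {
        if (configured.count(t) == 0) return false;  // unknown stream
    }
    return !requestTargets.empty();
}
```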
<h2>Cropping</h2>
<p>Cropping of the full pixel array (for digital zoom and other use cases where a
smaller FOV is desirable) is communicated through the ANDROID_SCALER_CROP_REGION
diff --git a/en/devices/camera/camera3_error_stream.html b/en/devices/camera/camera3_error_stream.html
index aebb325c..f1911198 100644
--- a/en/devices/camera/camera3_error_stream.html
+++ b/en/devices/camera/camera3_error_stream.html
@@ -24,139 +24,31 @@
<h2 id="error-mgmt">Error management</h2>
-<p>Camera HAL device ops functions that have a return value will all return <code>-ENODEV
- / NULL</code> in case of a serious error. This means the device cannot continue
- operation and must be closed by the framework. Once this error is returned by
- some method, or if <code>notify()</code> is called with <code>ERROR_DEVICE</code>, only the <code>close()</code> method
- can be called successfully. All other methods will return <code>-ENODEV / NULL</code>.</p>
-<p>If a device op is called in the wrong sequence, for example if the framework
-calls <code>configure_streams()</code> before <code>initialize()</code>, the device must return
-<code>-ENOSYS</code> from the call, and do nothing.</p>
-<p>Transient errors in image capture must be reported through <code>notify()</code> as follows:</p>
-<ul>
- <li>The failure of an entire capture to occur must be reported by the HAL by
- calling <code>notify()</codE> with <code>ERROR_REQUEST</code>. Individual errors for the result metadata
- or the output buffers must not be reported in this case.</li>
- <li>If the metadata for a capture cannot be produced, but some image buffers were
- filled, the HAL must call <code>notify()</code> with <code>ERROR_RESULT</code>.</li>
- <li>If an output image buffer could not be filled, but either the metadata was
- produced or some other buffers were filled, the HAL must call <code>notify()</code> with
- <code>ERROR_BUFFER</code> for each failed buffer.</li>
-</ul>
-<p>In each of these transient failure cases, the HAL must still call
-<code>process_capture_result</code>, with valid output <code>buffer_handle_t</code>. If the result
-metadata could not be produced, it should be <code>NULL</code>. If some buffers could not be
- filled, their sync fences must be set to the error state.</p>
-<p>Invalid input arguments result in <code>-EINVAL</code> from the appropriate methods. In that
- case, the framework must act as if that call had never been made.</p>
+<p>HIDL interface methods that interact with the camera must generate
+ the corresponding camera-specific
+ <a href="/reference/hidl/android/hardware/camera/common/1.0/types">status</a>.</p>
+<p>If
+ <a href="/reference/hidl/android/hardware/camera/device/3.2/ICameraDeviceCallback#notify">ICameraDeviceCallback::notify()</a>
+ is called with
+ <a href="/reference/hidl/android/hardware/camera/device/3.2/types#errorcode">ERROR_DEVICE</a>,
+ only the
+ <a href="/reference/hidl/android/hardware/camera/device/3.2/ICameraDeviceSession#close">ICameraDeviceSession::close()</a> method can be called successfully. All other
+ methods will return
+ <a href="/reference/hidl/android/hardware/camera/common/1.0/types#status">INTERNAL_ERROR</a>.</p>
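The ERROR_DEVICE rule above behaves like a latched fatal state: once the error is signaled, every session method except close() fails. The `Session` class below is a toy model of that behavior, not the HIDL interface:

```cpp
#include <cassert>

enum class Status { OK, INTERNAL_ERROR };

// Illustrative model only: after a fatal device error, all calls except
// close() report INTERNAL_ERROR.
class Session {
  public:
    void notifyErrorDevice() { fatal_ = true; }  // models notify(ERROR_DEVICE)
    Status processCaptureRequest() { return fatal_ ? Status::INTERNAL_ERROR : Status::OK; }
    Status flush() { return fatal_ ? Status::INTERNAL_ERROR : Status::OK; }
    Status close() { closed_ = true; return Status::OK; }  // always allowed
    bool closed() const { return closed_; }

  private:
    bool fatal_ = false;
    bool closed_ = false;
};
```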
+
+<p>Transient errors in image capture must be reported through
+ <a href="/reference/hidl/android/hardware/camera/device/3.2/ICameraDeviceCallback#notify">ICameraDeviceCallback::notify()</a>
+ with the appropriate
+ <a href="/reference/hidl/android/hardware/camera/device/3.2/types#errorcode">error code</a>.
+ In each transient failure case, the HAL must still call
+ <a href="/reference/hidl/android/hardware/camera/device/3.2/ICameraDeviceCallback#processcaptureresult">ICameraDeviceCallback::processCaptureResult()</a>
+ with an appropriate
+ <a href="/reference/hidl/android/hardware/camera/device/3.2/types#captureresult">capture result</a>.</p>
+
<h2 id="stream-mgmt">Stream management</h2>
<h3 id="configure_streams">configure_streams</h3>
-<p>Reset the HAL camera device processing pipeline and set up new input and output
- streams. This call replaces any existing stream configuration with the streams
- defined in the <code>stream_list</code>. This method will be called at least once after
- <code>initialize()</code> before a request is submitted with <code>process_capture_request()</code>.</p>
-<p>The <code>stream_list</code> must contain at least one output-capable stream, and may not
- contain more than one input-capable stream.
- The <code>stream_list</code> may contain streams that are also in the currently-active set of
- streams (from the previous call to <code>configure_stream()</code>). These streams will
- already have valid values for usage, maxbuffers, and the private pointer. If
- such a stream has already had its buffers registered, <code>register_stream_buffers()</code>
- will not be called again for the stream, and buffers from the stream can be
- immediately included in input requests.</p>
-<p>If the HAL needs to change the stream configuration for an existing stream due
- to the new configuration, it may rewrite the values of usage and/or maxbuffers
- during the configure call. The framework will detect such a change, and will
- then reallocate the stream buffers, and call <code>register_stream_buffers()</code> again
- before using buffers from that stream in a request.</p>
-<p>If a currently-active stream is not included in <code>stream_list</code>, the HAL may safely
- remove any references to that stream. It will not be reused in a later
- <code>configure()</code> call by the framework, and all the gralloc buffers for it will be
- freed after the <code>configure_streams()</code> call returns.</p>
-<p>The <code>stream_list</code> structure is owned by the framework, and may not be accessed
-once this call completes. The address of an individual <code>camera3streamt</code>
- structure will remain valid for access by the HAL until the end of the first
- <code>configure_stream()</code> call which no longer includes that <code>camera3streamt</code> in the
- <code>stream_list</code> argument. The HAL may not change values in the stream structure
- outside of the private pointer, except for the usage and maxbuffers members
- during the <code>configure_streams()</code> call itself.</p>
-<p>If the stream is new, the usage, maxbuffer, and private pointer fields of the
- stream structure will all be set to 0. The HAL device must set these fields
- before the <code>configure_streams()</code> call returns. These fields are then used by the
- framework and the platform gralloc module to allocate the gralloc buffers for
- each stream.</p>
-<p>Before such a new stream can have its buffers included in a capture request, the
-framework will call <code>register_stream_buffers()</code> with that stream. However, the
- framework is not required to register buffers for _all streams before
- submitting a request. This allows for quick startup of (for example) a preview
- stream, with allocation for other streams happening later or concurrently.</p>
-<h4><strong>Preconditions</strong></h4>
-<p>The framework will only call this method when no captures are being processed.
- That is, all results have been returned to the framework, and all in-flight
- input and output buffers have been returned and their release sync fences have
- been signaled by the HAL. The framework will not submit new requests for capture
- while the <code>configure_streams()</code> call is underway.</p>
-<h4><strong>Postconditions</strong></h4>
-<p>The HAL device must configure itself to provide maximum possible output frame
- rate given the sizes and formats of the output streams, as documented in the
- camera device's static metadata.</p>
-<h4><strong>Performance expectations</strong></h4>
-<p>This call is expected to be heavyweight and possibly take several hundred
- milliseconds to complete, since it may require resetting and reconfiguring the
- image sensor and the camera processing pipeline. Nevertheless, the HAL device
- should attempt to minimize the reconfiguration delay to minimize the
- user-visible pauses during application operational mode changes (such as
- switching from still capture to video recording).</p>
-<h4><strong>Return values</strong></h4>
-<ul>
- <li><code>0</code>: On successful stream configuration</li>
- <li><code>undefined</code></li>
- <li><code>-EINVAL</code>: If the requested stream configuration is invalid. Some examples of
- invalid stream configurations include:
- <ul>
- <li>Including more than 1 input-capable stream (<code>INPUT</code> or <code>BIDIRECTIONAL</code>)</li>
- <li>Not including any output-capable streams (<code>OUTPUT</code> or <code>BIDIRECTIONAL</code>)</li>
- <li>Including streams with unsupported formats, or an unsupported size for
- that format.</li>
- <li>Including too many output streams of a certain format.</li>
- <li>Note that the framework submitting an invalid stream configuration is not
- normal operation, since stream configurations are checked before
- configure. An invalid configuration means that a bug exists in the
- framework code, or there is a mismatch between the HAL's static metadata
- and the requirements on streams.</li>
- </ul>
- </li>
- <li><code>-ENODEV</code>: If there has been a fatal error and the device is no longer
- operational. Only <code>close()</code> can be called successfully by the framework after
- this error is returned.</li>
-</ul>
-<h3 id="register-stream">register_stream_buffers</h3>
-<p>Register buffers for a given stream with the HAL device. This method is called
-by the framework after a new stream is defined by <code>configure_streams</code>, and before
- buffers from that stream are included in a capture request. If the same stream
- is listed in a subsequent <code>configure_streams()</code> call, <code>register_stream_buffers</code> will
- not be called again for that stream.</p>
-<p>The framework does not need to register buffers for all configured streams
- before it submits the first capture request. This allows quick startup for
- preview (or similar use cases) while other streams are still being allocated.</p>
-<p>This method is intended to allow the HAL device to map or otherwise prepare the
- buffers for later use. The buffers passed in will already be locked for use. At
- the end of the call, all the buffers must be ready to be returned to the stream.
- The bufferset argument is only valid for the duration of this call.</p>
-<p>If the stream format was set to <code>HAL_PIXEL_FORMAT_IMPLEMENTATION_DEFINED</code>, the
- camera HAL should inspect the passed-in buffers here to determine any
- platform-private pixel format information.</p>
-<h4><strong>Return values</strong></h4>
-<ul>
- <li><code>0</code>: On successful registration of the new stream buffers</li>
- <li><code>-EINVAL</code>: If the streambufferset does not refer to a valid active stream, or
- if the buffers array is invalid.</li>
- <li><code>-ENOMEM</code>: If there was a failure in registering the buffers. The framework must
- consider all the stream buffers to be unregistered, and can try to register
- again later.</li>
- <li><code>-ENODEV</code>: If there is a fatal error, and the device is no longer operational.
- Only <code>close()</code> can be called successfully by the framework after this error is
- returned.</li>
-</ul>
+<p>HAL clients must configure camera streams by calling
+ <a href="/reference/hidl/android/hardware/camera/device/3.2/ICameraDeviceSession#configurestreams">ICameraDeviceSession::configureStreams()</a>.</p>
</body>
</html>
diff --git a/en/devices/camera/camera3_metadata.html b/en/devices/camera/camera3_metadata.html
index 60548200..1594b66d 100644
--- a/en/devices/camera/camera3_metadata.html
+++ b/en/devices/camera/camera3_metadata.html
@@ -24,42 +24,43 @@
<h2 id="metadata">Metadata support</h2>
-<p> To support the saving of raw image files by the Android framework, substantial
- metadata is required about the sensor's characteristics. This includes
+<p> To support the saving of raw image files by the Android framework, substantial
+ metadata is required about the sensor's characteristics. This includes
information such as color spaces and lens shading functions.</p>
-<p>Most of this information is a static property of the camera subsystem and can
- therefore be queried before configuring any output pipelines or submitting any
- requests. The new camera APIs greatly expand the information provided by the
- getCameraInfo() method to provide this information to the application.</p>
-<p>In addition, manual control of the camera subsystem requires feedback from the
- assorted devices about their current state, and the actual parameters used in
- capturing a given frame. The actual values of the controls (exposure time, frame
- duration, and sensitivity) as actually used by the hardware must be included in
- the output metadata. This is essential so that applications know when either
- clamping or rounding took place, and so that the application can compensate for
+<p>Most of this information is a static property of the camera subsystem and can
+ therefore be queried before configuring any output pipelines or submitting any
+ requests. The new camera APIs greatly expand the information provided by the
+ <code>getCameraInfo()</code> method to provide this information to the
+ application.</p>
+<p>In addition, manual control of the camera subsystem requires feedback from the
+ assorted devices about their current state, and the actual parameters used in
+ capturing a given frame. The actual values of the controls (exposure time, frame
+ duration, and sensitivity) as actually used by the hardware must be included in
+ the output metadata. This is essential so that applications know when either
+ clamping or rounding took place, and so that the application can compensate for
the real settings used for image capture.</p>
-<p>For example, if an application sets frame duration to 0 in a request, the HAL
- must clamp the frame duration to the real minimum frame duration for that
+<p>For example, if an application sets frame duration to 0 in a request, the HAL
+ must clamp the frame duration to the real minimum frame duration for that
request, and report that clamped minimum duration in the output result metadata.</p>
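The clamp-and-report rule above can be sketched as follows. This is a minimal illustration, not the real HAL code; `ResultMetadata` and `applyFrameDuration` are hypothetical names invented for this example.

```cpp
#include <algorithm>
#include <cstdint>

// Hypothetical sketch: clamp a requested frame duration to the device's real
// minimum and report the value actually used in the output result metadata,
// so the application can detect that clamping took place.
struct ResultMetadata {
    int64_t frameDurationNs;  // actual duration used by the hardware
};

int64_t applyFrameDuration(int64_t requestedNs, int64_t minDurationNs,
                           ResultMetadata* result) {
    // A request value of 0 (or anything below the minimum) is clamped up.
    int64_t actual = std::max(requestedNs, minDurationNs);
    result->frameDurationNs = actual;  // app reads back the real value here
    return actual;
}
```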
-<p>So if an application needs to implement a custom 3A routine (for example, to
- properly meter for an HDR burst), it needs to know the settings used to capture
- the latest set of results it has received in order to update the settings for
- the next request. Therefore, the new camera API adds a substantial amount of
- dynamic metadata to each captured frame. This includes the requested and actual
- parameters used for the capture, as well as additional per-frame metadata such
+<p>So if an application needs to implement a custom 3A routine (for example, to
+ properly meter for an HDR burst), it needs to know the settings used to capture
+ the latest set of results it has received to update the settings for
+ the next request. Therefore, the new camera API adds a substantial amount of
+ dynamic metadata to each captured frame. This includes the requested and actual
+ parameters used for the capture, as well as additional per-frame metadata such
as timestamps and statistics generator output.</p>
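The per-frame dynamic metadata described above can be pictured as a pairing of requested and actual settings. The structs and the `nextHdrSettings` helper below are hypothetical, chosen only to show why a custom 3A routine must base its next request on the *actual* values:

```cpp
#include <cstdint>

// Hypothetical sketch of per-frame dynamic metadata: each capture result
// carries both the requested and the actually-used sensor parameters.
struct SensorSettings {
    int64_t exposureTimeNs;
    int64_t frameDurationNs;
    int32_t sensitivityIso;
};

struct FrameResult {
    uint32_t frameNumber;
    int64_t  timestampNs;      // start of exposure
    SensorSettings requested;  // what the app asked for
    SensorSettings actual;     // what the hardware really used
};

// A custom 3A loop (e.g. metering an HDR burst) derives the next request
// from `actual`, since `requested` may have been clamped or rounded.
SensorSettings nextHdrSettings(const FrameResult& latest, double exposureScale) {
    SensorSettings next = latest.actual;
    next.exposureTimeNs =
        static_cast<int64_t>(next.exposureTimeNs * exposureScale);
    return next;
}
```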
<h2 id="per-setting">Per-setting control</h2>
-<p> For most settings, the expectation is that they can be changed every frame,
- without introducing significant stutter or delay to the output frame stream.
- Ideally, the output frame rate should solely be controlled by the capture
- request's frame duration field, and be independent of any changes to processing
- blocks' configuration. In reality, some specific controls are known to be slow
- to change; these include the output resolution and output format of the camera
- pipeline, as well as controls that affect physical devices, such as lens focus
+<p>For most settings, the expectation is that they can be changed every frame,
+ without introducing significant stutter or delay to the output frame stream.
+ Ideally, the output frame rate should solely be controlled by the capture
+ request's frame duration field, and be independent of any changes to processing
+ blocks' configuration. In reality, some specific controls are known to be slow
+ to change; these include the output resolution and output format of the camera
+ pipeline, as well as controls that affect physical devices, such as lens focus
distance. The exact requirements for each control set are detailed later.</p>
<h2 id="raw-sensor">Raw sensor data support</h2>
-<p>In addition to the pixel formats supported by
- the old API, the new API adds a requirement for support for raw sensor data
+<p>In addition to the pixel formats supported by
+ the old API, the new API adds a requirement for support for raw sensor data
   (Bayer RAW), both for advanced camera applications and to support raw
image files.</p>
diff --git a/en/devices/camera/camera3_requests_hal.html b/en/devices/camera/camera3_requests_hal.html
index 71449af6..314082a2 100644
--- a/en/devices/camera/camera3_requests_hal.html
+++ b/en/devices/camera/camera3_requests_hal.html
@@ -24,57 +24,56 @@
<h2 id="requests">Requests</h2>
-<p> The app framework issues requests for captured results to the camera subsystem.
- One request corresponds to one set of results. A request encapsulates all
- configuration information about the capturing and processing of those results.
- This includes things such as resolution and pixel format; manual sensor, lens,
- and flash control; 3A operating modes; RAW to YUV processing control; and
- statistics generation. This allows for much more control over the results'
- output and processing. Multiple requests can be in flight at once, and
- submitting requests is non-blocking. And the requests are always processed in
- the order they are received.<br/>
+<p> The app framework issues requests for captured results to the camera subsystem.
+ One request corresponds to one set of results. A request encapsulates all
+ configuration information about the capturing and processing of those results.
+ This includes things such as resolution and pixel format; manual sensor, lens,
+ and flash control; 3A operating modes; RAW to YUV processing control; and
+ statistics generation. This allows for much more control over the results'
+ output and processing. Multiple requests can be in flight at once, and
+  submitting requests is non-blocking. Requests are always processed in
+  the order they are received.</p>
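The request model described above (non-blocking submission, multiple requests in flight, strictly in-order completion) can be sketched as a FIFO pipeline. `RequestPipeline` and its members are hypothetical names for illustration:

```cpp
#include <cstddef>
#include <cstdint>
#include <queue>

// Hypothetical sketch of the request model: submission enqueues without
// blocking, several requests may be in flight at once, and results come
// back in the same order the requests were submitted.
struct CaptureRequest { uint32_t frameNumber; };
struct CaptureResult  { uint32_t frameNumber; };

class RequestPipeline {
public:
    void submit(const CaptureRequest& req) {  // non-blocking enqueue
        inFlight_.push(req);
    }
    // The HAL completes the oldest outstanding request first (FIFO).
    CaptureResult completeOldest() {
        CaptureRequest req = inFlight_.front();
        inFlight_.pop();
        return CaptureResult{req.frameNumber};
    }
    std::size_t inFlightCount() const { return inFlight_.size(); }

private:
    std::queue<CaptureRequest> inFlight_;
};
```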
<img src="images/camera_model.png" alt="Camera request model" id="figure1" />
<p class="img-caption">
<strong>Figure 1.</strong> Camera model
</p>
<h2 id="hal-subsystem">The HAL and camera subsystem</h2>
-<p> The camera subsystem includes the implementations for components in the camera
- pipeline such as the 3A algorithm and processing controls. The camera HAL
- provides interfaces for you to implement your versions of these components. To
- maintain cross-platform compatibility between multiple device manufacturers and
- Image Signal Processor (ISP, or camera sensor) vendors, the camera pipeline
- model is virtual and does not directly correspond to any real ISP. However, it
- is similar enough to real processing pipelines so that you can map it to your
- hardware efficiently. In addition, it is abstract enough to allow for multiple
- different algorithms and orders of operation without compromising either
- quality, efficiency, or cross-device compatibility.<br/>
- The camera pipeline also supports triggers that the app framework can initiate
- to turn on things such as auto-focus. It also sends notifications back to the
- app framework, notifying apps of events such as an auto-focus lock or errors.<br/>
+<p> The camera subsystem includes the implementations for components in the camera
+ pipeline such as the 3A algorithm and processing controls. The camera HAL
+ provides interfaces for you to implement your versions of these components. To
+ maintain cross-platform compatibility between multiple device manufacturers and
+ Image Signal Processor (ISP, or camera sensor) vendors, the camera pipeline
+ model is virtual and does not directly correspond to any real ISP. However, it
+ is similar enough to real processing pipelines so that you can map it to your
+ hardware efficiently. In addition, it is abstract enough to allow for multiple
+ different algorithms and orders of operation without compromising either
+ quality, efficiency, or cross-device compatibility.</p>
+<p>The camera pipeline also supports triggers that the app framework can initiate
+ to turn on things such as auto-focus. It also sends notifications back to the
+ app framework, notifying apps of events such as an auto-focus lock or errors.</p>
<img src="images/camera_hal.png" alt="Camera hardware abstraction layer" id="figure2" />
<p class="img-caption">
- <strong>Figure 2.</strong> Camera pipeline
- </p>
- Please note, some image processing blocks shown in the diagram above are not
- well-defined in the initial release.<br/>
- The camera pipeline makes the following assumptions:</p>
+ <strong>Figure 2.</strong> Camera pipeline</p>
+<p>Note that some image processing blocks shown in the diagram above are not
+ well-defined in the initial release. The camera pipeline makes the following
+ assumptions:</p>
<ul>
<li>RAW Bayer output undergoes no processing inside the ISP.</li>
  <li>Statistics are generated based on the raw sensor data.</li>
- <li>The various processing blocks that convert raw sensor data to YUV are in an
+ <li>The various processing blocks that convert raw sensor data to YUV are in an
arbitrary order.</li>
- <li>While multiple scale and crop units are shown, all scaler units share the
- output region controls (digital zoom). However, each unit may have a different
+ <li>While multiple scale and crop units are shown, all scaler units share the
+ output region controls (digital zoom). However, each unit may have a different
output resolution and pixel format.</li>
</ul>
<p><strong>Summary of API use</strong><br/>
- This is a brief summary of the steps for using the Android camera API. See the
- Startup and expected operation sequence section for a detailed breakdown of
+ This is a brief summary of the steps for using the Android camera API. See the
+ Startup and expected operation sequence section for a detailed breakdown of
these steps, including API calls.</p>
<ol>
<li>Listen for and enumerate camera devices.</li>
<li>Open device and connect listeners.</li>
- <li>Configure outputs for target use case (such as still capture, recording,
+  <li>Configure outputs for target use case (such as still capture or
   recording).</li>
<li>Create request(s) for target use case.</li>
<li>Capture/repeat requests and bursts.</li>
@@ -84,13 +83,13 @@
<p><strong>HAL operation summary</strong></p>
<ul>
<li>Asynchronous requests for captures come from the framework.</li>
- <li>HAL device must process requests in order. And for each request, produce
+  <li>The HAL device must process requests in order and, for each request,
   produce output result metadata and one or more output image buffers.</li>
- <li>First-in, first-out for requests and results, and for streams referenced by
+ <li>First-in, first-out for requests and results, and for streams referenced by
subsequent requests. </li>
- <li>Timestamps must be identical for all outputs from a given request, so that the
+ <li>Timestamps must be identical for all outputs from a given request, so that the
framework can match them together if needed. </li>
- <li>All capture configuration and state (except for the 3A routines) is
+ <li>All capture configuration and state (except for the 3A routines) is
encapsulated in the requests and results.</li>
</ul>
<img src="images/camera-hal-overview.png" alt="Camera HAL overview" id="figure3" />
@@ -98,208 +97,110 @@
<strong>Figure 3.</strong> Camera HAL overview
</p>
<h2 id="startup">Startup and expected operation sequence</h2>
-<p>This section contains a detailed explanation of the steps expected when using
- the camera API. Please see <a href="https://android.googlesource.com/platform/hardware/libhardware/+/master/include/hardware/camera3.h">platform/hardware/libhardware/include/hardware/camera3.h</a> for definitions of these structures and methods.</p>
+<p>This section contains a detailed explanation of the steps expected when using
+ the camera API. Please see <a href="https://android.googlesource.com/platform/hardware/interfaces/+/master/camera/">platform/hardware/interfaces/camera/</a> for HIDL interface
+ definitions.</p>
+
+<h3 id="open-camera-device">Enumerating and opening camera devices, and
+creating an active session</h3>
+<ol>
+ <li>After initialization, the framework starts listening for any present
+ camera providers that implement the
+    <code><a href="https://android.googlesource.com/platform/hardware/interfaces/+/master/camera/provider/2.4/ICameraProvider.hal">ICameraProvider</a></code> interface. If such providers
+    are present, the framework tries to establish a connection.</li>
+ <li>The framework enumerates the camera devices via
+ <code>ICameraProvider::getCameraIdList()</code>.</li>
+ <li>The framework instantiates a new <code>ICameraDevice</code> by calling the respective
+ <code>ICameraProvider::getCameraDeviceInterface_VX_X()</code>.</li>
+ <li>The framework calls <code>ICameraDevice::open()</code> to create a new
+    active capture session, <code>ICameraDeviceSession</code>.</li>
+</ol>
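The enumeration and open sequence above can be sketched with mock classes. The real `ICameraProvider`/`ICameraDevice` are HIDL interfaces, not these types; everything below is a hypothetical stand-in that only mirrors the call order:

```cpp
#include <memory>
#include <string>
#include <vector>

// Hypothetical mocks of the provider/device/session chain; names and
// signatures are simplified stand-ins, not the real HIDL interfaces.
struct CameraDeviceSession { std::string cameraId; };

struct CameraDevice {
    std::string id;
    std::unique_ptr<CameraDeviceSession> open() {
        return std::make_unique<CameraDeviceSession>(CameraDeviceSession{id});
    }
};

struct CameraProvider {
    std::vector<std::string> getCameraIdList() { return {"0", "1"}; }
    CameraDevice getCameraDeviceInterface(const std::string& id) {
        return CameraDevice{id};
    }
};

// Framework-side flow: enumerate IDs, instantiate a device, open a session.
std::unique_ptr<CameraDeviceSession> openFirstCamera(CameraProvider& provider) {
    auto ids = provider.getCameraIdList();
    if (ids.empty()) return nullptr;
    CameraDevice device = provider.getCameraDeviceInterface(ids.front());
    return device.open();
}
```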
+
+<h3 id="use-active-session">Using an active camera session</h3>
+
<ol>
- <li>Framework calls camera_module_t-&gt;common.open(), which returns a
- hardware_device_t structure.</li>
- <li>Framework inspects the hardware_device_t-&gt;version field, and instantiates the
- appropriate handler for that version of the camera hardware device. In case
- the version is CAMERA_DEVICE_API_VERSION_3_0, the device is cast to a
- camera3_device_t.</li>
- <li>Framework calls camera3_device_t-&gt;ops-&gt;initialize() with the framework
- callback function pointers. This will only be called this one time after
- open(), before any other functions in the ops structure are called.</li>
- <li>The framework calls camera3_device_t-&gt;ops-&gt;configure_streams() with a list of
- input/output streams to the HAL device.</li>
- <li>The framework allocates gralloc buffers and calls
- camera3_device_t-&gt;ops-&gt;register_stream_buffers() for at least one of the
- output streams listed in configure_streams. The same stream is registered
- only once.</li>
- <li>The framework requests default settings for some number of use cases with
- calls to camera3_device_t-&gt;ops-&gt;construct_default_request_settings(). This
- may occur any time after step 3.</li>
- <li>The framework constructs and sends the first capture request to the HAL with
- settings based on one of the sets of default settings, and with at least one
- output stream that has been registered earlier by the framework. This is sent
- to the HAL with camera3_device_t-&gt;ops-&gt;process_capture_request(). The HAL
- must block the return of this call until it is ready for the next request to
- be sent.</li>
- <li>The framework continues to submit requests, and possibly call
- register_stream_buffers() for not-yet-registered streams, and call
- construct_default_request_settings to get default settings buffers for other
- use cases.</li>
- <li>When the capture of a request begins (sensor starts exposing for the
- capture), the HAL calls camera3_callback_ops_t-&gt;notify() with the SHUTTER
- event, including the frame number and the timestamp for start of exposure.
- This notify call must be made before the first call to
- process_capture_result() for that frame number.</li>
- <li>After some pipeline delay, the HAL begins to return completed captures to
- the framework with camera3_callback_ops_t-&gt;process_capture_result(). These
- are returned in the same order as the requests were submitted. Multiple
- requests can be in flight at once, depending on the pipeline depth of the
+ <li>The framework calls <code>ICameraDeviceSession::configureStreams()</code>
+ with a list of input/output streams to the HAL device.</li>
+ <li>The framework requests default settings for some use cases with
+ calls to <code>ICameraDeviceSession::constructDefaultRequestSettings()</code>.
+ This may occur at any time after the <code>ICameraDeviceSession</code> is
+ created by <code>ICameraDevice::open</code>.
+ </li>
+ <li>The framework constructs and sends the first capture request to the HAL with
+ settings based on one of the sets of default settings, and with at least one
+ output stream that has been registered earlier by the framework. This is sent
+ to the HAL with <code>ICameraDeviceSession::processCaptureRequest()</code>.
+ The HAL must block the return of this call until it is ready for the next
+ request to be sent.</li>
+ <li>The framework continues to submit requests and calls
+ <code>ICameraDeviceSession::constructDefaultRequestSettings()</code> to get
+ default settings buffers for other use cases as necessary.</li>
+ <li>When the capture of a request begins (sensor starts exposing for the
+ capture), the HAL calls <code>ICameraDeviceCallback::notify()</code> with
+ the SHUTTER message, including the frame number and the timestamp for start
+ of exposure. This notify callback does not have to happen before the first
+ <code>processCaptureResult()</code> call for a request, but no results are
+ delivered to an application for a capture until after
+ <code>notify()</code> for that capture is called.
+ </li>
+ <li>After some pipeline delay, the HAL begins to return completed captures to
+ the framework with <code>ICameraDeviceCallback::processCaptureResult()</code>.
+ These are returned in the same order as the requests were submitted. Multiple
+ requests can be in flight at once, depending on the pipeline depth of the
camera HAL device.</li>
- <li>After some time, the framework may stop submitting new requests, wait for
- the existing captures to complete (all buffers filled, all results
- returned), and then call configure_streams() again. This resets the camera
- hardware and pipeline for a new set of input/output streams. Some streams
- may be reused from the previous configuration; if these streams' buffers had
- already been registered with the HAL, they will not be registered again. The
- framework then continues from step 7, if at least one registered output
- stream remains. (Otherwise, step 5 is required first.)</li>
- <li>Alternatively, the framework may call camera3_device_t-&gt;common-&gt;close() to
- end the camera session. This may be called at any time when no other calls
- from the framework are active, although the call may block until all
- in-flight captures have completed (all results returned, all buffers
- filled). After the close call returns, no more calls to the
- camera3_callback_ops_t functions are allowed from the HAL. Once the close()
- call is underway, the framework may not call any other HAL device functions.</li>
- <li>In case of an error or other asynchronous event, the HAL must call
- camera3_callback_ops_t-&gt;notify() with the appropriate error/event message.
- After returning from a fatal device-wide error notification, the HAL should
- act as if close() had been called on it. However, the HAL must either cancel
- or complete all outstanding captures before calling notify(), so that once
- notify() is called with a fatal error, the framework will not receive
- further callbacks from the device. Methods besides close() should return
- -ENODEV or NULL after the notify() method returns from a fatal error
- message.</li>
</ol>
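The SHUTTER/result ordering rule in the list above can be sketched as a small gate: a result may arrive before or after `notify()`, but nothing is released to the application until the shutter for that frame has been notified. `ResultGate` is a hypothetical name for illustration:

```cpp
#include <cstdint>
#include <map>
#include <vector>

// Hypothetical sketch of the delivery rule: hold capture results until the
// SHUTTER notification (with the start-of-exposure timestamp) has arrived
// for that frame number.
class ResultGate {
public:
    void onShutterNotify(uint32_t frame, int64_t timestampNs) {
        shutterTs_[frame] = timestampNs;
    }
    // Returns true if the result can be delivered to the application now.
    bool onCaptureResult(uint32_t frame) {
        if (shutterTs_.count(frame)) return true;  // shutter already notified
        pending_.push_back(frame);                 // hold until notify arrives
        return false;
    }
    bool isPending(uint32_t frame) const {
        for (uint32_t f : pending_) {
            if (f == frame) return true;
        }
        return false;
    }

private:
    std::map<uint32_t, int64_t> shutterTs_;
    std::vector<uint32_t> pending_;
};
```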
-<img src="images/camera-ops-flow.png" width="600" height="434" alt="Camera operations flow" id="figure4" />
+
+<p>After some time, one of the following will occur:</p>
+ <ul>
+ <li>The framework may stop submitting new requests, wait for
+ the existing captures to complete (all buffers filled, all results
+ returned), and then call <code>ICameraDeviceSession::configureStreams()</code>
+ again. This resets the camera hardware and pipeline for a new set of
+ input/output streams. Some streams may be reused from the previous
+ configuration. The framework then continues from the first capture request
+ to the HAL, if at least one
+ registered output stream remains. (Otherwise,
+ <code>ICameraDeviceSession::configureStreams()</code> is required first.)</li>
+ <li>The framework may call <code>ICameraDeviceSession::close()</code>
+ to end the camera session. This may be called at any time when no other calls
+ from the framework are active, although the call may block until all
+ in-flight captures have completed (all results returned, all buffers
+ filled). After the <code>close()</code> call returns, no more calls to
+ <code>ICameraDeviceCallback</code> are allowed from the HAL. Once the
+ <code>close()</code> call is underway, the framework may not call any other
+ HAL device functions.</li>
+ <li>In case of an error or other asynchronous event, the HAL must call
+ <code>ICameraDeviceCallback::notify()</code> with the appropriate
+ error/event message.
+ After returning from a fatal device-wide error notification, the HAL should
+ act as if <code>close()</code> had been called on it. However, the HAL must
+ either cancel
+ or complete all outstanding captures before calling <code>notify()</code>,
+ so that once
+ <code>notify()</code> is called with a fatal error, the framework will not
+ receive further callbacks from the device. Methods besides
+ <code>close()</code> should return
+ -ENODEV or NULL after the <code>notify()</code> method returns from a fatal
+ error message.</li>
+ </ul>
+<img src="images/camera-ops-flow.png" alt="Camera operations flow" id="figure4" width="485"/>
<p class="img-caption">
<strong>Figure 4.</strong> Camera operational flow
</p>
-<h2 id="ops-modes">Operational modes</h2>
-<p>The camera 3 HAL device can implement one of two possible operational modes:
- limited and full. Full support is expected from new higher-end devices. Limited
- mode has hardware requirements roughly in line with those for a camera HAL
- device v1 implementation, and is expected from older or inexpensive devices.
- Full is a strict superset of limited, and they share the same essential
- operational flow, as documented above.</p>
-<p>The HAL must indicate its level of support with the
- android.info.supportedHardwareLevel static metadata entry, with 0 indicating
- limited mode, and 1 indicating full mode support.</p>
-<p>Roughly speaking, limited-mode devices do not allow for application control of
- capture settings (3A control only), high-rate capture of high-resolution images,
- raw sensor readout, or support for YUV output streams above maximum recording
- resolution (JPEG only for large images).<br/>
- Here are the details of limited-mode behavior:</p>
-<ul>
- <li>Limited-mode devices do not need to implement accurate synchronization between
- capture request settings and the actual image data captured. Instead, changes
- to settings may take effect some time in the future, and possibly not for the
- same output frame for each settings entry. Rapid changes in settings may
- result in some settings never being used for a capture. However, captures that
- include high-resolution output buffers ( &gt; 1080p ) have to use the settings as
- specified (but see below for processing rate).</li>
- <li>Captures in limited mode that include high-resolution (&gt; 1080p) output buffers
- may block in process_capture_request() until all the output buffers have been
- filled. A full-mode HAL device must process sequences of high-resolution
- requests at the rate indicated in the static metadata for that pixel format.
- The HAL must still call process_capture_result() to provide the output; the
- framework must simply be prepared for process_capture_request() to block until
- after process_capture_result() for that request completes for high-resolution
- captures for limited-mode devices.</li>
- <li>Limited-mode devices do not need to support most of the settings/result/static
- info metadata. Only the following settings are expected to be consumed or
- produced by a limited-mode HAL device:
- <ul>
- <li>android.control.aeAntibandingMode (controls)</li>
- <li>android.control.aeExposureCompensation (controls)</li>
- <li>android.control.aeLock (controls)</li>
- <li>android.control.aeMode (controls)</li>
- <li>[OFF means ON_FLASH_TORCH]</li>
- <li>android.control.aeRegions (controls)</li>
- <li>android.control.aeTargetFpsRange (controls)</li>
- <li>android.control.afMode (controls)</li>
- <li>[OFF means infinity focus]</li>
- <li>android.control.afRegions (controls)</li>
- <li>android.control.awbLock (controls)</li>
- <li>android.control.awbMode (controls)</li>
- <li>[OFF not supported]</li>
- <li>android.control.awbRegions (controls)</li>
- <li>android.control.captureIntent (controls)</li>
- <li>android.control.effectMode (controls)</li>
- <li>android.control.mode (controls)</li>
- <li>[OFF not supported]</li>
- <li>android.control.sceneMode (controls)</li>
- <li>android.control.videoStabilizationMode (controls)</li>
- <li>android.control.aeAvailableAntibandingModes (static)</li>
- <li>android.control.aeAvailableModes (static)</li>
- <li>android.control.aeAvailableTargetFpsRanges (static)</li>
- <li>android.control.aeCompensationRange (static)</li>
- <li>android.control.aeCompensationStep (static)</li>
- <li>android.control.afAvailableModes (static)</li>
- <li>android.control.availableEffects (static)</li>
- <li>android.control.availableSceneModes (static)</li>
- <li>android.control.availableVideoStabilizationModes (static)</li>
- <li>android.control.awbAvailableModes (static)</li>
- <li>android.control.maxRegions (static)</li>
- <li>android.control.sceneModeOverrides (static)</li>
- <li>android.control.aeRegions (dynamic)</li>
- <li>android.control.aeState (dynamic)</li>
- <li>android.control.afMode (dynamic)</li>
- <li>android.control.afRegions (dynamic)</li>
- <li>android.control.afState (dynamic)</li>
- <li>android.control.awbMode (dynamic)</li>
- <li>android.control.awbRegions (dynamic)</li>
- <li>android.control.awbState (dynamic)</li>
- <li>android.control.mode (dynamic)</li>
- <li>android.flash.info.available (static)</li>
- <li>android.info.supportedHardwareLevel (static)</li>
- <li>android.jpeg.gpsCoordinates (controls)</li>
- <li>android.jpeg.gpsProcessingMethod (controls)</li>
- <li>android.jpeg.gpsTimestamp (controls)</li>
- <li>android.jpeg.orientation (controls)</li>
- <li>android.jpeg.quality (controls)</li>
- <li>android.jpeg.thumbnailQuality (controls)</li>
- <li>android.jpeg.thumbnailSize (controls)</li>
- <li>android.jpeg.availableThumbnailSizes (static)</li>
- <li>android.jpeg.maxSize (static)</li>
- <li>android.jpeg.gpsCoordinates (dynamic)</li>
- <li>android.jpeg.gpsProcessingMethod (dynamic)</li>
- <li>android.jpeg.gpsTimestamp (dynamic)</li>
- <li>android.jpeg.orientation (dynamic)</li>
- <li>android.jpeg.quality (dynamic)</li>
- <li>android.jpeg.size (dynamic)</li>
- <li>android.jpeg.thumbnailQuality (dynamic)</li>
- <li>android.jpeg.thumbnailSize (dynamic)</li>
- <li>android.lens.info.minimumFocusDistance (static)</li>
- <li>android.request.id (controls)</li>
- <li>android.request.id (dynamic)</li>
- <li>android.scaler.cropRegion (controls)</li>
- <li>[ignores (x,y), assumes center-zoom]</li>
- <li>android.scaler.availableFormats (static)</li>
- <li>[RAW not supported]</li>
- <li>android.scaler.availableJpegMinDurations (static)</li>
- <li>android.scaler.availableJpegSizes (static)</li>
- <li>android.scaler.availableMaxDigitalZoom (static)</li>
- <li>android.scaler.availableProcessedMinDurations (static)</li>
- <li>android.scaler.availableProcessedSizes (static)</li>
- <li>[full resolution not supported]</li>
- <li>android.scaler.maxDigitalZoom (static)</li>
- <li>android.scaler.cropRegion (dynamic)</li>
- <li>android.sensor.orientation (static)</li>
- <li>android.sensor.timestamp (dynamic)</li>
- <li>android.statistics.faceDetectMode (controls)</li>
- <li>android.statistics.info.availableFaceDetectModes (static)</li>
- <li>android.statistics.faceDetectMode (dynamic)</li>
- <li>android.statistics.faceIds (dynamic)</li>
- <li>android.statistics.faceLandmarks (dynamic)</li>
- <li>android.statistics.faceRectangles (dynamic)</li>
- <li>android.statistics.faceScores (dynamic)</li>
- </ul>
- </li>
-</ul>
+<h2 id="hardware-levels">Hardware levels</h2>
+<p>Camera devices can implement several hardware levels depending on their
+ capabilities. For more information, see
+ <a href="https://developer.android.com/reference/android/hardware/camera2/CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL">supported hardware level</a>.</p>
<h2 id="interaction">Interaction between the application capture request, 3A
control, and the processing pipeline</h2>
-<p>Depending on the settings in the 3A control block, the camera pipeline ignores
- some of the parameters in the application's capture request and uses the values
- provided by the 3A control routines instead. For example, when auto-exposure is
- active, the exposure time, frame duration, and sensitivity parameters of the
- sensor are controlled by the platform 3A algorithm, and any app-specified values
- are ignored. The values chosen for the frame by the 3A routines must be reported
- in the output metadata. The following table describes the different modes of the
- 3A control block and the properties that are controlled by these modes. See
+<p>Depending on the settings in the 3A control block, the camera pipeline ignores
+ some of the parameters in the application's capture request and uses the values
+ provided by the 3A control routines instead. For example, when auto-exposure is
+ active, the exposure time, frame duration, and sensitivity parameters of the
+ sensor are controlled by the platform 3A algorithm, and any app-specified values
+ are ignored. The values chosen for the frame by the 3A routines must be reported
+ in the output metadata. The following table describes the different modes of the
+ 3A control block and the properties that are controlled by these modes. See
the <a href="https://android.googlesource.com/platform/system/media/+/master/camera/docs/docs.html">platform/system/media/camera/docs/docs.html</a> file for definitions of these properties.</p>
<table>
<tr>
@@ -382,51 +283,49 @@ control, and the processing pipeline</h2>
<td>Can override all parameters listed above. Individual 3A controls are disabled.</td>
</tr>
</table>
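The override behavior in the table above can be sketched as a simple selection: with auto-exposure active, the platform 3A values win and are what must be reported in the output metadata. The types below are hypothetical illustrations, not real metadata entries:

```cpp
#include <cstdint>

// Hypothetical sketch of the 3A override rule: when AE is on, app-specified
// sensor parameters are ignored in favor of the 3A-chosen values; the
// returned values are what goes into the output metadata either way.
struct ExposureParams {
    int64_t exposureTimeNs;
    int32_t sensitivityIso;
};

enum class AeMode { kOff, kOn };

ExposureParams resolveExposure(AeMode mode,
                               const ExposureParams& appRequested,
                               const ExposureParams& aeChosen) {
    return (mode == AeMode::kOn) ? aeChosen : appRequested;
}
```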
-<p>The controls exposed for the 3A algorithm mostly map 1:1 to the old API's
- parameters (such as exposure compensation, scene mode, or white balance mode).<br/>
- The controls in the Image Processing block in Figure 2</a> all
- operate on a similar principle, and generally each block has three modes:</p>
+<p>The controls in the Image Processing block in Figure 2 all operate on a
+ similar principle, and generally each block has three modes:</p>
<ul>
- <li>OFF: This processing block is disabled. The demosaic, color correction, and
+ <li>OFF: This processing block is disabled. The demosaic, color correction, and
tone curve adjustment blocks cannot be disabled.</li>
- <li>FAST: In this mode, the processing block may not slow down the output frame
- rate compared to OFF mode, but should otherwise produce the best-quality
- output it can given that restriction. Typically, this would be used for
- preview or video recording modes, or burst capture for still images. On some
- devices, this may be equivalent to OFF mode (no processing can be done without
- slowing down the frame rate), and on some devices, this may be equivalent to
+ <li>FAST: In this mode, the processing block may not slow down the output frame
+ rate compared to OFF mode, but should otherwise produce the best-quality
+ output it can given that restriction. Typically, this would be used for
+ preview or video recording modes, or burst capture for still images. On some
+ devices, this may be equivalent to OFF mode (no processing can be done without
+ slowing down the frame rate), and on some devices, this may be equivalent to
HIGH_QUALITY mode (best quality still does not slow down frame rate).</li>
- <li>HIGHQUALITY: In this mode, the processing block should produce the best
- quality result possible, slowing down the output frame rate as needed.
- Typically, this would be used for high-quality still capture. Some blocks
- include a manual control which can be optionally selected instead of FAST or
- HIGHQUALITY. For example, the color correction block supports a color
- transform matrix, while the tone curve adjustment supports an arbitrary global
+ <li>HIGH_QUALITY: In this mode, the processing block should produce the best
+ quality result possible, slowing down the output frame rate as needed.
+ Typically, this would be used for high-quality still capture. Some blocks
+ include a manual control which can be optionally selected instead of FAST or
+ HIGH_QUALITY. For example, the color correction block supports a color
+ transform matrix, while the tone curve adjustment supports an arbitrary global
tone mapping curve.</li>
</ul>
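The distinguishing property of the three modes above is whether a block is allowed to slow the output frame rate, which can be sketched as follows (a hypothetical helper, not a real HAL API):

```cpp
#include <stdexcept>

// Hypothetical sketch of the per-block mode contract: FAST must not reduce
// the output frame rate relative to OFF, while HIGH_QUALITY may trade frame
// rate for quality.
enum class BlockMode { kOff, kFast, kHighQuality };

// Returns whether a block running in `mode` may slow the output frame rate.
bool maySlowFrameRate(BlockMode mode) {
    switch (mode) {
        case BlockMode::kOff:         return false;
        case BlockMode::kFast:        return false;  // must keep full rate
        case BlockMode::kHighQuality: return true;   // quality over rate
    }
    throw std::logic_error("unknown mode");
}
```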
- <p>The maximum frame rate that can be supported by a camera subsystem is a function
+ <p>The maximum frame rate that can be supported by a camera subsystem is a function
of many factors:</p>
<ul>
<li>Requested resolutions of output image streams</li>
- <li>Availability of binning / skipping modes on the imager</li>
+ <li>Availability of binning/skipping modes on the imager</li>
<li>The bandwidth of the imager interface</li>
<li>The bandwidth of the various ISP processing blocks</li>
</ul>
-<p>Since these factors can vary greatly between different ISPs and sensors, the
- camera HAL interface tries to abstract the bandwidth restrictions into as simple
+<p>Since these factors can vary greatly between different ISPs and sensors, the
+ camera HAL interface tries to abstract the bandwidth restrictions into as simple a
model as possible. The model presented has the following characteristics:</p>
<ul>
- <li>The image sensor is always configured to output the smallest resolution
- possible given the application's requested output stream sizes. The smallest
- resolution is defined as being at least as large as the largest requested
+ <li>The image sensor is always configured to output the smallest resolution
+ possible given the application's requested output stream sizes. The smallest
+ resolution is defined as being at least as large as the largest requested
output stream size.</li>
- <li>Since any request may use any or all the currently configured output streams,
- the sensor and ISP must be configured to support scaling a single capture to
+ <li>Since any request may use any or all of the currently configured output streams,
+ the sensor and ISP must be configured to support scaling a single capture to
all the streams at the same time.</li>
- <li>JPEG streams act like processed YUV streams for requests for which they are
- not included; in requests in which they are directly referenced, they act as
+ <li>JPEG streams act like processed YUV streams for requests for which they are
+ not included; in requests in which they are directly referenced, they act as
JPEG streams.</li>
- <li>The JPEG processor can run concurrently to the rest of the camera pipeline but
+ <li>The JPEG processor can run concurrently with the rest of the camera pipeline but
cannot process more than one capture at a time.</li>
</ul>
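The first characteristic of the model (the sensor is configured to the smallest resolution that still covers every requested output stream) can be sketched as follows. The `Size` struct and the function are illustrative assumptions, not HAL structures, and the sketch assumes the sensor mode list is sorted ascending.

```cpp
#include <algorithm>
#include <cassert>
#include <cstdint>
#include <vector>

// Illustrative sketch of the sensor-resolution rule described above: the
// sensor outputs the smallest resolution that is at least as large as the
// largest requested output stream.
struct Size { uint32_t width; uint32_t height; };

Size minimalSensorResolution(const std::vector<Size>& requestedStreams,
                             const std::vector<Size>& sensorModes) {
    // Find the largest requested stream in each dimension.
    uint32_t needW = 0, needH = 0;
    for (const Size& s : requestedStreams) {
        needW = std::max(needW, s.width);
        needH = std::max(needH, s.height);
    }
    // Pick the smallest sensor mode that covers it (modes sorted ascending).
    for (const Size& m : sensorModes) {
        if (m.width >= needW && m.height >= needH) return m;
    }
    return sensorModes.back();  // fall back to the full-resolution mode
}
```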
diff --git a/en/devices/camera/camera3_requests_methods.html b/en/devices/camera/camera3_requests_methods.html
index 6fd429d4..901a236f 100644
--- a/en/devices/camera/camera3_requests_methods.html
+++ b/en/devices/camera/camera3_requests_methods.html
@@ -23,98 +23,22 @@
-<h2 id="request-creation">Request creation and submission</h2>
-<h3 id="default-settings">construct_default_request_settings</h3>
-<p>Create capture settings for standard camera use cases. The device must return a
- settings buffer that is configured to meet the requested use case, which must be
- one of the <code>CAMERA3_TEMPLATE_*</code> enums. All request control fields must be
- included.</p>
-<p>The HAL retains ownership of this structure, but the pointer to the structure
- must be valid until the device is closed. The framework and the HAL may not
- modify the buffer once it is returned by this call. The same buffer may be
- returned for subsequent calls for the same template, or for other templates.</p>
-<h4><strong>Return values</strong></h4>
-<ul>
- <li>Valid metadata: On successful creation of a default settings buffer.</li>
- <li><code>NULL</code>: In case of a fatal error. After this is returned, only the <code>close()</code>
- method can be called successfully by the framework.</li>
-</ul>
-<h3 id="process-request">process_capture_request</h3>
-<p>Send a new capture request to the HAL. The HAL should not return from this call
- until it is ready to accept the next request to process. Only one call to
- <code>process_capture_request()</code> will be made at a time by the framework, and the calls
- will all be from the same thread. The next call to <code>process_capture_request()</code>
- will be made as soon as a new request and its associated buffers are available.
- In a normal preview scenario, this means the function will be called again by
- the framework almost instantly.</p>
-<p>The actual request processing is asynchronous, with the results of capture being
-returned by the HAL through the <code>process_capture_result()</code> call. This call
- requires the result metadata to be available, but output buffers may simply
- provide sync fences to wait on. Multiple requests are expected to be in flight
- at once, to maintain full output frame rate.</p>
-<p>The framework retains ownership of the request structure. It is only guaranteed
- to be valid during this call. The HAL device must make copies of the information
- it needs to retain for the capture processing. The HAL is responsible for
- waiting on and closing the buffers' fences and returning the buffer handles to
- the framework.</p>
-<p>The HAL must write the file descriptor for the input buffer's release sync fence
- into <code>input_buffer</code>-&gt;<code>release_fence</code>, if <code>input_buffer</code> is not <code>NULL</code>. If the HAL
- returns <code>-1</code> for the input buffer release sync fence, the framework is free to
- immediately reuse the input buffer. Otherwise, the framework will wait on the
- sync fence before refilling and reusing the input buffer.</p>
-<h4><strong>Return values</strong></h4>
-<ul>
- <li><code>0</code>: On a successful start to processing the capture request</li>
- <li><code>-EINVAL</code>: If the input is malformed (the settings are <code>NULL</code> when not allowed,
- there are 0 output buffers, etc) and capture processing cannot start. Failures
- during request processing should be handled by calling
- <code>camera3_callback_ops_t.notify()</code>. In case of this error, the framework will
- retain responsibility for the stream buffers' fences and the buffer handles;
- the HAL should not close the fences or return these buffers with
- <code>process_capture_result</code>.</li>
- <li><code>-ENODEV</code>: If the camera device has encountered a serious error. After this
- error is returned, only the <code>close()</code> method can be successfully called by the
- framework.</li>
-</ul>
+<h2 id="default-settings">Default requests</h2>
+<p>To construct default capture requests, call
+ <a href="/reference/hidl/android/hardware/camera/device/3.2/ICameraDeviceSession#constructdefaultrequestsettings">ICameraDeviceSession::constructDefaultRequestSettings()</a>.</p>
+<h2 id="request-submission">Request submission</h2>
+<p>To submit camera capture requests, call
+ <a href="/reference/hidl/android/hardware/camera/device/3.2/ICameraDeviceSession#processcapturerequest">ICameraDeviceSession::processCaptureRequest()</a>.</p>
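The call pattern behind these two methods can be illustrated with a minimal mock. The types below are hedged stand-ins: the real `ICameraDeviceSession` classes are HIDL-generated in `hardware/interfaces/camera`, and the metadata key and template names here are placeholders.

```cpp
#include <cassert>
#include <cstdint>
#include <map>
#include <string>

// Hedged mock of the request flow: build default settings from a use-case
// template, then submit a capture request. Stand-in types only; none of the
// real HIDL fields are modeled.
using Metadata = std::map<std::string, int64_t>;

enum class Template { PREVIEW, STILL_CAPTURE, VIDEO_RECORD };

struct MockSession {
    uint32_t nextFrameNumber = 0;

    // Mirrors constructDefaultRequestSettings(): returns a settings buffer
    // pre-filled for the requested use case (values are placeholders).
    Metadata constructDefaultRequestSettings(Template t) {
        Metadata m;
        m["android.control.captureIntent"] = static_cast<int64_t>(t);
        return m;
    }

    // Mirrors processCaptureRequest(): accepts the settings and returns the
    // frame number the asynchronous capture result is later reported under.
    uint32_t processCaptureRequest(const Metadata& settings) {
        assert(!settings.empty());  // empty settings are rejected
        return nextFrameNumber++;
    }
};
```

Results for each submitted frame number arrive asynchronously through the framework's callback interface; multiple requests are typically in flight at once to sustain full frame rate.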
<h2 id="misc-methods">Miscellaneous methods</h2>
-<h3 id="get-metadata">get_metadata_vendor_tag_ops</h3>
-<p>Get methods to query for vendor extension metadata tag information. The HAL
- should fill in all the vendor tag operation methods, or leave ops unchanged if
- no vendor tags are defined. The definition of <code>vendor_tag_query_ops_t</code> can be
- found in <code>system/media/camera/include/system/camera_metadata.h</code>.</p>
-<h3 id="dump">dump</h3>
-<p>Print out debugging state for the camera device. This will be called by the
- framework when the camera service is asked for a debug dump, which happens when
- using the <code>dumpsys</code> tool, or when capturing a bugreport. The passed-in file
- descriptor can be used to write debugging text using <code>dprintf()</code> or <code>write()</code>. The
- text should be in ASCII encoding only.</p>
+<h3 id="request-result-message-queues">Request/result message queues</h3>
+<p>The IPC overhead of camera capture requests and results can be further reduced
+ using <a href="/devices/architecture/hidl/fmq">fast message queues</a>. Call the
+ <a href="/reference/hidl/android/hardware/camera/device/3.2/ICameraDeviceSession#getcapturerequestmetadataqueue">ICameraDeviceSession::getCaptureRequestMetadataQueue()</a>
+ and
+ <a href="/reference/hidl/android/hardware/camera/device/3.2/ICameraDeviceSession#getcaptureresultmetadataqueue">ICameraDeviceSession::getCaptureResultMetadataQueue()</a>
+ methods to query the corresponding queues.</p>
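Conceptually, each metadata queue is a ring buffer in shared memory that both processes can read and write without a binder transaction per frame. The sketch below is a simplified single-producer/single-consumer illustration of that idea, not the actual FMQ implementation (which lives in `system/libfmq` and also handles cross-process synchronization).

```cpp
#include <cassert>
#include <cstddef>
#include <cstdint>
#include <vector>

// Simplified ring buffer illustrating the idea behind a fast message queue:
// metadata bytes are written into a shared buffer instead of being copied
// through an IPC transaction for every capture.
class MetadataQueueSketch {
public:
    explicit MetadataQueueSketch(size_t capacity) : buf_(capacity) {}

    // Producer side: append bytes if there is room.
    bool write(const std::vector<uint8_t>& data) {
        if (data.size() > buf_.size() - used_) return false;  // queue full
        for (uint8_t b : data) {
            buf_[(head_ + used_) % buf_.size()] = b;
            ++used_;
        }
        return true;
    }

    // Consumer side: pop n bytes, or nothing if fewer are available.
    std::vector<uint8_t> read(size_t n) {
        if (n > used_) return {};
        std::vector<uint8_t> out;
        for (size_t i = 0; i < n; ++i)
            out.push_back(buf_[(head_ + i) % buf_.size()]);
        head_ = (head_ + n) % buf_.size();
        used_ -= n;
        return out;
    }

private:
    std::vector<uint8_t> buf_;
    size_t head_ = 0;
    size_t used_ = 0;
};
```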
<h3 id="flush">flush</h3>
-<p>Flush all currently in-process captures and all buffers in the pipeline on the
- given device. The framework will use this to dump all state as quickly as
- possible in order to prepare for a <code>configure_streams()</code> call.</p>
-<p>No buffers are required to be successfully returned, so every buffer held at the
-time of <code>flush()</code> (whether successfully filled or not) may be returned with
-<code>CAMERA3_BUFFER_STATUS_ERROR</code>. Note the HAL is still allowed to return valid
-(<code>STATUS_OK</code>) buffers during this call, provided they are successfully filled.</p>
-<p>All requests currently in the HAL are expected to be returned as soon as
- possible. Not-in-process requests should return errors immediately. Any
- interruptible hardware blocks should be stopped, and any uninterruptible blocks
- should be waited on.</p>
-<p><code>flush()</code> should only return when there are no more outstanding buffers or
-requests left in the HAL. The framework may call <code>configure_streams</code> (as the HAL
- state is now quiesced) or may issue new requests.</p>
-<p>A <code>flush()</code> call should only take 100ms or less. The maximum time it can take is 1
- second.</p>
-<h4><strong>Version information</strong></h4>
-<p>This is available only if device version &gt;= <code>CAMERA_DEVICE_API_VERSION_3_1</code>.</p>
-<h4><strong>Return values</strong></h4>
-<ul>
- <li><code>0</code>: On a successful flush of the camera HAL.</li>
- <li><code>-EINVAL</code>: If the input is malformed (the device is not valid).</li>
- <li><code>-ENODEV</code>: If the camera device has encountered a serious error. After this
- error is returned, only the <code>close()</code> method can be successfully called by the
- framework.</li>
-</ul>
-
+<p>To flush any pending capture requests, call
+ <a href="/reference/hidl/android/hardware/camera/device/3.2/ICameraDeviceSession#flush">ICameraDeviceSession::flush()</a>.</p>
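The essential flush semantics can be sketched as follows: every in-flight buffer may be returned with an error status so the framework can reconfigure streams quickly, while buffers the pipeline already filled may still come back successfully. The status names and structs below echo the HAL documentation but are illustrative stand-ins, not HAL code.

```cpp
#include <cassert>
#include <cstdint>
#include <vector>

// Hedged sketch of flush(): drain the pipeline as fast as possible rather
// than completing outstanding work.
enum class BufferStatus { OK, ERROR };

struct InFlightBuffer { uint32_t frameNumber; bool filled; };

// Returns the status assigned to each in-flight buffer during a flush.
std::vector<BufferStatus> flushPipeline(std::vector<InFlightBuffer>& inFlight) {
    std::vector<BufferStatus> returned;
    for (const InFlightBuffer& b : inFlight) {
        // Buffers that were already successfully filled may be returned OK;
        // everything else is returned immediately with an error status.
        returned.push_back(b.filled ? BufferStatus::OK : BufferStatus::ERROR);
    }
    inFlight.clear();  // no outstanding buffers remain after flush()
    return returned;
}
```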
</body>
</html>
diff --git a/en/devices/camera/images/ape_fwk_camera2.png b/en/devices/camera/images/ape_fwk_camera2.png
new file mode 100644
index 00000000..33c4471b
--- /dev/null
+++ b/en/devices/camera/images/ape_fwk_camera2.png
Binary files differ
diff --git a/en/devices/camera/images/camera-hal-overview.png b/en/devices/camera/images/camera-hal-overview.png
index 266ce6bd..fe168927 100644
--- a/en/devices/camera/images/camera-hal-overview.png
+++ b/en/devices/camera/images/camera-hal-overview.png
Binary files differ
diff --git a/en/devices/camera/index.html b/en/devices/camera/index.html
index c48e07bd..94a313db 100644
--- a/en/devices/camera/index.html
+++ b/en/devices/camera/index.html
@@ -21,32 +21,113 @@
limitations under the License.
-->
-
-
<img style="float: right; margin: 0px 15px 15px 15px;" src="images/ape_fwk_hal_camera.png" alt="Android Camera HAL icon"/>
<p>Android's camera Hardware Abstraction Layer (HAL) connects the higher level
camera framework APIs in
-<a href="http://developer.android.com/reference/android/hardware/package-summary.html">android.hardware</a>
+<a href="http://developer.android.com/reference/android/hardware/package-summary.html">Camera 2</a>
to your underlying camera driver and hardware. The camera subsystem includes
implementations for camera pipeline components while the camera HAL provides
interfaces for use in implementing your version of these components.</p>
-<p>For the most up-to-date information, refer to the following resources:</p>
+<aside class="note"><strong>Note:</strong> If you are implementing the Camera
+ HAL on Android 8.0 and higher, you must use the HIDL interface. For
+ information on the legacy components, see
+ <a href="#architecture-legacy">Legacy HAL components</a>.</aside>
+
+<h2 id="architecture">Architecture</h2>
+<p>The following figure and list describe the HAL components:</p>
+
+<img src="images/ape_fwk_camera2.png" alt="Android camera architecture" id="figure1" />
+<p class="img-caption"><strong>Figure 1.</strong> Camera architecture</p>
+
+<dl>
+ <dt>Application framework</dt>
+ <dd>At the application framework level is the app's code, which uses the
+ <a href="https://developer.android.com/reference/android/hardware/camera2/package-summary">
+ Camera 2</a>
+ API to interact with the camera hardware. Internally, this code calls
+ corresponding <a href="https://developer.android.com/reference/android/os/Binder.html">Binder</a>
+ interfaces to access the native code that interacts with the camera.</dd>
+ <dt>AIDL</dt>
+ <dd>The binder interface associated with CameraService can be found at
+ <a href="https://android.googlesource.com/platform/frameworks/av/+/master/camera/aidl/android/hardware/ICameraService.aidl">frameworks/av/camera/aidl/android/hardware</a>.
+ The generated code calls the lower level native code to obtain access to the
+ physical camera and returns data that is used to create the
+ <a href="https://developer.android.com/reference/android/hardware/camera2/CameraDevice">
+ CameraDevice</a> and eventually
+ <a href="https://developer.android.com/reference/android/hardware/camera2/CameraCaptureSession.html">CameraCaptureSession</a>
+ objects at the framework level.</dd>
+  <dt>Native framework</dt>
+  <dd>This framework, residing in <code>frameworks/av/</code>, provides a
+ native equivalent to the
+ <a href="https://developer.android.com/reference/android/hardware/camera2/CameraDevice">CameraDevice</a>
+ and
+ <a href="https://developer.android.com/reference/android/hardware/camera2/CameraCaptureSession">CameraCaptureSession</a>
+  classes. See also the <a href="https://developer.android.com/ndk/reference/group/camera">
+ NDK camera2 reference</a>.</dd>
+ <dt>Binder IPC interface</dt>
+ <dd>The IPC binder interface facilitates communication over process boundaries.
+ There are several camera binder classes located in the
+  <code>frameworks/av/camera/aidl/android/hardware</code> directory that
+ call into camera service.
+ <a href="https://android.googlesource.com/platform/frameworks/av/+/master/camera/aidl/android/hardware/ICameraService.aidl">ICameraService</a>
+ is the interface to the camera service;
+ <a href="https://android.googlesource.com/platform/frameworks/av/+/master/camera/aidl/android/hardware/camera2/ICameraDeviceUser.aidl">ICameraDeviceUser</a>
+ is the interface to a specific opened camera device; and
+ <a href="https://android.googlesource.com/platform/frameworks/av/+/master/camera/aidl/android/hardware/ICameraServiceListener.aidl">ICameraServiceListener</a>
+ and
+ <a href="https://android.googlesource.com/platform/frameworks/av/+/master/camera/aidl/android/hardware/camera2/ICameraDeviceCallbacks.aidl">ICameraDeviceCallbacks</a>
+ are the respective CameraService and CameraDevice callbacks to the application
+ framework.</dd>
+ <dt>Camera service</dt>
+ <dd>The camera service, located in
+ <code>frameworks/av/services/camera/libcameraservice/CameraService.cpp</code>,
+ is the actual code that interacts with the HAL.</dd>
+ <dt>HAL</dt>
+ <dd>The hardware abstraction layer defines the standard interface that the
+ camera service calls into and that you must implement to have your camera
+ hardware function correctly.</dd>
+</dl>
+
+<h2 id="implementing">Implementing the HAL</h2>
+<p>The HAL sits between the camera driver and the higher level Android framework
+and defines an interface you must implement so apps can correctly operate the
+camera hardware. Starting with Android 8.0, the Camera HAL interface is part of Project
+<a href="/devices/architecture/treble">Treble</a> and the corresponding
+<a href="/devices/architecture/hidl/">HIDL</a> interfaces are defined in
+<a href="https://android.googlesource.com/platform/hardware/interfaces/+/master/camera/">hardware/interfaces/camera</a>.</p>
+
+<p>A typical binderized HAL must implement the following HIDL interfaces:</p>
<ul>
-<li><a href="https://android.googlesource.com/platform/hardware/libhardware/+/master/include/hardware/camera.h">camera.h</a> source
-file</li>
-<li><a href="https://android.googlesource.com/platform/hardware/libhardware/+/master/include/hardware/camera3.h">camera3.h</a>
-source file</li>
-<li><a href="https://android.googlesource.com/platform/hardware/libhardware/+/master/include/hardware/camera_common.h">camera_common.h</a>
-source file</li>
-<li><a href="https://developer.android.com/reference/android/hardware/camera2/CameraMetadata.html">CameraMetadata</a>
-developer reference</li>
+ <li><a href="/reference/hidl/android/hardware/camera/provider/2.4/ICameraProvider">ICameraProvider</a>:
+ For enumerating individual devices and managing their status.</li>
+ <li><a href="/reference/hidl/android/hardware/camera/device/3.2/ICameraDevice">ICameraDevice</a>:
+ The camera device interface.</li>
+ <li><a href="/reference/hidl/android/hardware/camera/device/3.2/ICameraDeviceSession">ICameraDeviceSession</a>:
+ The active camera device session interface.</li>
</ul>
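These three interfaces form a hierarchy: a provider enumerates camera devices, and an opened device yields an active session. A hedged stand-in sketch of that ownership relationship (the real interfaces are HIDL-generated and carry many more methods; the names and IDs below are placeholders):

```cpp
#include <cassert>
#include <memory>
#include <string>
#include <vector>

// Stand-in sketch of the ICameraProvider -> ICameraDevice ->
// ICameraDeviceSession hierarchy; only the lifecycle shape is modeled.
struct Session {
    bool open = true;
    void close() { open = false; }
};

struct Device {
    std::string id;
    // Opening a device produces the active session used for captures.
    std::shared_ptr<Session> openSession() { return std::make_shared<Session>(); }
};

struct Provider {
    // A provider enumerates the camera devices it manages (IDs are fake).
    std::vector<std::string> getCameraIdList() const { return {"0", "1"}; }
    Device getCameraDevice(const std::string& id) { return Device{id}; }
};
```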
+<p>Reference HIDL implementations are available for
+<a href="https://android.googlesource.com/platform/hardware/interfaces/+/master/camera/provider/2.4/default/CameraProvider.cpp">CameraProvider.cpp</a>,
+<a href="https://android.googlesource.com/platform/hardware/interfaces/+/master/camera/device/3.2/default/CameraDevice.cpp">CameraDevice.cpp</a>,
+and
+<a href="https://android.googlesource.com/platform/hardware/interfaces/+/master/camera/device/3.2/default/CameraDeviceSession.cpp">CameraDeviceSession.cpp</a>.
+The implementation wraps old HALs that still use the
+<a href="https://android.googlesource.com/platform/hardware/libhardware/+/master/include/hardware/camera3.h">legacy API</a>.
+Starting with Android 8.0, Camera HAL implementations must use the HIDL API; use
+of the legacy interface is not supported.</p>
+<p>For more information on Treble and HAL development, see
+<a href="https://source.android.com/devices/architecture/treble#treble-resources">Treble Resources</a>.</p>
-<h2 id="architecture">Architecture</h2>
-<p>The following figure and list describe the HAL components:</p>
+<h2 id="legacy-hal">Legacy HAL components</h2>
+<p>This section describes the architecture of the legacy HAL components and how to
+implement the HAL. Camera HAL implementations on Android 8.0 and higher must use
+the HIDL API described above instead.</p>
+
+<h3 id="architecture-legacy">Architecture (legacy)</h3>
+
+<p>The following figure and list describe the legacy camera HAL components:</p>
<img src="images/ape_fwk_camera.png" alt="Android camera architecture (legacy)" id="figure2" />
<p class="img-caption"><strong>Figure 2.</strong> Legacy camera architecture</p>
@@ -94,7 +175,7 @@ developer reference</li>
display and video recording.</dd>
</dl>
-<h2 id="implementing">Implementing the HAL</h2>
+<h3 id="implementing-legacy">Implementing the HAL (legacy)</h3>
<p>The HAL sits between the camera driver and the higher level Android framework
and defines an interface you must implement so apps can correctly operate the
camera hardware. The HAL interface is defined in the
@@ -122,7 +203,7 @@ These parameters are set with the function pointed to by <code>int
Galaxy Nexus HAL in <code>hardware/ti/omap4xxx/camera</code>.</p>
-<h2 id="configuring">Configuring the shared library</h2>
+<h3 id="configuring">Configuring the shared library</h3>
<p>Set up the Android build system to correctly package the HAL implementation
into a shared library and copy it to the appropriate location by creating an
<code>Android.mk</code> file:</p>
diff --git a/en/devices/camera/versioning.html b/en/devices/camera/versioning.html
index 43795121..03247001 100644
--- a/en/devices/camera/versioning.html
+++ b/en/devices/camera/versioning.html
@@ -26,8 +26,8 @@
<p>This page details version differences in Camera HALs, APIs, and associated
Android Compatibility Test Suite (CTS) tests. It also covers several
architectural changes made to harden and secure the camera framework in Android
-7.0 and the updates vendors must make to support these changes in their camera
-implementations.</p>
+7.0, the switch to Treble in Android 8.0, and the updates vendors must make to
+support these changes in their camera implementations.</p>
<h2 id=glossary>Terminology</h2>
@@ -60,6 +60,20 @@ API1.</dd>
<dt>Camera API2 CTS</dt>
<dd>Additional set of camera CTS tests that run on top of Camera API2.</dd>
+<dt>Treble</dt>
+<dd>Separates the vendor implementation (device-specific, lower-level software
+written by silicon manufacturers) from the Android OS framework via a new
+vendor interface.</dd>
+
+<dt>HIDL</dt>
+<dd><a href="/devices/architecture/hidl/">HAL interface definition language</a>
+introduced with Treble and used to specify the interface between a HAL and
+its users.</dd>
+
+<dt>VTS</dt>
+<dd><a href="/compatibility/vts/">Vendor Test Suite</a> introduced alongside
+Treble.</dd>
+
</dl>
@@ -129,7 +143,7 @@ allow Google Play filtering of Camera API2 camera apps.</p>
<h2 id=cts_requirements>CTS requirements</h2>
-<p>Devices running Android 5.0 and later must pass the Camera API1 CTS, Camera
+<p>Devices running Android 5.0 and higher must pass the Camera API1 CTS, Camera
API2 CTS, and CTS Verifier camera tests.</p>
<p>Devices that do not feature a Camera HAL3.2 implementation and are not
@@ -146,13 +160,19 @@ are bugs already present in the device’s existing Camera HAL, and thus would
be found by existing Camera API1 apps. We do not expect many bugs of this nature
(however, any such bugs must be fixed to pass the Camera API2 CTS tests).</p>
+<h2 id="vts-requirements">VTS requirements</h2>
+<p>Devices running Android 8.0 and higher with binderized HAL implementations must
+pass the Camera
+<a href="/compatibility/vts/">VTS tests</a>.</p>
+
<h2 id=hardening>Camera framework hardening</h2>
<p>To harden media and camera framework security, Android 7.0 moves camera
-service out of mediaserver. Vendors may need to make changes in the camera HAL
-depending on the API and HAL versions in use. The following sections detail
-architectural changes in AP1 and AP2 for HAL1 and HAL3, as well as general
-requirements.</p>
+service out of mediaserver. Starting with Android 8.0, each binderized Camera
+HAL runs in a process separate from camera service. Vendors may need to make
+changes in the camera HAL depending on the API and HAL versions in use. The
+following sections detail architectural changes in AP1 and AP2 for HAL1 and
+HAL3, as well as general requirements.</p>
<h3 id=hardening_api1>Architectural changes for API1</h3>
<p>API1 video recording may assume camera and video encoder live in the same
@@ -204,7 +224,7 @@ recording. Vendors can measure actual impact by running
<code>android.hardware.camera2.cts.PerformanceTest</code> and the Google Camera
App for 120/240 FPS high speed video recording. Devices also require a small
amount of additional RAM to create the new process.</li>
-<li><strong>Pass metadata in video buffers</strong>(<em>HAL1 only</em>). If HAL1
+<li><strong>Pass metadata in video buffers</strong> (<em>HAL1 only</em>). If HAL1
stores metadata instead of real YUV frame data in video buffers, the HAL must
use <code>kMetadataBufferTypeNativeHandleSource</code> as the metadata buffer
type and pass <code>VideoNativeHandleMetadata</code> in video buffers.
@@ -227,7 +247,12 @@ do not encourage replicating the mediaserver's SELinux policies for cameraserver
(as mediaserver and cameraserver generally require different resources in the
system). Cameraserver should have only the permissions needed to perform camera
functions, and any unnecessary camera-related permissions in mediaserver
-should be removed.</p>
+should be removed.</li>
+<li><strong>Separation between Camera HAL and cameraserver</strong>. Android
+8.0 and higher additionally separate the binderized Camera HAL in a process
+different from cameraserver. IPC goes through
+ <a href="/devices/architecture/hidl/">HIDL-defined</a> interfaces.</li>
+</ul>
<h3 id=hardening_validation>Validation</h3>
<p>For all devices that include a camera and run Android 7.0, verify the
@@ -235,6 +260,9 @@ implementation by running Android 7.0 CTS. Although Android 7.0 does not include
new CTS tests that verify camera service changes, existing CTS tests will fail
if you have not made the updates indicated above.</p>
+<p>For all devices that include a camera and run Android 8.0 and higher, verify
+the vendor implementation by running VTS.</p>
+
<h2 id="version-history">Camera HAL version history</h2>
<p>For a list of tests available for evaluating the Android Camera HAL, see the
<a href="/compatibility/cts/camera-hal.html">Camera HAL Testing
@@ -243,11 +271,14 @@ Checklist</a>.</p>
<h3 id="80">Android 8.0</h3>
<p>
-The Android 8.0 release contains these key enhancements to the Camera service:
+The Android 8.0 release introduces Treble. With Treble, vendor Camera HAL
+implementations must be
+<a href="/devices/architecture/hal-types">binderized</a>. Android 8.0 also
+contains these key enhancements to the Camera service:
</p>
<ul>
- <li>Shared surfaces - Enable multiple surfaces sharing the same
+  <li>Shared surfaces: Enables multiple surfaces to share the same
<code>OutputConfiguration</code></li>
<li>System API for custom camera modes</li>
<li><code>onCaptureQueueEmpty</code></li>
@@ -284,9 +315,10 @@ The public camera API defines two operating modes: normal and constrained
high-speed recording. They have fairly different semantics; high-speed mode is
limited to at most two specific outputs at once, etc. Various OEMs have
expressed interest in defining other custom modes for hardware-specific
-capabilities. Under the hood, the mode is just an integer passed to the
-configure_streams. See:
-<code>hardware/libhardware/+/master/include/hardware/camera3.h#1736</code>
+capabilities. Under the hood, the mode is just an integer passed to
+<code>configure_streams</code>. See:
+<a href="https://source.android.com/reference/hidl/android/hardware/camera/device/3.2/ICameraDeviceSession#configurestreams">
+<code>hardware/camera/device/3.2/ICameraDeviceSession#configurestreams</code></a>
</p>
<p>
@@ -305,10 +337,12 @@ their custom camera app use the system API.
The method name is <code><a
href="https://developer.android.com/reference/android/hardware/camera2/CameraCaptureSession.StateCallback.html#onCaptureQueueEmpty(android.hardware.camera2.CameraCaptureSession)">android.hardware.camera2.CameraDevice#createCustomCaptureSession</a></code>.
See:
-<code>frameworks/base/core/java/android/hardware/camera2/CameraDevice.java#797</code>
+<a href="https://android.googlesource.com/platform/frameworks/base/+/master/core/java/android/hardware/camera2/CameraDevice.java#805">
+  <code>frameworks/base/core/java/android/hardware/camera2/CameraDevice.java</code></a></p>
-<p class="note"><strong>Note:</strong> In the Android 8.0 MR1 release, applications must be preinstalled on the system image to access this API.
-</p>
+<aside class="note"><strong>Note:</strong> In the Android 8.1 release,
+ applications must be preinstalled on the system image to access this API.
+</aside>
<h4 id="oncapturequeueempty">onCaptureQueueEmpty</h4>
@@ -321,6 +355,12 @@ appropriately. Generally that's by sending another capture request to the camera
device.
</p>
+<h4 id="camera-hidl-interface">Camera HIDL interface</h4>
+<p>The Camera HIDL interface is a complete overhaul of the Camera HAL interface
+that uses stable HIDL-defined APIs. All features and camera capabilities
+introduced in the most recent legacy versions 3.4 and 2.4 (for the camera
+module) are also part of the HIDL definitions.</p>
+
<h3 id="34">3.4</h3>
<p>Minor additions to supported metadata and changes to data_space support:</p>
@@ -421,7 +461,7 @@ API.</li>
capture, reprocessing of RAW data, etc.</li>
</ul>
-<h3 id="10">1.0</strong></h3>
+<h3 id="10">1.0</h3>
<p>Initial Android camera HAL (Android 4.0) [camera.h]:</p>