path: root/en/devices/audio
author    Billy Lamberta <blamb@google.com>    2017-03-30 12:15:17 -0700
committer Billy Lamberta <blamb@google.com>    2017-04-04 12:05:29 -0700
commit    90dd9129329995537d4097cdd3263e9982997578 (patch)
tree      6eebe8d85002c1264a4184d434e5d1f2ac0f9a48 /en/devices/audio
parent    a3b748b40bab557fb47fe5a48a5bfb642837fb05 (diff)
download  source.android.com-90dd9129329995537d4097cdd3263e9982997578.tar.gz
Docs: Renaming filenames to match new SAC system.
Hoping to keep file history with the initial sync back. Will require a few commits and merges to keep git from getting confused. This commit puts the files in place, the next will sync the changes in place. Add international files for security. Remove old Android build targets and scripts.

Test: None
Change-Id: I900f6bcce3b687ff7e64e1cc657375c0205e256c
Diffstat (limited to 'en/devices/audio')
-rw-r--r--  en/devices/audio/attributes.html  | 257
-rw-r--r--  en/devices/audio/avoiding_pi.html  | 338
-rw-r--r--  en/devices/audio/data_formats.html  | 405
-rw-r--r--  en/devices/audio/debugging.html  | 449
-rw-r--r--  en/devices/audio/images/ape_audio_tv_hdmi_tuner.png  | bin 0 -> 39689 bytes
-rw-r--r--  en/devices/audio/images/ape_audio_tv_tif.png  | bin 0 -> 36202 bytes
-rw-r--r--  en/devices/audio/images/ape_audio_tv_tuner.png  | bin 0 -> 39572 bytes
-rw-r--r--  en/devices/audio/images/ape_fwk_audio.png  | bin 0 -> 77117 bytes
-rw-r--r--  en/devices/audio/images/ape_fwk_hal_audio.png  | bin 0 -> 3573 bytes
-rw-r--r--  en/devices/audio/images/audio_hal.png  | bin 0 -> 124558 bytes
-rw-r--r--  en/devices/audio/images/breadboard.jpg  | bin 0 -> 44525 bytes
-rw-r--r--  en/devices/audio/images/dac.png  | bin 0 -> 40915 bytes
-rw-r--r--  en/devices/audio/images/display.jpg  | bin 0 -> 36519 bytes
-rw-r--r--  en/devices/audio/images/hub.jpg  | bin 0 -> 38207 bytes
-rw-r--r--  en/devices/audio/images/loopback_assembled.jpg  | bin 0 -> 12091 bytes
-rw-r--r--  en/devices/audio/images/loopback_circuit.png  | bin 0 -> 11722 bytes
-rw-r--r--  en/devices/audio/images/medialog_after.png  | bin 0 -> 50785 bytes
-rw-r--r--  en/devices/audio/images/medialog_before.png  | bin 0 -> 8789 bytes
-rw-r--r--  en/devices/audio/images/otg.jpg  | bin 0 -> 30943 bytes
-rw-r--r--  en/devices/audio/images/pcb.jpg  | bin 0 -> 54550 bytes
-rw-r--r--  en/devices/audio/images/round_trip.png  | bin 0 -> 2779 bytes
-rw-r--r--  en/devices/audio/images/round_trip_bar_graph.png  | bin 0 -> 35361 bytes
-rw-r--r--  en/devices/audio/images/round_trip_on_device.png  | bin 0 -> 29734 bytes
-rw-r--r--  en/devices/audio/images/round_trip_via_headset_connector.png  | bin 0 -> 44248 bytes
-rw-r--r--  en/devices/audio/images/venn.png  | bin 0 -> 53129 bytes
-rw-r--r--  en/devices/audio/implement-policy.html  | 446
-rw-r--r--  en/devices/audio/implement-pre-processing.html  | 154
-rw-r--r--  en/devices/audio/implement-shared-library.html  | 95
-rw-r--r--  en/devices/audio/implement.html  | 69
-rw-r--r--  en/devices/audio/index.html  | 122
-rw-r--r--  en/devices/audio/latency.html  | 57
-rw-r--r--  en/devices/audio/latency_app.html  | 180
-rw-r--r--  en/devices/audio/latency_contrib.html  | 220
-rw-r--r--  en/devices/audio/latency_design.html  | 236
-rw-r--r--  en/devices/audio/latency_measure.html  | 239
-rw-r--r--  en/devices/audio/latency_measurements.html  | 474
-rw-r--r--  en/devices/audio/loopback.html  | 58
-rw-r--r--  en/devices/audio/midi.html  | 178
-rw-r--r--  en/devices/audio/midi_arch.html  | 231
-rw-r--r--  en/devices/audio/midi_test.html  | 267
-rw-r--r--  en/devices/audio/src.html  | 118
-rw-r--r--  en/devices/audio/terminology.html  | 803
-rw-r--r--  en/devices/audio/testing_circuit.html  | 94
-rw-r--r--  en/devices/audio/tv.html  | 302
-rw-r--r--  en/devices/audio/usb.html  | 632
-rw-r--r--  en/devices/audio/warmup.html  | 114
46 files changed, 6538 insertions, 0 deletions
diff --git a/en/devices/audio/attributes.html b/en/devices/audio/attributes.html
new file mode 100644
index 00000000..0f4beefe
--- /dev/null
+++ b/en/devices/audio/attributes.html
@@ -0,0 +1,257 @@
+page.title=Audio Attributes
+@jd:body
+
+<!--
+ Copyright 2014 The Android Open Source Project
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
+-->
+<div id="qv-wrapper">
+ <div id="qv">
+ <h2>In this document</h2>
+ <ol id="auto-toc">
+ </ol>
+ </div>
+</div>
+
+<p>Audio players support attributes that define how the audio system handles routing, volume, and
+focus decisions for the specified source. Applications can attach attributes to an audio playback
+(such as music played by a streaming service or a notification for a new email) and then pass the audio
+source attributes to the framework, where the audio system uses the attributes to make mixing
+decisions and to notify applications about the state of the system.</p>
+
+<p class="note"><strong>Note:</strong> Applications can also attach attributes to an audio
+recording (such as audio captured in a video recording), but this functionality is not exposed in
+the public API.</p>
+
+<p>In Android 4.4 and earlier, the framework made mixing decisions using only the audio stream type.
+However, basing such decisions on stream type was too limiting to produce quality output across
+multiple applications and devices. For example, on a mobile device, some applications (such as
+Google Maps) played driving directions on the STREAM_MUSIC stream type; however, on mobile
+devices in projection mode (such as Android Auto), applications cannot mix driving directions with
+other media streams.</p>
+
+<p>Using the <a href="http://developer.android.com/reference/android/media/AudioAttributes.html">
+audio attribute API</a>, applications can now provide the audio system with detailed information
+about a specific audio source:</p>
+
+<ul>
+<li><b>Usage</b>. Specifies why the source is playing and controls routing, focus, and volume
+decisions.</li>
+<li><b>Content type</b>. Specifies what the source is playing (music, movie, speech,
+sonification, unknown).</li>
+<li><b>Flags</b>. Specifies how the source should be played. Includes support for audibility
+enforcement (camera shutter sounds required in some countries) and hardware audio/video
+synchronization.</li>
+</ul>
+
+<p>For dynamics processing, applications must distinguish between movie, music, and speech content.
+Information about the data itself may also matter, such as loudness and peak sample value.</p>
+
+<h2 id="using">Using attributes</h2>
+
+<p>Usage specifies the context in which the stream is used, providing information about why the
+sound is playing and what the sound is used for. Usage information is more expressive than a stream
+type and allows platforms or routing policies to refine volume or routing decisions.</p>
+
+<p>Supply one of the following usage values for any instance:</p>
+
+<ul>
+<li><code>USAGE_UNKNOWN</code></li>
+<li><code>USAGE_MEDIA</code></li>
+<li><code>USAGE_VOICE_COMMUNICATION</code></li>
+<li><code>USAGE_VOICE_COMMUNICATION_SIGNALLING</code></li>
+<li><code>USAGE_ALARM</code></li>
+<li><code>USAGE_NOTIFICATION</code></li>
+<li><code>USAGE_NOTIFICATION_RINGTONE</code></li>
+<li><code>USAGE_NOTIFICATION_COMMUNICATION_INSTANT</code></li>
+<li><code>USAGE_NOTIFICATION_COMMUNICATION_DELAYED</code></li>
+<li><code>USAGE_NOTIFICATION_EVENT</code></li>
+<li><code>USAGE_ASSISTANCE_ACCESSIBILITY</code></li>
+<li><code>USAGE_ASSISTANCE_NAVIGATION_GUIDANCE</code></li>
+<li><code>USAGE_ASSISTANCE_SONIFICATION</code></li>
+<li><code>USAGE_GAME</code></li>
+</ul>
+
+<p>Audio attribute usage values are mutually exclusive. For examples, refer to <code>
+<a href="http://developer.android.com/reference/android/media/AudioAttributes.html#USAGE_MEDIA">
+USAGE_MEDIA</a></code> and <code>
+<a href="http://developer.android.com/reference/android/media/AudioAttributes.html#USAGE_ALARM">
+USAGE_ALARM</a></code> definitions; for exceptions, refer to the <code>
+<a href="http://developer.android.com/reference/android/media/AudioAttributes.Builder.html">
+AudioAttributes.Builder</a></code> definition.</p>
+
+<h2 id="content-type">Content type</h2>
+
+<p>Content type defines what the sound is and expresses the general category of the content such as
+movie, speech, or beep/ringtone. The audio framework uses content type information to selectively
+configure audio post-processing blocks. While supplying the content type is optional, you should
+include type information whenever the content type is known, such as using
+<code>CONTENT_TYPE_MOVIE</code> for a movie streaming service or <code>CONTENT_TYPE_MUSIC</code>
+for a music playback application.</p>
+
+<p>Supply one of the following content type values for any instance:</p>
+
+<ul>
+<li><code>CONTENT_TYPE_UNKNOWN</code> (default)</li>
+<li><code>CONTENT_TYPE_MOVIE</code></li>
+<li><code>CONTENT_TYPE_MUSIC</code></li>
+<li><code>CONTENT_TYPE_SONIFICATION</code></li>
+<li><code>CONTENT_TYPE_SPEECH</code></li>
+</ul>
+
+<p>Audio attribute content type values are mutually exclusive. For details on content types,
+refer to the <a href="http://developer.android.com/reference/android/media/AudioAttributes.html">
+audio attribute API</a>.</p>
+
+<h2 id="flags">Flags</h2>
+
+<p>Flags specify how the audio framework applies effects to the audio playback. Supply one or more
+of the following flags for an instance:</p>
+
+<ul>
+<li><code>FLAG_AUDIBILITY_ENFORCED</code>. Requests that the system ensure the audibility of the
+sound. Use it to address the needs of legacy <code>STREAM_SYSTEM_ENFORCED</code> (such as forcing
+camera shutter sounds).</li>
+<li><code>FLAG_HW_AV_SYNC</code>. Requests that the system select an output stream that supports
+hardware A/V synchronization.</li>
+</ul>
+
+<p>Audio attribute flags are non-exclusive (can be combined). For details on these flags,
+refer to the <a href="http://developer.android.com/reference/android/media/AudioAttributes.html">
+audio attribute API</a>.</p>
+
+<h2 id="example">Example</h2>
+
+<p>In this example, AudioAttributes.Builder defines the AudioAttributes to be used by a new
+AudioTrack instance:</p>
+
+<pre>
+AudioTrack myTrack = new AudioTrack(
+ new AudioAttributes.Builder()
+ .setUsage(AudioAttributes.USAGE_MEDIA)
+ .setContentType(AudioAttributes.CONTENT_TYPE_MUSIC)
+ .build(),
+ myFormat, myBuffSize, AudioTrack.MODE_STREAM, mySession);
+</pre>
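+
+<p>The following expanded sketch (an illustration only, not required API usage) shows one way the
+surrounding variables used above (<code>myFormat</code>, <code>myBuffSize</code>, and
+<code>mySession</code>) might be created; the sample rate, channel mask, and encoding are
+assumptions chosen for the example:</p>
+
+<pre>
+// Illustrative values only; choose parameters appropriate to your content.
+int sampleRate = 44100;
+AudioFormat myFormat = new AudioFormat.Builder()
+        .setSampleRate(sampleRate)
+        .setEncoding(AudioFormat.ENCODING_PCM_16BIT)
+        .setChannelMask(AudioFormat.CHANNEL_OUT_STEREO)
+        .build();
+// Minimum buffer size for this rate, channel mask, and encoding.
+int myBuffSize = AudioTrack.getMinBufferSize(
+        sampleRate, AudioFormat.CHANNEL_OUT_STEREO, AudioFormat.ENCODING_PCM_16BIT);
+// Let the framework generate a new audio session ID.
+int mySession = AudioManager.AUDIO_SESSION_ID_GENERATE;
+</pre>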
+
+<h2 id="compatibility">Compatibility</h2>
+
+<p>Application developers should use audio attributes when creating or updating applications for
+Android 5.0. However, applications are not required to take advantage of attributes; they can
+handle legacy stream types only or remain unaware of attributes (for example, a generic media player
+that doesn't know anything about the content it's playing).
+
+<p>In such cases, the framework maintains backwards compatibility with older devices and Android
+releases by automatically translating legacy audio stream types to audio attributes. However, the
+framework does not enforce or guarantee this mapping across devices, manufacturers, or Android
+releases.</p>
+
+<p>Compatibility mappings:</p>
+
+<table>
+<tr>
+ <th>Android 5.0</th>
+ <th>Android 4.4 and earlier</th>
+</tr>
+<tr>
+ <td>
+ <code>CONTENT_TYPE_SPEECH</code><br />
+ <code>USAGE_VOICE_COMMUNICATION</code>
+ </td>
+ <td>
+ <code>STREAM_VOICE_CALL</code>
+ </td>
+</tr>
+<tr>
+ <td>
+ <code>CONTENT_TYPE_SONIFICATION</code><br />
+ <code>USAGE_ASSISTANCE_SONIFICATION</code>
+ </td>
+ <td>
+ <code>STREAM_SYSTEM</code>
+ </td>
+</tr>
+<tr>
+ <td>
+ <code>CONTENT_TYPE_SONIFICATION</code><br />
+ <code>USAGE_NOTIFICATION_RINGTONE</code>
+ </td>
+ <td>
+ <code>STREAM_RING</code>
+ </td>
+</tr>
+<tr>
+ <td>
+ <code>CONTENT_TYPE_MUSIC</code><br />
+ <code>USAGE_UNKNOWN</code><br />
+ <code>USAGE_MEDIA</code><br />
+ <code>USAGE_GAME</code><br />
+ <code>USAGE_ASSISTANCE_ACCESSIBILITY</code><br />
+ <code>USAGE_ASSISTANCE_NAVIGATION_GUIDANCE</code>
+ </td>
+ <td>
+ <code>STREAM_MUSIC</code>
+ </td>
+</tr>
+<tr>
+ <td>
+ <code>CONTENT_TYPE_SONIFICATION</code><br />
+ <code>USAGE_ALARM</code>
+ </td>
+ <td>
+ <code>STREAM_ALARM</code>
+ </td>
+</tr>
+<tr>
+ <td>
+ <code>CONTENT_TYPE_SONIFICATION</code><br />
+ <code>USAGE_NOTIFICATION</code><br />
+ <code>USAGE_NOTIFICATION_COMMUNICATION_REQUEST</code><br />
+ <code>USAGE_NOTIFICATION_COMMUNICATION_INSTANT</code><br />
+ <code>USAGE_NOTIFICATION_COMMUNICATION_DELAYED</code><br />
+ <code>USAGE_NOTIFICATION_EVENT</code>
+ </td>
+ <td>
+ <code>STREAM_NOTIFICATION</code>
+ </td>
+</tr>
+<tr>
+ <td>
+ <code>CONTENT_TYPE_SPEECH</code>
+ </td>
+ <td>
+ (@hide)<code> STREAM_BLUETOOTH_SCO</code>
+ </td>
+</tr>
+<tr>
+ <td>
+ <code>FLAG_AUDIBILITY_ENFORCED</code>
+ </td>
+ <td>
+ (@hide)<code> STREAM_SYSTEM_ENFORCED</code>
+ </td>
+</tr>
+<tr>
+ <td>
+ <code>CONTENT_TYPE_SONIFICATION</code><br />
+ <code>USAGE_VOICE_COMMUNICATION_SIGNALLING</code>
+ </td>
+ <td>
+ (@hide)<code> STREAM_DTMF</code>
+ </td>
+</tr>
+</table>
+
+<p class="note"><strong>Note:</strong> @hide streams are used internally by the framework but are
+not part of the public API.</p>
diff --git a/en/devices/audio/avoiding_pi.html b/en/devices/audio/avoiding_pi.html
new file mode 100644
index 00000000..602c545b
--- /dev/null
+++ b/en/devices/audio/avoiding_pi.html
@@ -0,0 +1,338 @@
+page.title=Avoiding Priority Inversion
+@jd:body
+
+<!--
+ Copyright 2013 The Android Open Source Project
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
+-->
+<div id="qv-wrapper">
+ <div id="qv">
+ <h2>In this document</h2>
+ <ol id="auto-toc">
+ </ol>
+ </div>
+</div>
+
+<p>
+This article explains how Android's audio system attempts to avoid
+priority inversion,
+and highlights techniques that you can use too.
+</p>
+
+<p>
+These techniques may be useful to developers of high-performance
+audio apps, OEMs, and SoC providers who are implementing an audio
+HAL. Please note that implementing these techniques is not
+guaranteed to prevent glitches or other failures, particularly when
+used outside of the audio context.
+Your results may vary, and you should conduct your own
+evaluation and testing.
+</p>
+
+<h2 id="background">Background</h2>
+
+<p>
+The Android AudioFlinger audio server and AudioTrack/AudioRecord
+client implementation are being re-architected to reduce latency.
+This work started in Android 4.1, and continued with further improvements
+in 4.2, 4.3, 4.4, and 5.0.
+</p>
+
+<p>
+To achieve this lower latency, many changes were needed throughout the system. One
+important change is to assign CPU resources to time-critical
+threads with a more predictable scheduling policy. Reliable scheduling
+allows the audio buffer sizes and counts to be reduced while still
+avoiding underruns and overruns.
+</p>
+
+<h2 id="priorityInversion">Priority inversion</h2>
+
+<p>
+<a href="http://en.wikipedia.org/wiki/Priority_inversion">Priority inversion</a>
+is a classic failure mode of real-time systems,
+where a higher-priority task is blocked for an unbounded time waiting
+for a lower-priority task to release a resource such as (shared
+state protected by) a
+<a href="http://en.wikipedia.org/wiki/Mutual_exclusion">mutex</a>.
+</p>
+
+<p>
+In an audio system, priority inversion typically manifests as a
+<a href="http://en.wikipedia.org/wiki/Glitch">glitch</a>
+(click, pop, dropout),
+<a href="http://en.wikipedia.org/wiki/Max_Headroom_(character)">repeated audio</a>
+when circular buffers
+are used, or delay in responding to a command.
+</p>
+
+<p>
+A common workaround for priority inversion is to increase audio buffer sizes.
+However, this method increases latency and merely hides the problem
+instead of solving it. It is better to understand and prevent priority
+inversion, as seen below.
+</p>
+
+<p>
+In the Android audio implementation, priority inversion is most
+likely to occur in these places, so you should focus your attention on them:
+</p>
+
+<ul>
+
+<li>
+between normal mixer thread and fast mixer thread in AudioFlinger
+</li>
+
+<li>
+between application callback thread for a fast AudioTrack and
+fast mixer thread (they both have elevated priority, but slightly
+different priorities)
+</li>
+
+<li>
+between application callback thread for a fast AudioRecord and
+fast capture thread (similar to previous)
+</li>
+
+<li>
+within the audio Hardware Abstraction Layer (HAL) implementation, e.g. for telephony or echo cancellation
+</li>
+
+<li>
+within the audio driver in the kernel
+</li>
+
+<li>
+between AudioTrack or AudioRecord callback thread and other app threads (this is out of our control)
+</li>
+
+</ul>
+
+<h2 id="commonSolutions">Common solutions</h2>
+
+<p>
+The typical solutions include:
+</p>
+
+<ul>
+
+<li>
+disabling interrupts
+</li>
+
+<li>
+priority inheritance mutexes
+</li>
+
+</ul>
+
+<p>
+Disabling interrupts is not feasible in Linux user space, and does
+not work on symmetric multiprocessor (SMP) systems.
+</p>
+
+
+<p>
+Priority inheritance
+<a href="http://en.wikipedia.org/wiki/Futex">futexes</a>
+(fast user-space mutexes) are available
+in the Linux kernel, but are not currently exposed by the Android C
+runtime library
+<a href="http://en.wikipedia.org/wiki/Bionic_(software)">Bionic</a>.
+They are not used in the audio system because they are relatively heavyweight,
+and because they rely on a trusted client.
+</p>
+
+<h2 id="androidTechniques">Techniques used by Android</h2>
+
+<p>
+Experiments started with "try lock" and lock with timeout. These are
+non-blocking and bounded blocking variants of the mutex lock
+operation. Try lock and lock with timeout worked fairly well but were
+susceptible to a couple of obscure failure modes: the
+server was not guaranteed to be able to access the shared state if
+the client happened to be busy, and the cumulative timeout could
+be too long if there was a long sequence of unrelated locks that
+all timed out.
+</p>
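+
+<p>
+As a minimal sketch (not the AudioFlinger implementation), a bounded-blocking
+"lock with timeout" can be expressed in Java with <code>ReentrantLock</code>;
+the 10 ms timeout is an arbitrary value chosen for the example:
+</p>
+
+<pre>
+import java.util.concurrent.TimeUnit;
+import java.util.concurrent.locks.ReentrantLock;
+
+public class TimedLockExample {
+    private final ReentrantLock stateLock = new ReentrantLock();
+
+    // Attempt a bounded-blocking lock: if the lock is not available within
+    // 10 ms, give up for this cycle instead of blocking for an unbounded time.
+    public boolean updateSharedState() throws InterruptedException {
+        if (!stateLock.tryLock(10, TimeUnit.MILLISECONDS)) {
+            return false;  // could not access shared state in time
+        }
+        try {
+            // ... read or modify the shared state here ...
+            return true;
+        } finally {
+            stateLock.unlock();
+        }
+    }
+}
+</pre>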
+
+
+<p>
+We also use
+<a href="http://en.wikipedia.org/wiki/Linearizability">atomic operations</a>
+such as:
+</p>
+
+<ul>
+<li>increment</li>
+<li>bitwise "or"</li>
+<li>bitwise "and"</li>
+</ul>
+
+<p>
+All of these return the previous value and include the necessary
+SMP barriers. The disadvantage is that they can require unbounded retries.
+In practice, we've found that the retries are not a problem.
+</p>
+
+<p class="note"><strong>Note:</strong> Atomic operations and their interactions with memory barriers
+are notoriously misunderstood and easy to use incorrectly. We include these methods
+here for completeness but recommend you also read the article
+<a href="https://developer.android.com/training/articles/smp.html">
+SMP Primer for Android</a>
+for further information.
+</p>
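+
+<p>
+For illustration only (this is not the platform implementation), the same
+read-modify-write pattern can be sketched in Java with
+<code>java.util.concurrent.atomic</code>; the compare-and-set retry loop shows
+why the number of retries is unbounded in theory:
+</p>
+
+<pre>
+import java.util.concurrent.atomic.AtomicInteger;
+
+public class AtomicFlags {
+    private final AtomicInteger flags = new AtomicInteger();
+
+    // Atomic increment; returns the previous value, like the operations above.
+    public int getAndIncrement() {
+        return flags.getAndIncrement();
+    }
+
+    // Atomic bitwise "or" built from a compare-and-set retry loop.
+    // The loop repeats if another thread updated the value concurrently,
+    // which is why retries are unbounded in the worst case.
+    public int getAndOr(int mask) {
+        int prev;
+        do {
+            prev = flags.get();
+        } while (!flags.compareAndSet(prev, prev | mask));
+        return prev;
+    }
+}
+</pre>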
+
+<p>
+We still have and use most of the above tools, and have recently
+added these techniques:
+</p>
+
+<ul>
+
+<li>
+Use non-blocking single-reader single-writer
+<a href="http://en.wikipedia.org/wiki/Circular_buffer">FIFO queues</a>
+for data.
+</li>
+
+<li>
+Try to
+<i>copy</i>
+state rather than
+<i>share</i>
+state between high- and
+low-priority modules.
+</li>
+
+<li>
+When state does need to be shared, limit the state to the
+maximum-size
+<a href="http://en.wikipedia.org/wiki/Word_(computer_architecture)">word</a>
+that can be accessed atomically in one bus operation
+without retries.
+</li>
+
+<li>
+For complex multi-word state, use a state queue. A state queue
+is basically just a non-blocking single-reader single-writer FIFO
+queue used for state rather than data, except the writer collapses
+adjacent pushes into a single push.
+</li>
+
+<li>
+Pay attention to
+<a href="http://en.wikipedia.org/wiki/Memory_barrier">memory barriers</a>
+for SMP correctness.
+</li>
+
+<li>
+<a href="http://en.wikipedia.org/wiki/Trust,_but_verify">Trust, but verify</a>.
+When sharing
+<i>state</i>
+between processes, don't
+assume that the state is well-formed. For example, check that indices
+are within bounds. This verification isn't needed between threads
+in the same process, between mutual trusting processes (which
+typically have the same UID). It's also unnecessary for shared
+<i>data</i>
+such as PCM audio where a corruption is inconsequential.
+</li>
+
+</ul>
+
+<h2 id="nonBlockingAlgorithms">Non-blocking algorithms</h2>
+
+<p>
+<a href="http://en.wikipedia.org/wiki/Non-blocking_algorithm">Non-blocking algorithms</a>
+have been a subject of much recent study.
+But with the exception of single-reader single-writer FIFO queues,
+we've found them to be complex and error-prone.
+</p>
+
+<p>
+Starting in Android 4.2, you can find our non-blocking,
+single-reader/writer classes in these locations:
+</p>
+
+<ul>
+
+<li>
+frameworks/av/include/media/nbaio/
+</li>
+
+<li>
+frameworks/av/media/libnbaio/
+</li>
+
+<li>
+frameworks/av/services/audioflinger/StateQueue*
+</li>
+
+</ul>
+
+<p>
+These were designed specifically for AudioFlinger and are not
+general-purpose. Non-blocking algorithms are notorious for being
+difficult to debug. You can look at this code as a model. But be
+aware there may be bugs, and the classes are not guaranteed to be
+suitable for other purposes.
+</p>
+
+<p>
+For developers, some of the sample OpenSL ES application code should be updated to
+use non-blocking algorithms or reference a non-Android open source library.
+</p>
+
+<p>
+We have published an example non-blocking FIFO implementation that is specifically designed for
+application code. See these files located in the platform source directory
+<code>system/media/audio_utils</code>:
+</p>
+<ul>
+ <li><a href="https://android.googlesource.com/platform/system/media/+/master/audio_utils/include/audio_utils/fifo.h">include/audio_utils/fifo.h</a></li>
+ <li><a href="https://android.googlesource.com/platform/system/media/+/master/audio_utils/fifo.c">fifo.c</a></li>
+ <li><a href="https://android.googlesource.com/platform/system/media/+/master/audio_utils/include/audio_utils/roundup.h">include/audio_utils/roundup.h</a></li>
+ <li><a href="https://android.googlesource.com/platform/system/media/+/master/audio_utils/roundup.c">roundup.c</a></li>
+</ul>
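+
+<p>
+For readers who want the flavor of such a queue without reading the C
+implementation above, here is a deliberately simplified single-reader,
+single-writer ring buffer sketch in Java. It is an illustration only, not the
+<code>audio_utils</code> FIFO, and it omits details such as index wrap-around
+over very long runs and any blocking policy:
+</p>
+
+<pre>
+import java.util.concurrent.atomic.AtomicInteger;
+
+// Non-blocking single-reader single-writer FIFO sketch.
+// Exactly one thread may call put(), and one other thread may call get().
+public class SpscFifo {
+    private final int[] buffer;          // capacity must be a power of two
+    private final int mask;
+    private final AtomicInteger writeIndex = new AtomicInteger();
+    private final AtomicInteger readIndex = new AtomicInteger();
+
+    public SpscFifo(int capacityPowerOfTwo) {
+        buffer = new int[capacityPowerOfTwo];
+        mask = capacityPowerOfTwo - 1;
+    }
+
+    // Writer thread only. Returns false instead of blocking when full.
+    public boolean put(int value) {
+        int w = writeIndex.get();
+        if (w - readIndex.get() == buffer.length) {
+            return false;                // full
+        }
+        buffer[w &amp; mask] = value;
+        writeIndex.set(w + 1);           // publish only after the data is written
+        return true;
+    }
+
+    // Reader thread only. Returns null instead of blocking when empty.
+    public Integer get() {
+        int r = readIndex.get();
+        if (r == writeIndex.get()) {
+            return null;                 // empty
+        }
+        int value = buffer[r &amp; mask];
+        readIndex.set(r + 1);            // free the slot only after the data is read
+        return value;
+    }
+}
+</pre>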
+
+<h2 id="tools">Tools</h2>
+
+<p>
+To the best of our knowledge, there are no automatic tools for
+finding priority inversion, especially before it happens. Some
+research static code analysis tools are capable of finding priority
+inversions if they can access the entire codebase. Of course, if
+arbitrary user code is involved (as it is here for the application)
+or the codebase is large (as for the Linux kernel and device drivers),
+static analysis may be impractical. The most important thing is to
+read the code very carefully and get a good grasp on the entire
+system and the interactions. Tools such as
+<a href="http://developer.android.com/tools/help/systrace.html">systrace</a>
+and
+<code>ps -t -p</code>
+are useful for seeing priority inversion after it occurs, but do
+not tell you in advance.
+</p>
+
+<h2 id="aFinalWord">A final word</h2>
+
+<p>
+After all of this discussion, don't be afraid of mutexes. Mutexes
+are your friend when used and implemented correctly in ordinary,
+non-time-critical use cases. But between high- and
+low-priority tasks, and in time-sensitive systems, mutexes are more
+likely to cause trouble.
+</p>
diff --git a/en/devices/audio/data_formats.html b/en/devices/audio/data_formats.html
new file mode 100644
index 00000000..b04f85bc
--- /dev/null
+++ b/en/devices/audio/data_formats.html
@@ -0,0 +1,405 @@
+page.title=Data Formats
+@jd:body
+
+<!--
+ Copyright 2015 The Android Open Source Project
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
+-->
+
+<div id="qv-wrapper">
+ <div id="qv">
+ <h2>In this document</h2>
+ <ol id="auto-toc">
+ </ol>
+ </div>
+</div>
+
+<p>
+Android uses a wide variety of audio
+<a href="http://en.wikipedia.org/wiki/Data_format">data formats</a>
+internally, and exposes a subset of these in public APIs,
+<a href="http://en.wikipedia.org/wiki/Audio_file_format">file formats</a>,
+and the
+<a href="https://en.wikipedia.org/wiki/Hardware_abstraction">Hardware Abstraction Layer</a> (HAL).
+</p>
+
+<h2 id="properties">Properties</h2>
+
+<p>
+The audio data formats are classified by their properties:
+</p>
+
+<dl>
+
+ <dt><a href="https://en.wikipedia.org/wiki/Data_compression">Compression</a></dt>
+ <dd>
+ <a href="http://en.wikipedia.org/wiki/Raw_data">Uncompressed</a>,
+ <a href="http://en.wikipedia.org/wiki/Lossless_compression">lossless compressed</a>, or
+ <a href="http://en.wikipedia.org/wiki/Lossy_compression">lossy compressed</a>.
+ PCM is the most common uncompressed audio format. FLAC is a lossless compressed
+ format, while MP3 and AAC are lossy compressed formats.
+ </dd>
+
+ <dt><a href="http://en.wikipedia.org/wiki/Audio_bit_depth">Bit depth</a></dt>
+ <dd>
+ Number of significant bits per audio sample.
+ </dd>
+
+ <dt><a href="https://en.wikipedia.org/wiki/Sizeof">Container size</a></dt>
+ <dd>
+ Number of bits used to store or transmit a sample. Usually
+ this is the same as the bit depth, but sometimes additional
+ padding bits are allocated for alignment. For example, a
+ 24-bit sample could be contained within a 32-bit word.
+ </dd>
+
+ <dt><a href="http://en.wikipedia.org/wiki/Data_structure_alignment">Alignment</a></dt>
+ <dd>
+ If the container size is exactly equal to the bit depth, the
+ representation is called <em>packed</em>. Otherwise the representation is
+ <em>unpacked</em>. The significant bits of the sample are typically
+ aligned with either the leftmost (most significant) or rightmost
+ (least significant) bit of the container. It is conventional to use
+ the terms <em>packed</em> and <em>unpacked</em> only when the bit
+ depth is not a
+ <a href="http://en.wikipedia.org/wiki/Power_of_two">power of two</a>.
+ </dd>
+
+ <dt><a href="http://en.wikipedia.org/wiki/Signedness">Signedness</a></dt>
+ <dd>
+ Whether samples are signed or unsigned.
+ </dd>
+
+ <dt>Representation</dt>
+ <dd>
+ Either fixed point or floating point; see below.
+ </dd>
+
+</dl>
+
+<h2 id="fixed">Fixed point representation</h2>
+
+<p>
+<a href="http://en.wikipedia.org/wiki/Fixed-point_arithmetic">Fixed point</a>
+is the most common representation for uncompressed PCM audio data,
+especially at hardware interfaces.
+</p>
+
+<p>
+A fixed-point number has a fixed (constant) number of digits
+before and after the <a href="https://en.wikipedia.org/wiki/Radix_point">radix point</a>.
+All of our representations use
+<a href="https://en.wikipedia.org/wiki/Binary_number">base 2</a>,
+so we substitute <em>bit</em> for <em>digit</em>,
+and <em>binary point</em> or simply <em>point</em> for <em>radix point</em>.
+The bits to the left of the point are the integer part,
+and the bits to the right of the point are the
+<a href="https://en.wikipedia.org/wiki/Fractional_part">fractional part</a>.
+</p>
+
+<p>
+We speak of <em>integer PCM</em>, because fixed-point values
+are usually stored and manipulated as integer values.
+The interpretation as fixed-point is implicit.
+</p>
+
+<p>
+We use <a href="https://en.wikipedia.org/wiki/Two%27s_complement">two's complement</a>
+for all signed fixed-point representations,
+so the following holds where all values are in units of one
+<a href="https://en.wikipedia.org/wiki/Least_significant_bit">LSB</a>:
+</p>
+<pre>
+|largest negative value| = |largest positive value| + 1
+</pre>
+
+<h3 id="q">Q and U notation</h3>
+
+<p>
+There are various
+<a href="https://en.wikipedia.org/wiki/Fixed-point_arithmetic#Notation">notations</a>
+for fixed-point representation in an integer.
+We use <a href="https://en.wikipedia.org/wiki/Q_(number_format)">Q notation</a>:
+Q<em>m</em>.<em>n</em> means <em>m</em> integer bits and <em>n</em> fractional bits.
+The "Q" counts as one bit, though the value is expressed in two's complement.
+The total number of bits is <em>m</em> + <em>n</em> + 1.
+</p>
+
+<p>
+U<em>m</em>.<em>n</em> is for unsigned numbers:
+<em>m</em> integer bits and <em>n</em> fractional bits,
+and the "U" counts as zero bits.
+The total number of bits is <em>m</em> + <em>n</em>.
+</p>
+
+<p>
+The integer part may be used in the final result, or be temporary.
+In the latter case, the bits that make up the integer part are called
+<em>guard bits</em>. The guard bits permit an intermediate calculation to overflow,
+as long as the final value is within range or can be clamped to be within range.
+Note that fixed-point guard bits are at the left, while floating-point unit
+<a href="https://en.wikipedia.org/wiki/Guard_digit">guard digits</a>
+are used to reduce roundoff error and are on the right.
+</p>
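+
+<p>
+As a small illustrative sketch (not platform code), the following mixes several
+Q0.15 samples in a 32-bit accumulator; the extra high-order bits act as guard
+bits, so intermediate sums may exceed the nominal range and the result is
+clamped only once at the end:
+</p>
+
+<pre>
+// Accumulate 16-bit (Q0.15) samples in a 32-bit int, then clamp to Q0.15.
+public static short mix(short[] samples) {
+    int sum = 0;                         // upper bits serve as guard bits
+    for (short s : samples) {
+        sum += s;                        // may exceed the 16-bit range
+    }
+    return (short) Math.max(-32768, Math.min(32767, sum));
+}
+</pre>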
+
+<h2 id="floating">Floating point representation</h2>
+
+<p>
+<a href="https://en.wikipedia.org/wiki/Floating_point">Floating point</a>
+is an alternative to fixed point, in which the location of the point can vary.
+The primary advantages of floating-point include:
+</p>
+
+<ul>
+ <li>Greater <a href="https://en.wikipedia.org/wiki/Headroom_(audio_signal_processing)">headroom</a>
+ and <a href="https://en.wikipedia.org/wiki/Dynamic_range">dynamic range</a>;
+  floating-point arithmetic tolerates exceeding nominal ranges
+ during intermediate computation, and only clamps values at the end
+ </li>
+ <li>Support for special values such as infinities and NaN</li>
+ <li>Easier to use in many cases</li>
+</ul>
+
+<p>
+Historically, floating-point arithmetic was slower than integer or fixed-point
+arithmetic, but now it is common for floating-point to be faster,
+provided control flow decisions aren't based on the value of a computation.
+</p>
+
+<h2 id="androidFormats">Android formats for audio</h2>
+
+<p>
+The major Android formats for audio are listed in the table below:
+</p>
+
+<table>
+
+<tr>
+ <th></th>
+ <th colspan="5"><center>Notation</center></th>
+</tr>
+
+<tr>
+ <th>Property</th>
+ <th>Q0.15</th>
+ <th>Q0.7 <sup>1</sup></th>
+ <th>Q0.23</th>
+ <th>Q0.31</th>
+ <th>float</th>
+</tr>
+
+<tr>
+ <td>Container<br />bits</td>
+ <td>16</td>
+ <td>8</td>
+ <td>24 or 32 <sup>2</sup></td>
+ <td>32</td>
+ <td>32</td>
+</tr>
+
+<tr>
+ <td>Significant bits<br />including sign</td>
+ <td>16</td>
+ <td>8</td>
+ <td>24</td>
+ <td>24 or 32 <sup>2</sup></td>
+ <td>25 <sup>3</sup></td>
+</tr>
+
+<tr>
+ <td>Headroom<br />in dB</td>
+ <td>0</td>
+ <td>0</td>
+ <td>0</td>
+ <td>0</td>
+ <td>126 <sup>4</sup></td>
+</tr>
+
+<tr>
+ <td>Dynamic range<br />in dB</td>
+ <td>90</td>
+ <td>42</td>
+ <td>138</td>
+ <td>138 to 186</td>
+ <td>900 <sup>5</sup></td>
+</tr>
+
+</table>
+
+<p>
+All fixed-point formats above have a nominal range of -1.0 to +1.0 minus one LSB.
+There is one more negative value than positive value due to the
+two's complement representation.
+</p>
+
+<p>
+Footnotes:
+</p>
+
+<ol>
+
+<li>
+All formats above express signed sample values.
+The 8-bit format is commonly called "unsigned", but
+it is actually a signed value with bias of <code>0.10000000</code>.
+</li>
+
+<li>
+Q0.23 may be packed into 24 bits (three 8-bit bytes), or unpacked
+in 32 bits. If unpacked, the significant bits are either right-justified
+towards the LSB with sign extension padding towards the MSB (Q8.23),
+or left-justified towards the MSB with zero fill towards the LSB
+(Q0.31). Q0.31 theoretically permits up to 32 significant bits,
+but hardware interfaces that accept Q0.31 rarely use all the bits.
+</li>
+
+<li>
+Single-precision floating point has 23 explicit bits plus one hidden bit and sign bit,
+resulting in 25 significant bits total.
+<a href="https://en.wikipedia.org/wiki/Denormal_number">Denormal numbers</a>
+have fewer significant bits.
+</li>
+
+<li>
+Single-precision floating point can express values up to &plusmn;1.7e+38,
+which explains the large headroom.
+</li>
+
+<li>
+The dynamic range shown is for denormals up to the nominal maximum
+value &plusmn;1.0.
+Note that some architecture-specific floating point implementations such as
+<a href="https://en.wikipedia.org/wiki/ARM_architecture#NEON">NEON</a>
+don't support denormals.
+</li>
+
+</ol>
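+
+<p>
+For reference, the dynamic range figures for the integer formats in the table
+are consistent with the formula below, where the significant bits include the
+sign bit:
+</p>
+<pre>
+dynamic range (dB) = 20 * log10(2 ^ (significant bits - 1))
+</pre>
+<p>
+For example, Q0.15 with 16 significant bits gives 20 * log10(2 ^ 15), or
+approximately 90 dB.
+</p>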
+
+<h2 id="conversions">Conversions</h2>
+
+<p>
+This section discusses
+<a href="https://en.wikipedia.org/wiki/Data_conversion">data conversions</a>
+between various representations.
+</p>
+
+<h3 id="floatConversions">Floating point conversions</h3>
+
+<p>
+To convert a value from Q<em>m</em>.<em>n</em> format to floating point:
+</p>
+
+<ol>
+ <li>Convert the value to floating point as if it were an integer (by ignoring the point).</li>
+ <li>Multiply by 2<sup>-<em>n</em></sup>.</li>
+</ol>
+
+<p>
+For example, to convert a Q4.27 internal value to floating point, use:
+</p>
+<pre>
+float = integer * (2 ^ -27)
+</pre>
+
+<p>
+Conversions from floating point to fixed point follow these rules:
+</p>
+
+<ul>
+
+<li>
+Single-precision floating point has a nominal range of &plusmn;1.0,
+but the full range for intermediate values is &plusmn;1.7e+38.
+Conversion between floating point and fixed point for external representation
+(such as output to audio devices) will consider only the nominal range, with
+clamping for values that exceed that range.
+In particular, when +1.0 is converted
+to a fixed-point format, it is clamped to +1.0 minus one LSB (see the sketch after this list).
+</li>
+
+<li>
+Denormals (subnormals) and both +/- 0.0 are allowed in representation,
+but may be silently converted to 0.0 during processing.
+</li>
+
+<li>
+Infinities will either pass through operations or will be silently hard-limited
+to +/- 1.0. Generally the latter is for conversion to a fixed-point format.
+</li>
+
+<li>
+NaN behavior is undefined: a NaN may propagate as an identical NaN, be
+converted to a Default NaN, be silently hard-limited to +/- 1.0, be
+silently converted to 0.0, or result in an error.
+</li>
+
+</ul>
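+
+<p>
+To make the clamping rule concrete, here is a minimal Java sketch (an
+illustration, not the platform converter) of converting a nominal-range float
+sample to Q0.15 without dither or rounding:
+</p>
+
+<pre>
+// Convert a float sample with nominal range -1.0 to +1.0 into Q0.15 (16-bit).
+// Out-of-range values are clamped; note that +1.0 itself clamps to
+// +1.0 minus one LSB (32767).
+public static short floatToQ15(float sample) {
+    float scaled = sample * 32768.0f;    // 2^15
+    scaled = Math.max(-32768.0f, Math.min(32767.0f, scaled));
+    return (short) scaled;               // truncates; no dither or rounding
+}
+</pre>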
+
+<h3 id="fixedConversion">Fixed point conversions</h3>
+
+<p>
+Conversions between different Q<em>m</em>.<em>n</em> formats follow these rules:
+</p>
+
+<ul>
+
+<li>
+When <em>m</em> is increased, sign extend the integer part at left.
+</li>
+
+<li>
+When <em>m</em> is decreased, clamp the integer part.
+</li>
+
+<li>
+When <em>n</em> is increased, zero extend the fractional part at right.
+</li>
+
+<li>
+When <em>n</em> is decreased, either dither, round, or truncate the excess fractional bits at right.
+</li>
+
+</ul>
+
+<p>
+For example, to convert a Q4.27 value to Q0.15 (without dither or
+rounding), right shift the Q4.27 value by 12 bits, and clamp any results
+that exceed the 16-bit signed range. This aligns the point of the
+Q representation.
+</p>
+
+<p>To convert Q7.24 to Q7.23, do a signed divide by 2,
+or equivalently add the sign bit to the Q7.24 integer quantity, and then signed right shift by 1.
+Note that a simple signed right shift is <em>not</em> equivalent to a signed divide by 2.
+</p>
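+
+<p>
+As a hedged sketch (not platform code), the two conversions above can be
+written as:
+</p>
+
+<pre>
+// Q4.27 to Q0.15 without dither or rounding: arithmetic right shift by 12
+// aligns the binary points, then clamp to the 16-bit signed range.
+public static short q4_27ToQ0_15(int q4_27) {
+    int shifted = q4_27 &gt;&gt; 12;     // arithmetic shift preserves the sign
+    return (short) Math.max(-32768, Math.min(32767, shifted));
+}
+
+// Q7.24 to Q7.23: a signed divide by 2, which is not the same as a plain
+// arithmetic right shift for negative values (divide rounds toward zero).
+public static int q7_24ToQ7_23(int q7_24) {
+    return q7_24 / 2;
+}
+</pre>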
+
+<h3 id="lossyConversion">Lossy and lossless conversions</h3>
+
+<p>
+A conversion is <em>lossless</em> if it is
+<a href="https://en.wikipedia.org/wiki/Inverse_function">invertible</a>:
+a conversion from <code>A</code> to <code>B</code> to
+<code>C</code> results in <code>A = C</code>.
+Otherwise the conversion is <a href="https://en.wikipedia.org/wiki/Lossy_data_conversion">lossy</a>.
+</p>
+
+<p>
+Lossless conversions permit
+<a href="https://en.wikipedia.org/wiki/Round-trip_format_conversion">round-trip format conversion</a>.
+</p>
+
+<p>
+Conversions from fixed point representation with 25 or fewer significant bits to floating point are lossless.
+Conversions from floating point to any common fixed point representation are lossy.
+</p>
diff --git a/en/devices/audio/debugging.html b/en/devices/audio/debugging.html
new file mode 100644
index 00000000..3568f4cf
--- /dev/null
+++ b/en/devices/audio/debugging.html
@@ -0,0 +1,449 @@
+page.title=Audio Debugging
+@jd:body
+
+<!--
+ Copyright 2013 The Android Open Source Project
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
+-->
+<div id="qv-wrapper">
+ <div id="qv">
+ <h2>In this document</h2>
+ <ol id="auto-toc">
+ </ol>
+ </div>
+</div>
+
+<p>
+This article describes some tips and tricks for debugging Android audio.
+</p>
+
+<h2 id="teeSink">Tee Sink</h2>
+
+<p>
+The "tee sink" is
+an AudioFlinger debugging feature, available in custom builds only,
+for retaining a short fragment of recent audio for later analysis.
+This permits comparison between what was actually played or recorded
+vs. what was expected.
+</p>
+
+<p>
+For privacy the tee sink is disabled by default, at both compile-time and
+run-time. To use the tee sink, you will need to enable it by re-compiling,
+and also by setting a property. Be sure to disable this feature after you are
+done debugging; the tee sink should not be left enabled in production builds.
+</p>
+
+<p>
+The instructions in the remainder of this section are for Android 5.x and 6.x.
+For Android 7.x, replace <code>/data/misc/media</code> with
+<code>/data/misc/audioserver</code>.
+Additionally, you must use a userdebug or eng build.
+If you use a userdebug build, then disable verity with:</p>
+<pre>
+<code>$ adb root &amp;&amp; adb disable-verity &amp;&amp; adb reboot</code>
+</pre>
+
+<h3 id="compile">Compile-time setup</h3>
+
+<ol>
+<li><code>$ cd frameworks/av/services/audioflinger</code></li>
+<li>Edit <code>Configuration.h</code>.</li>
+<li>Uncomment <code>#define TEE_SINK</code>.</li>
+<li>Re-build <code>libaudioflinger.so</code>.</li>
+<li><code>$ adb root</code></li>
+<li><code>$ adb remount</code></li>
+<li>Push or sync the new <code>libaudioflinger.so</code> to the device's <code>/system/lib</code>.</li>
+</ol>
+
+<h3 id="runtime">Run-time setup</h3>
+
+<ol>
+<li><code>$ adb shell getprop | grep ro.debuggable</code>
+<br />Confirm that the output is: <code>[ro.debuggable]: [1]</code>
+</li>
+<li><code>$ adb shell</code></li>
+<li><code>$ ls -ld /data/misc/media</code>
+<br />
+<p>
+Confirm that the output is:
+</p>
+<pre>
+drwx------ media media ... media
+</pre>
+<br />
+<p>
+If the directory does not exist, create it as follows:
+</p>
+<pre>
+$ mkdir /data/misc/media
+$ chown media:media /data/misc/media
+</pre>
+</li>
+<li><code>$ echo af.tee=# &gt; /data/local.prop</code>
+<br />Where the <code>af.tee</code> value is a number described below.
+</li>
+<li><code>$ chmod 644 /data/local.prop</code></li>
+<li><code>$ reboot</code></li>
+</ol>
+
+<h4>Values for <code>af.tee</code> property</h4>
+
+<p>
+The value of <code>af.tee</code> is a number between 0 and 7, expressing
+the sum of several bits, one per feature.
+See the code at <code>AudioFlinger::AudioFlinger()</code> in <code>AudioFlinger.cpp</code>
+for an explanation of each bit, but briefly:
+</p>
+<ul>
+<li>1 = input</li>
+<li>2 = FastMixer output</li>
+<li>4 = per-track AudioRecord and AudioTrack</li>
+</ul>
+
+<p>
+There is no bit for deep buffer or normal mixer yet,
+but you can get similar results using "4."
+</p>
+
+<h3 id="test">Test and acquire data</h3>
+
+<ol>
+<li>Run your audio test.</li>
+<li><code>$ adb shell dumpsys media.audio_flinger</code></li>
+<li>Look for a line in dumpsys output such as this:<br />
+<code>tee copied to /data/misc/media/20131010101147_2.wav</code>
+<br />This is a PCM .wav file.
+</li>
+<li>Then <code>adb pull</code> any <code>/data/misc/media/*.wav</code> files of interest;
+note that track-specific dump filenames do not appear in the dumpsys output,
+but are still saved to <code>/data/misc/media</code> upon track closure.
+</li>
+<li>Review the dump files for privacy concerns before sharing with others.</li>
+</ol>
+
+<h4>Suggestions</h4>
+
+<p>Try these ideas for more useful results:</p>
+
+<ul>
+<li>Disable touch sounds and key clicks to reduce interruptions in test output.</li>
+<li>Maximize all volumes.</li>
+<li>Disable apps that make sound or record from microphone,
+if they are not of interest to your test.
+</li>
+<li>Track-specific dumps are only saved when the track is closed;
+you may need to force close an app in order to dump its track-specific data.
+</li>
+<li>Do the <code>dumpsys</code> immediately after the test;
+there is a limited amount of recording space available.</li>
+<li>To make sure you don't lose your dump files,
+upload them to your host periodically.
+Only a limited number of dump files are preserved;
+older dumps are removed after that limit is reached.</li>
+</ul>
+
+<h3 id="restore">Restore</h3>
+
+<p>
+As noted above, the tee sink feature should not be left enabled.
+Restore your build and device as follows:
+</p>
+<ol>
+<li>Revert the source code changes to <code>Configuration.h</code>.</li>
+<li>Re-build <code>libaudioflinger.so</code>.</li>
+<li>Push or sync the restored <code>libaudioflinger.so</code>
+to the device's <code>/system/lib</code>.
+</li>
+<li><code>$ adb shell</code></li>
+<li><code>$ rm /data/local.prop</code></li>
+<li><code>$ rm /data/misc/media/*.wav</code></li>
+<li><code>$ reboot</code></li>
+</ol>
+
+<h2 id="mediaLog">media.log</h2>
+
+<h3 id="alogx">ALOGx macros</h3>
+
+<p>
+The standard Java language logging API in the Android SDK is
+<a href="http://developer.android.com/reference/android/util/Log.html">android.util.Log</a>.
+</p>
+
+<p>
+The corresponding C language API in the Android NDK is
+<code>__android_log_print</code>
+declared in <code>&lt;android/log.h&gt;</code>.
+</p>
+
+<p>
+Within the native portion of the Android framework, we
+prefer macros named <code>ALOGE</code>, <code>ALOGW</code>,
+<code>ALOGI</code>, <code>ALOGV</code>, etc. They are declared in
+<code>&lt;utils/Log.h&gt;</code>, and for the purposes of this article
+we'll collectively refer to them as <code>ALOGx</code>.
+</p>
+
+<p>
+All of these APIs are easy-to-use and well-understood, so they are pervasive
+throughout the Android platform. In particular the <code>mediaserver</code>
+process, which includes the AudioFlinger sound server, uses
+<code>ALOGx</code> extensively.
+</p>
+
+<p>
+Nevertheless, there are some limitations to <code>ALOGx</code> and friends:
+</p>
+
+<ul>
+<li>
+They are susceptible to "log spam": the log buffer is a shared resource
+so it can easily overflow due to unrelated log entries, resulting in
+missed information. The <code>ALOGV</code> variant is disabled at
+compile-time by default. But of course even it can result in log spam
+if it is enabled.
+</li>
+<li>
+The underlying kernel system calls could block, possibly resulting in
+priority inversion and consequently measurement disturbances and
+inaccuracies. This is of
+special concern to time-critical threads such as <code>FastMixer</code> and <code>FastCapture</code>.
+</li>
+<li>
+If a particular log is disabled to reduce log spam,
+then any information that would have been captured by that log is lost.
+It is not possible to enable a specific log retroactively,
+<i>after</i> it becomes clear that the log would have been interesting.
+</li>
+</ul>
+
+<h3 id="nblog">NBLOG, media.log, and MediaLogService</h3>
+
+<p>
+The <code>NBLOG</code> APIs and associated <code>media.log</code>
+process and <code>MediaLogService</code>
+service together form a newer logging system for media, and are specifically
+designed to address the issues above. We will loosely use the term
+"media.log" to refer to all three, but strictly speaking <code>NBLOG</code> is the
+C++ logging API, <code>media.log</code> is a Linux process name, and <code>MediaLogService</code>
+is an Android binder service for examining the logs.
+</p>
+
+<p>
+A <code>media.log</code> "timeline" is a series
+of log entries whose relative ordering is preserved.
+By convention, each thread should use its own timeline.
+</p>
+
+<h3 id="benefits">Benefits</h3>
+
+<p>
+The benefits of the <code>media.log</code> system are that it:
+</p>
+<ul>
+<li>Doesn't spam the main log unless and until it is needed.</li>
+<li>Can be examined even when <code>mediaserver</code> crashes or hangs.</li>
+<li>Is non-blocking per timeline.</li>
+<li>Causes less disturbance to performance.
+(Of course no form of logging is completely non-intrusive.)
+</li>
+</ul>
+
+<h3 id="architecture">Architecture</h3>
+
+<p>
+The diagram below shows the relationship of the <code>mediaserver</code> process
+and the <code>init</code> process, before <code>media.log</code> is introduced:
+</p>
+<img src="images/medialog_before.png" alt="Architecture before media.log" id="figure1" />
+<p class="img-caption">
+ <strong>Figure 1.</strong> Architecture before media.log
+</p>
+
+<p>
+Notable points:
+</p>
+<ul>
+<li><code>init</code> forks and execs <code>mediaserver</code>.</li>
+<li><code>init</code> detects the death of <code>mediaserver</code>, and re-forks as necessary.</li>
+<li><code>ALOGx</code> logging is not shown.</li>
+</ul>
+
+<p>
+The diagram below shows the new relationship of the components,
+after <code>media.log</code> is added to the architecture:
+</p>
+<img src="images/medialog_after.png" alt="Architecture after media.log" id="figure2" />
+<p class="img-caption">
+ <strong>Figure 2.</strong> Architecture after media.log
+</p>
+
+<p>
+Important changes:
+</p>
+
+<ul>
+
+<li>
+Clients use <code>NBLOG</code> API to construct log entries and append them to
+a circular buffer in shared memory.
+</li>
+
+<li>
+<code>MediaLogService</code> can dump the contents of the circular buffer at any time.
+</li>
+
+<li>
+The circular buffer is designed in such a way that any corruption of the
+shared memory will not crash <code>MediaLogService</code>, and it will still be able
+to dump as much of the buffer as is not affected by the corruption.
+</li>
+
+<li>
+The circular buffer is non-blocking and lock-free for both writing
+new entries and reading existing entries.
+</li>
+
+<li>
+No kernel system calls are required to write to or read from the circular buffer
+(other than optional timestamps).
+</li>
+
+</ul>
+
+<h4>Where to use</h4>
+
+<p>
+As of Android 4.4, there are only a few log points in AudioFlinger
+that use the <code>media.log</code> system. Though the new APIs are not as
+easy to use as <code>ALOGx</code>, they are not extremely difficult either.
+We encourage you to learn the new logging system for those
+occasions when it is indispensable.
+In particular, it is recommended for AudioFlinger threads that must
+run frequently, periodically, and without blocking, such as the
+<code>FastMixer</code> and <code>FastCapture</code> threads.
+</p>
+
+<h3 id="how">How to use</h3>
+
+<h4>Add logs</h4>
+
+<p>
+First, you need to add logs to your code.
+</p>
+
+<p>
+In <code>FastMixer</code> and <code>FastCapture</code> threads, use code such as this:
+</p>
+<pre>
+logWriter-&gt;log("string");
+logWriter-&gt;logf("format", parameters);
+logWriter-&gt;logTimestamp();
+</pre>
+<p>
+As this <code>NBLog</code> timeline is used only by the <code>FastMixer</code> and
+<code>FastCapture</code> threads,
+there is no need for mutual exclusion.
+</p>
+
+<p>
+In other AudioFlinger threads, use <code>mNBLogWriter</code>:
+</p>
+<pre>
+mNBLogWriter-&gt;log("string");
+mNBLogWriter-&gt;logf("format", parameters);
+mNBLogWriter-&gt;logTimestamp();
+</pre>
+<p>
+For threads other than <code>FastMixer</code> and <code>FastCapture</code>,
+the thread's <code>NBLog</code> timeline can be used by both the thread itself, and
+by binder operations. <code>NBLog::Writer</code> does not provide any
+implicit mutual exclusion per timeline, so be sure that all logs occur
+within a context where the thread's mutex <code>mLock</code> is held.
+</p>
+
+<p>
+After you have added the logs, re-build AudioFlinger.
+</p>
+
+<p class="caution"><strong>Caution:</strong>
+A separate <code>NBLog::Writer</code> timeline is required per thread,
+to ensure thread safety, since timelines omit mutexes by design. If you
+want more than one thread to use the same timeline, you can protect with an
+existing mutex (as described above for <code>mLock</code>). Or you can
+use the <code>NBLog::LockedWriter</code> wrapper instead of <code>NBLog::Writer</code>.
+However, this negates a prime benefit of this API: its non-blocking
+behavior.
+</p>
+
+<p>
+The full <code>NBLog</code> API is at <code>frameworks/av/include/media/nbaio/NBLog.h</code>.
+</p>
+
+<h4>Enable media.log</h4>
+
+<p>
+<code>media.log</code> is disabled by default. It is active only when property
+<code>ro.test_harness</code> is <code>1</code>. You can enable it by:
+</p>
+
+<pre>
+$ adb root
+$ adb shell
+$ echo ro.test_harness=1 > /data/local.prop
+$ chmod 644 /data/local.prop
+$ reboot
+</pre>
+
+<p>
+The connection is lost during reboot, so:
+</p>
+<pre>
+$ adb shell
+</pre>
+
+The command <code>ps media</code> will now show two processes:
+<ul>
+<li>media.log</li>
+<li>mediaserver</li>
+</ul>
+<p>
+Note the process ID of <code>mediaserver</code> for later.
+</p>
+
+<h4>Displaying the timelines</h4>
+
+<p>
+You can manually request a log dump at any time.
+This command shows logs from all the active and recent timelines, and then clears them:
+</p>
+<pre>
+$ dumpsys media.log
+</pre>
+
+<p>
+Note that by design timelines are independent,
+and there is no facility for merging timelines.
+</p>
+
+<h4>Recovering logs after mediaserver death</h4>
+
+<p>
+Now try killing the <code>mediaserver</code> process: <code>kill -9 #</code>, where # is
+the process ID you noted earlier. You should see a dump from <code>media.log</code>
+in the main <code>logcat</code>, showing all the logs leading up to the crash.
+</p>
+<pre>
+$ dumpsys media.log
+</pre>
diff --git a/en/devices/audio/images/ape_audio_tv_hdmi_tuner.png b/en/devices/audio/images/ape_audio_tv_hdmi_tuner.png
new file mode 100644
index 00000000..3a5d8324
--- /dev/null
+++ b/en/devices/audio/images/ape_audio_tv_hdmi_tuner.png
Binary files differ
diff --git a/en/devices/audio/images/ape_audio_tv_tif.png b/en/devices/audio/images/ape_audio_tv_tif.png
new file mode 100644
index 00000000..cfdf97f6
--- /dev/null
+++ b/en/devices/audio/images/ape_audio_tv_tif.png
Binary files differ
diff --git a/en/devices/audio/images/ape_audio_tv_tuner.png b/en/devices/audio/images/ape_audio_tv_tuner.png
new file mode 100644
index 00000000..96fb5441
--- /dev/null
+++ b/en/devices/audio/images/ape_audio_tv_tuner.png
Binary files differ
diff --git a/en/devices/audio/images/ape_fwk_audio.png b/en/devices/audio/images/ape_fwk_audio.png
new file mode 100644
index 00000000..9059a623
--- /dev/null
+++ b/en/devices/audio/images/ape_fwk_audio.png
Binary files differ
diff --git a/en/devices/audio/images/ape_fwk_hal_audio.png b/en/devices/audio/images/ape_fwk_hal_audio.png
new file mode 100644
index 00000000..fa6c47a6
--- /dev/null
+++ b/en/devices/audio/images/ape_fwk_hal_audio.png
Binary files differ
diff --git a/en/devices/audio/images/audio_hal.png b/en/devices/audio/images/audio_hal.png
new file mode 100644
index 00000000..273ac815
--- /dev/null
+++ b/en/devices/audio/images/audio_hal.png
Binary files differ
diff --git a/en/devices/audio/images/breadboard.jpg b/en/devices/audio/images/breadboard.jpg
new file mode 100644
index 00000000..fd0ec392
--- /dev/null
+++ b/en/devices/audio/images/breadboard.jpg
Binary files differ
diff --git a/en/devices/audio/images/dac.png b/en/devices/audio/images/dac.png
new file mode 100644
index 00000000..a13027cc
--- /dev/null
+++ b/en/devices/audio/images/dac.png
Binary files differ
diff --git a/en/devices/audio/images/display.jpg b/en/devices/audio/images/display.jpg
new file mode 100644
index 00000000..545206b0
--- /dev/null
+++ b/en/devices/audio/images/display.jpg
Binary files differ
diff --git a/en/devices/audio/images/hub.jpg b/en/devices/audio/images/hub.jpg
new file mode 100644
index 00000000..e825cdd4
--- /dev/null
+++ b/en/devices/audio/images/hub.jpg
Binary files differ
diff --git a/en/devices/audio/images/loopback_assembled.jpg b/en/devices/audio/images/loopback_assembled.jpg
new file mode 100644
index 00000000..d54fa576
--- /dev/null
+++ b/en/devices/audio/images/loopback_assembled.jpg
Binary files differ
diff --git a/en/devices/audio/images/loopback_circuit.png b/en/devices/audio/images/loopback_circuit.png
new file mode 100644
index 00000000..4c47e53a
--- /dev/null
+++ b/en/devices/audio/images/loopback_circuit.png
Binary files differ
diff --git a/en/devices/audio/images/medialog_after.png b/en/devices/audio/images/medialog_after.png
new file mode 100644
index 00000000..0c162252
--- /dev/null
+++ b/en/devices/audio/images/medialog_after.png
Binary files differ
diff --git a/en/devices/audio/images/medialog_before.png b/en/devices/audio/images/medialog_before.png
new file mode 100644
index 00000000..928d2f3c
--- /dev/null
+++ b/en/devices/audio/images/medialog_before.png
Binary files differ
diff --git a/en/devices/audio/images/otg.jpg b/en/devices/audio/images/otg.jpg
new file mode 100644
index 00000000..13a4071a
--- /dev/null
+++ b/en/devices/audio/images/otg.jpg
Binary files differ
diff --git a/en/devices/audio/images/pcb.jpg b/en/devices/audio/images/pcb.jpg
new file mode 100644
index 00000000..a32bc354
--- /dev/null
+++ b/en/devices/audio/images/pcb.jpg
Binary files differ
diff --git a/en/devices/audio/images/round_trip.png b/en/devices/audio/images/round_trip.png
new file mode 100644
index 00000000..663552c4
--- /dev/null
+++ b/en/devices/audio/images/round_trip.png
Binary files differ
diff --git a/en/devices/audio/images/round_trip_bar_graph.png b/en/devices/audio/images/round_trip_bar_graph.png
new file mode 100644
index 00000000..3dc24843
--- /dev/null
+++ b/en/devices/audio/images/round_trip_bar_graph.png
Binary files differ
diff --git a/en/devices/audio/images/round_trip_on_device.png b/en/devices/audio/images/round_trip_on_device.png
new file mode 100644
index 00000000..e1cf0a2a
--- /dev/null
+++ b/en/devices/audio/images/round_trip_on_device.png
Binary files differ
diff --git a/en/devices/audio/images/round_trip_via_headset_connector.png b/en/devices/audio/images/round_trip_via_headset_connector.png
new file mode 100644
index 00000000..5791cf51
--- /dev/null
+++ b/en/devices/audio/images/round_trip_via_headset_connector.png
Binary files differ
diff --git a/en/devices/audio/images/venn.png b/en/devices/audio/images/venn.png
new file mode 100644
index 00000000..1db4f53a
--- /dev/null
+++ b/en/devices/audio/images/venn.png
Binary files differ
diff --git a/en/devices/audio/implement-policy.html b/en/devices/audio/implement-policy.html
new file mode 100644
index 00000000..ae6ede2a
--- /dev/null
+++ b/en/devices/audio/implement-policy.html
@@ -0,0 +1,446 @@
+page.title=Configuring Audio Policies
+@jd:body
+
+<!--
+ Copyright 2016 The Android Open Source Project
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
+-->
+<div id="qv-wrapper">
+ <div id="qv">
+ <h2>In this document</h2>
+ <ol id="auto-toc">
+ </ol>
+ </div>
+</div>
+
+<p>Android 7.0 introduces a new audio policy configuration file format (XML) for
+describing your audio topology.</p>
+
+<p>Previous Android releases required using the
+<code>device/&lt;company&gt;/&lt;device&gt;/audio/audio_policy.conf</code>
+to declare the audio devices present on your product (you can see an example of
+this file for the Galaxy Nexus audio hardware in
+<code>device/samsung/tuna/audio/audio_policy.conf</code>). However, .conf is a
+simple proprietary format that is too limited to describe complex topologies for
+applications such as televisions and automobiles.</p>
+
+<p>Android 7.0 deprecates the <code>audio_policy.conf</code> file and adds support
+for defining the audio topology using an XML file format that is more
+human-readable, is supported by a wide range of editing and parsing tools, and is flexible
+enough to describe complex audio topologies.</p>
+
+<p class="note"><strong>Note:</strong> Android 7.0 preserves support for using
+<code>audio_policy.conf</code>; this legacy format is used by default. To use
+the XML file format, include the build option
+<code>USE_XML_AUDIO_POLICY_CONF := 1</code> in the device makefile.</p>
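+
+<p>For example, the option can be set in the product's device makefile (the exact
+makefile and path vary by product; the path below is an illustrative assumption):</p>
+<pre>
+# device/&lt;company&gt;/&lt;device&gt;/device.mk (illustrative)
+# Select the XML audio policy configuration format instead of audio_policy.conf
+USE_XML_AUDIO_POLICY_CONF := 1
+</pre>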
+
+<h2 id=xml_advantages>Advantages of the XML format</h2>
+<p>As in the .conf file, the new XML file enables defining the number and types
+of output and input stream profiles, the devices usable for playback and capture,
+and the audio attributes. In addition, the XML format offers the following enhancements:
+</p>
+
+<ul>
+<li>Audio profiles are now structured similarly to HDMI Simple Audio Descriptors
+and enable a different set of sampling rates/channel masks for each audio
+format (see the sketch after this list).</li>
+<li>Explicit definitions of all possible connections between devices and
+streams. Previously, an implicit rule made it possible to interconnect all
+devices attached to the same HAL module, preventing the audio policy from
+controlling connections requested with audio patch APIs. In the XML format, the
+topology description now defines connection limitations.</li>
+<li>Support for <em>includes</em> avoids repeating standard A2DP, USB, or
+reroute submix definitions.</li>
+<li>Customizable volume curves. Previously, volume tables were hardcoded. In the
+XML format, volume tables are described and can be customized.</li>
+</ul>
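+
+<p>For example, a single HDMI device port can declare one profile per format, each
+with its own sampling rates and channel masks. The following is a minimal sketch
+only; the formats, rates, and masks shown are illustrative, not requirements:</p>
+<pre>
+&lt;devicePort tagName=&quot;HDMI Out&quot; type=&quot;AUDIO_DEVICE_OUT_HDMI&quot; role=&quot;sink&quot;&gt;
+    &lt;profile name=&quot;&quot; format=&quot;AUDIO_FORMAT_PCM_16_BIT&quot;
+             samplingRates=&quot;44100,48000&quot; channelMasks=&quot;AUDIO_CHANNEL_OUT_STEREO&quot;/&gt;
+    &lt;profile name=&quot;&quot; format=&quot;AUDIO_FORMAT_AC3&quot;
+             samplingRates=&quot;32000,44100,48000&quot; channelMasks=&quot;AUDIO_CHANNEL_OUT_5POINT1&quot;/&gt;
+&lt;/devicePort&gt;
+</pre>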
+
+<p>The template at
+<code>frameworks/av/services/audiopolicy/config/audio_policy_configuration.xml</code>
+shows many of these features in use.</p>
+
+<h2 id=xml_file_format>File format and location</h2>
+<p>The new audio policy configuration file is
+<code>audio_policy_configuration.xml</code> and is located in
+<code>/system/etc</code>. For a simple audio policy configuration in the new
+XML file format, see the example below.</p>
+
+<p>
+<div class="toggle-content closed">
+ <p><a href="#" onclick="return toggleContent(this)">
+ <img src="{@docRoot}assets/images/triangle-closed.png" class="toggle-content-img" />
+ <strong><span class="toggle-content-text">Show audio policy example</span>
+ <span class="toggle-content-text" style="display:none;">Hide audio policy
+ example</span></strong>
+ </a></p>
+
+ <div class="toggle-content-toggleme">
+<pre class="prettyprint">
+&lt;?xml version=&quot;1.0&quot; encoding=&quot;UTF-8&quot; standalone=&quot;yes&quot;?&gt;
+&lt;audioPolicyConfiguration version=&quot;1.0&quot; xmlns:xi=&quot;http://www.w3.org/2001/XInclude&quot;&gt;
+ &lt;globalConfiguration speaker_drc_enabled=&quot;true&quot;/&gt;
+ &lt;modules&gt;
+ &lt;module name=&quot;primary&quot; halVersion=&quot;3.0&quot;&gt;
+ &lt;attachedDevices&gt;
+ &lt;item&gt;Speaker&lt;/item&gt;
+ &lt;item&gt;Earpiece&lt;/item&gt;
+ &lt;item&gt;Built-In Mic&lt;/item&gt;
+ &lt;/attachedDevices&gt;
+ &lt;defaultOutputDevice&gt;Speaker&lt;/defaultOutputDevice&gt;
+ &lt;mixPorts&gt;
+ &lt;mixPort name=&quot;primary output&quot; role=&quot;source&quot; flags=&quot;AUDIO_OUTPUT_FLAG_PRIMARY&quot;&gt;
+ &lt;profile name=&quot;&quot; format=&quot;AUDIO_FORMAT_PCM_16_BIT&quot;
+ samplingRates=&quot;48000&quot; channelMasks=&quot;AUDIO_CHANNEL_OUT_STEREO&quot;/&gt;
+ &lt;/mixPort&gt;
+ &lt;mixPort name=&quot;primary input&quot; role=&quot;sink&quot;&gt;
+ &lt;profile name=&quot;&quot; format=&quot;AUDIO_FORMAT_PCM_16_BIT&quot;
+ samplingRates=&quot;8000,16000,48000&quot;
+ channelMasks=&quot;AUDIO_CHANNEL_IN_MONO&quot;/&gt;
+ &lt;/mixPort&gt;
+ &lt;/mixPorts&gt;
+ &lt;devicePorts&gt;
+ &lt;devicePort tagName=&quot;Earpiece&quot; type=&quot;AUDIO_DEVICE_OUT_EARPIECE&quot; role=&quot;sink&quot;&gt;
+ &lt;profile name=&quot;&quot; format=&quot;AUDIO_FORMAT_PCM_16_BIT&quot;
+ samplingRates=&quot;48000&quot; channelMasks=&quot;AUDIO_CHANNEL_IN_MONO&quot;/&gt;
+ &lt;/devicePort&gt;
+ &lt;devicePort tagName=&quot;Speaker&quot; role=&quot;sink&quot; type=&quot;AUDIO_DEVICE_OUT_SPEAKER&quot; address=&quot;&quot;&gt;
+ &lt;profile name=&quot;&quot; format=&quot;AUDIO_FORMAT_PCM_16_BIT&quot;
+ samplingRates=&quot;48000&quot; channelMasks=&quot;AUDIO_CHANNEL_OUT_STEREO&quot;/&gt;
+ &lt;/devicePort&gt;
+ &lt;devicePort tagName=&quot;Wired Headset&quot; type=&quot;AUDIO_DEVICE_OUT_WIRED_HEADSET&quot; role=&quot;sink&quot;&gt;
+ &lt;profile name=&quot;&quot; format=&quot;AUDIO_FORMAT_PCM_16_BIT&quot;
+ samplingRates=&quot;48000&quot; channelMasks=&quot;AUDIO_CHANNEL_OUT_STEREO&quot;/&gt;
+ &lt;/devicePort&gt;
+ &lt;devicePort tagName=&quot;Built-In Mic&quot; type=&quot;AUDIO_DEVICE_IN_BUILTIN_MIC&quot; role=&quot;source&quot;&gt;
+ &lt;profile name=&quot;&quot; format=&quot;AUDIO_FORMAT_PCM_16_BIT&quot;
+ samplingRates=&quot;8000,16000,48000&quot;
+ channelMasks=&quot;AUDIO_CHANNEL_IN_MONO&quot;/&gt;
+ &lt;/devicePort&gt;
+ &lt;devicePort tagName=&quot;Wired Headset Mic&quot; type=&quot;AUDIO_DEVICE_IN_WIRED_HEADSET&quot; role=&quot;source&quot;&gt;
+ &lt;profile name=&quot;&quot; format=&quot;AUDIO_FORMAT_PCM_16_BIT&quot;
+ samplingRates=&quot;8000,16000,48000&quot;
+ channelMasks=&quot;AUDIO_CHANNEL_IN_MONO&quot;/&gt;
+ &lt;/devicePort&gt;
+ &lt;/devicePorts&gt;
+ &lt;routes&gt;
+ &lt;route type=&quot;mix&quot; sink=&quot;Earpiece&quot; sources=&quot;primary output&quot;/&gt;
+ &lt;route type=&quot;mix&quot; sink=&quot;Speaker&quot; sources=&quot;primary output&quot;/&gt;
+ &lt;route type=&quot;mix&quot; sink=&quot;Wired Headset&quot; sources=&quot;primary output&quot;/&gt;
+ &lt;route type=&quot;mix&quot; sink=&quot;primary input&quot; sources=&quot;Built-In Mic,Wired Headset Mic&quot;/&gt;
+ &lt;/routes&gt;
+ &lt;/module&gt;
+ &lt;xi:include href=&quot;a2dp_audio_policy_configuration.xml&quot;/&gt;
+ &lt;/modules&gt;
+
+ &lt;xi:include href=&quot;audio_policy_volumes.xml&quot;/&gt;
+ &lt;xi:include href=&quot;default_volume_tables.xml&quot;/&gt;
+&lt;/audioPolicyConfiguration&gt;
+</pre></div></div>
+</p>
+
+<p>The top-level structure contains modules that correspond to each audio HAL
+hardware module, where each module has a list of mix ports, device ports, and
+routes:</p>
+<ul>
+<li><strong>Mix ports</strong> describe the possible config profiles for streams
+that can be opened at the audio HAL for playback and capture.</li>
+<li><strong>Device ports</strong> describe the devices that can be attached with
+their type (and optionally address and audio properties, if relevant).</li>
+<li><strong>Routes</strong> (new) are now separated from the mix port descriptors,
+enabling description of routes from device to device or stream to device.</li>
+</ul>
+
+<p>Volume tables are simple lists of points defining the curve used to translate
+from a UI index to a volume in dB (the point values are expressed in millibels,
+i.e. hundredths of a dB). A separate include file provides default curves, but
+the curve for a given use case and device category can be overridden.</p>
+
+<div class="toggle-content closed">
+ <p><a href="#" onclick="return toggleContent(this)">
+ <img src="{@docRoot}assets/images/triangle-closed.png" class="toggle-content-img" />
+ <strong><span class="toggle-content-text">Show volume table example</span>
+ <span class="toggle-content-text" style="display:none;">Hide volume table
+ example</span></strong>
+ </a></p>
+
+ <div class="toggle-content-toggleme">
+<p><pre>
+&lt;?xml version=&quot;1.0&quot; encoding=&quot;UTF-8&quot;?&gt;
+&lt;volumes&gt;
+ &lt;reference name=&quot;FULL_SCALE_VOLUME_CURVE&quot;&gt;
+ &lt;point&gt;0,0&lt;/point&gt;
+ &lt;point&gt;100,0&lt;/point&gt;
+ &lt;/reference&gt;
+ &lt;reference name=&quot;SILENT_VOLUME_CURVE&quot;&gt;
+ &lt;point&gt;0,-9600&lt;/point&gt;
+ &lt;point&gt;100,-9600&lt;/point&gt;
+ &lt;/reference&gt;
+ &lt;reference name=&quot;DEFAULT_VOLUME_CURVE&quot;&gt;
+ &lt;point&gt;1,-4950&lt;/point&gt;
+ &lt;point&gt;33,-3350&lt;/point&gt;
+ &lt;point&gt;66,-1700&lt;/point&gt;
+ &lt;point&gt;100,0&lt;/point&gt;
+ &lt;/reference&gt;
+&lt;/volumes&gt;
+</pre></p></div></div>
+
+<div class="toggle-content closed">
+ <p><a href="#" onclick="return toggleContent(this)">
+ <img src="{@docRoot}assets/images/triangle-closed.png" class="toggle-content-img" />
+ <strong><span class="toggle-content-text">Show volumes example</span>
+ <span class="toggle-content-text" style="display:none;">Hide volumes
+ example</span></strong>
+ </a></p>
+
+ <div class="toggle-content-toggleme">
+<p><pre>
+&lt;?xml version=&quot;1.0&quot; encoding=&quot;UTF-8&quot;?&gt;
+&lt;volumes&gt;
+ &lt;volume stream=&quot;AUDIO_STREAM_VOICE_CALL&quot; deviceCategory=&quot;DEVICE_CATEGORY_HEADSET&quot; ref=&quot;DEFAULT_VOLUME_CURVE&quot;/&gt;
+ &lt;volume stream=&quot;AUDIO_STREAM_VOICE_CALL&quot; deviceCategory=&quot;DEVICE_CATEGORY_SPEAKER&quot; ref=&quot;DEFAULT_VOLUME_CURVE&quot;/&gt;
+ &lt;volume stream=&quot;AUDIO_STREAM_VOICE_CALL&quot; deviceCategory=&quot;DEVICE_CATEGORY_EARPIECE&quot; ref=&quot;DEFAULT_VOLUME_CURVE&quot;/&gt;
+ &lt;volume stream=&quot;AUDIO_STREAM_VOICE_CALL&quot; deviceCategory=&quot;DEVICE_CATEGORY_EXT_MEDIA&quot; ref=&quot;DEFAULT_VOLUME_CURVE&quot;/&gt;
+
+ &lt;volume stream=&quot;AUDIO_STREAM_SYSTEM&quot; deviceCategory=&quot;DEVICE_CATEGORY_HEADSET&quot; ref=&quot;DEFAULT_VOLUME_CURVE&quot;/&gt;
+ &lt;volume stream=&quot;AUDIO_STREAM_SYSTEM&quot; deviceCategory=&quot;DEVICE_CATEGORY_SPEAKER&quot; ref=&quot;DEFAULT_VOLUME_CURVE&quot;/&gt;
+ &lt;volume stream=&quot;AUDIO_STREAM_SYSTEM&quot; deviceCategory=&quot;DEVICE_CATEGORY_EARPIECE&quot; ref=&quot;DEFAULT_VOLUME_CURVE&quot;/&gt;
+ &lt;volume stream=&quot;AUDIO_STREAM_SYSTEM&quot; deviceCategory=&quot;DEVICE_CATEGORY_EXT_MEDIA&quot; ref=&quot;DEFAULT_VOLUME_CURVE&quot;/&gt;
+
+ &lt;volume stream=&quot;AUDIO_STREAM_RING&quot; deviceCategory=&quot;DEVICE_CATEGORY_HEADSET&quot; ref=&quot;DEFAULT_VOLUME_CURVE&quot;/&gt;
+ &lt;volume stream=&quot;AUDIO_STREAM_RING&quot; deviceCategory=&quot;DEVICE_CATEGORY_SPEAKER&quot; ref=&quot;DEFAULT_VOLUME_CURVE&quot;/&gt;
+ &lt;volume stream=&quot;AUDIO_STREAM_RING&quot; deviceCategory=&quot;DEVICE_CATEGORY_EARPIECE&quot; ref=&quot;DEFAULT_VOLUME_CURVE&quot;/&gt;
+ &lt;volume stream=&quot;AUDIO_STREAM_RING&quot; deviceCategory=&quot;DEVICE_CATEGORY_EXT_MEDIA&quot; ref=&quot;DEFAULT_VOLUME_CURVE&quot;/&gt;
+
+ &lt;volume stream=&quot;AUDIO_STREAM_MUSIC&quot; deviceCategory=&quot;DEVICE_CATEGORY_HEADSET&quot; ref=&quot;DEFAULT_VOLUME_CURVE&quot;/&gt;
+ &lt;volume stream=&quot;AUDIO_STREAM_MUSIC&quot; deviceCategory=&quot;DEVICE_CATEGORY_SPEAKER&quot;&gt;
+ &lt;point&gt;1,-5500&lt;/point&gt;
+ &lt;point&gt;20,-4300&lt;/point&gt;
+ &lt;point&gt;86,-1200&lt;/point&gt;
+ &lt;point&gt;100,0&lt;/point&gt;
+ &lt;/volume&gt;
+ &lt;volume stream=&quot;AUDIO_STREAM_MUSIC&quot; deviceCategory=&quot;DEVICE_CATEGORY_EARPIECE&quot; ref=&quot;DEFAULT_VOLUME_CURVE&quot;/&gt;
+ &lt;volume stream=&quot;AUDIO_STREAM_MUSIC&quot; deviceCategory=&quot;DEVICE_CATEGORY_EXT_MEDIA&quot; ref=&quot;DEFAULT_VOLUME_CURVE&quot;/&gt;
+
+ &lt;volume stream=&quot;AUDIO_STREAM_ALARM&quot; deviceCategory=&quot;DEVICE_CATEGORY_HEADSET&quot; ref=&quot;DEFAULT_VOLUME_CURVE&quot;/&gt;
+ &lt;volume stream=&quot;AUDIO_STREAM_ALARM&quot; deviceCategory=&quot;DEVICE_CATEGORY_SPEAKER&quot; ref=&quot;DEFAULT_VOLUME_CURVE&quot;/&gt;
+ &lt;volume stream=&quot;AUDIO_STREAM_ALARM&quot; deviceCategory=&quot;DEVICE_CATEGORY_EARPIECE&quot; ref=&quot;DEFAULT_VOLUME_CURVE&quot;/&gt;
+ &lt;volume stream=&quot;AUDIO_STREAM_ALARM&quot; deviceCategory=&quot;DEVICE_CATEGORY_EXT_MEDIA&quot; ref=&quot;DEFAULT_VOLUME_CURVE&quot;/&gt;
+
+ &lt;volume stream=&quot;AUDIO_STREAM_NOTIFICATION&quot; deviceCategory=&quot;DEVICE_CATEGORY_HEADSET&quot; ref=&quot;DEFAULT_VOLUME_CURVE&quot;/&gt;
+ &lt;volume stream=&quot;AUDIO_STREAM_NOTIFICATION&quot; deviceCategory=&quot;DEVICE_CATEGORY_SPEAKER&quot; ref=&quot;DEFAULT_VOLUME_CURVE&quot;/&gt;
+ &lt;volume stream=&quot;AUDIO_STREAM_NOTIFICATION&quot; deviceCategory=&quot;DEVICE_CATEGORY_EARPIECE&quot; ref=&quot;DEFAULT_VOLUME_CURVE&quot;/&gt;
+ &lt;volume stream=&quot;AUDIO_STREAM_NOTIFICATION&quot; deviceCategory=&quot;DEVICE_CATEGORY_EXT_MEDIA&quot; ref=&quot;DEFAULT_VOLUME_CURVE&quot;/&gt;
+
+ &lt;volume stream=&quot;AUDIO_STREAM_BLUETOOTH_SCO&quot; deviceCategory=&quot;DEVICE_CATEGORY_HEADSET&quot; ref=&quot;DEFAULT_VOLUME_CURVE&quot;/&gt;
+ &lt;volume stream=&quot;AUDIO_STREAM_BLUETOOTH_SCO&quot; deviceCategory=&quot;DEVICE_CATEGORY_SPEAKER&quot; ref=&quot;DEFAULT_VOLUME_CURVE&quot;/&gt;
+ &lt;volume stream=&quot;AUDIO_STREAM_BLUETOOTH_SCO&quot; deviceCategory=&quot;DEVICE_CATEGORY_EARPIECE&quot; ref=&quot;DEFAULT_VOLUME_CURVE&quot;/&gt;
+ &lt;volume stream=&quot;AUDIO_STREAM_BLUETOOTH_SCO&quot; deviceCategory=&quot;DEVICE_CATEGORY_EXT_MEDIA&quot; ref=&quot;DEFAULT_VOLUME_CURVE&quot;/&gt;
+
+ &lt;volume stream=&quot;AUDIO_STREAM_ENFORCED_AUDIBLE&quot; deviceCategory=&quot;DEVICE_CATEGORY_HEADSET&quot; ref=&quot;DEFAULT_VOLUME_CURVE&quot;/&gt;
+ &lt;volume stream=&quot;AUDIO_STREAM_ENFORCED_AUDIBLE&quot; deviceCategory=&quot;DEVICE_CATEGORY_SPEAKER&quot; ref=&quot;DEFAULT_VOLUME_CURVE&quot;/&gt;
+ &lt;volume stream=&quot;AUDIO_STREAM_ENFORCED_AUDIBLE&quot; deviceCategory=&quot;DEVICE_CATEGORY_EARPIECE&quot; ref=&quot;DEFAULT_VOLUME_CURVE&quot;/&gt;
+ &lt;volume stream=&quot;AUDIO_STREAM_ENFORCED_AUDIBLE&quot; deviceCategory=&quot;DEVICE_CATEGORY_EXT_MEDIA&quot; ref=&quot;DEFAULT_VOLUME_CURVE&quot;/&gt;
+
+ &lt;volume stream=&quot;AUDIO_STREAM_DTMF&quot; deviceCategory=&quot;DEVICE_CATEGORY_HEADSET&quot; ref=&quot;DEFAULT_VOLUME_CURVE&quot;/&gt;
+ &lt;volume stream=&quot;AUDIO_STREAM_DTMF&quot; deviceCategory=&quot;DEVICE_CATEGORY_SPEAKER&quot; ref=&quot;DEFAULT_VOLUME_CURVE&quot;/&gt;
+ &lt;volume stream=&quot;AUDIO_STREAM_DTMF&quot; deviceCategory=&quot;DEVICE_CATEGORY_EARPIECE&quot; ref=&quot;DEFAULT_VOLUME_CURVE&quot;/&gt;
+ &lt;volume stream=&quot;AUDIO_STREAM_DTMF&quot; deviceCategory=&quot;DEVICE_CATEGORY_EXT_MEDIA&quot; ref=&quot;DEFAULT_VOLUME_CURVE&quot;/&gt;
+
+ &lt;volume stream=&quot;AUDIO_STREAM_TTS&quot; deviceCategory=&quot;DEVICE_CATEGORY_HEADSET&quot; ref=&quot;SILENT_VOLUME_CURVE&quot;/&gt;
+ &lt;volume stream=&quot;AUDIO_STREAM_TTS&quot; deviceCategory=&quot;DEVICE_CATEGORY_SPEAKER&quot; ref=&quot;FULL_SCALE_VOLUME_CURVE&quot;/&gt;
+ &lt;volume stream=&quot;AUDIO_STREAM_TTS&quot; deviceCategory=&quot;DEVICE_CATEGORY_EARPIECE&quot; ref=&quot;SILENT_VOLUME_CURVE&quot;/&gt;
+ &lt;volume stream=&quot;AUDIO_STREAM_TTS&quot; deviceCategory=&quot;DEVICE_CATEGORY_EXT_MEDIA&quot; ref=&quot;SILENT_VOLUME_CURVE&quot;/&gt;
+
+ &lt;volume stream=&quot;AUDIO_STREAM_ACCESSIBILITY&quot; deviceCategory=&quot;DEVICE_CATEGORY_HEADSET&quot; ref=&quot;DEFAULT_VOLUME_CURVE&quot;/&gt;
+ &lt;volume stream=&quot;AUDIO_STREAM_ACCESSIBILITY&quot; deviceCategory=&quot;DEVICE_CATEGORY_SPEAKER&quot; ref=&quot;DEFAULT_VOLUME_CURVE&quot;/&gt;
+ &lt;volume stream=&quot;AUDIO_STREAM_ACCESSIBILITY&quot; deviceCategory=&quot;DEVICE_CATEGORY_EARPIECE&quot; ref=&quot;DEFAULT_VOLUME_CURVE&quot;/&gt;
+ &lt;volume stream=&quot;AUDIO_STREAM_ACCESSIBILITY&quot; deviceCategory=&quot;DEVICE_CATEGORY_EXT_MEDIA&quot; ref=&quot;DEFAULT_VOLUME_CURVE&quot;/&gt;
+
+ &lt;volume stream=&quot;AUDIO_STREAM_REROUTING&quot; deviceCategory=&quot;DEVICE_CATEGORY_HEADSET&quot; ref=&quot;FULL_SCALE_VOLUME_CURVE&quot;/&gt;
+ &lt;volume stream=&quot;AUDIO_STREAM_REROUTING&quot; deviceCategory=&quot;DEVICE_CATEGORY_SPEAKER&quot; ref=&quot;FULL_SCALE_VOLUME_CURVE&quot;/&gt;
+ &lt;volume stream=&quot;AUDIO_STREAM_REROUTING&quot; deviceCategory=&quot;DEVICE_CATEGORY_EARPIECE&quot; ref=&quot;FULL_SCALE_VOLUME_CURVE&quot;/&gt;
+ &lt;volume stream=&quot;AUDIO_STREAM_REROUTING&quot; deviceCategory=&quot;DEVICE_CATEGORY_EXT_MEDIA&quot; ref=&quot;FULL_SCALE_VOLUME_CURVE&quot;/&gt;
+
+ &lt;volume stream=&quot;AUDIO_STREAM_PATCH&quot; deviceCategory=&quot;DEVICE_CATEGORY_HEADSET&quot; ref=&quot;FULL_SCALE_VOLUME_CURVE&quot;/&gt;
+ &lt;volume stream=&quot;AUDIO_STREAM_PATCH&quot; deviceCategory=&quot;DEVICE_CATEGORY_SPEAKER&quot; ref=&quot;FULL_SCALE_VOLUME_CURVE&quot;/&gt;
+ &lt;volume stream=&quot;AUDIO_STREAM_PATCH&quot; deviceCategory=&quot;DEVICE_CATEGORY_EARPIECE&quot; ref=&quot;FULL_SCALE_VOLUME_CURVE&quot;/&gt;
+ &lt;volume stream=&quot;AUDIO_STREAM_PATCH&quot; deviceCategory=&quot;DEVICE_CATEGORY_EXT_MEDIA&quot; ref=&quot;FULL_SCALE_VOLUME_CURVE&quot;/&gt;
+&lt;/volumes&gt;
+</pre></p></div></div>
+
+<h2 id=file_inclusions>File inclusions</h2>
+<p>The XML Inclusions (XInclude) method can be used to include audio policy
+configuration information located in other XML files. All included files must
+follow the structure described above with the following restrictions:</p>
+<ul>
+<li>Files can contain only top-level elements.</li>
+<li>Files cannot contain XInclude elements.</li>
+</ul>
+<p>Use includes to avoid copying standard Android Open Source Project (AOSP)
+audio HAL module configuration information into every audio policy configuration
+file (which is error-prone). A standard audio policy configuration XML file is
+provided for each of the following audio HALs (see the include sketch after this
+list):</p>
+<ul>
+<li><strong>A2DP:</strong> <code>a2dp_audio_policy_configuration.xml</code></li>
+<li><strong>Reroute submix:</strong> <code>rsubmix_audio_policy_configuration.xml</code></li>
+<li><strong>USB:</strong> <code>usb_audio_policy_configuration.xml</code></li>
+</ul>
+
+<h2 id=code_reorg>Audio policy code reorganization</h2>
+<p>Android 7.0 splits <code>AudioPolicyManager.cpp</code> into several modules
+to make it more maintainable and to highlight what is configurable. The new
+organization of <code>frameworks/av/services/audiopolicy</code> includes the
+following modules:</p>
+
+<table>
+<tr>
+<th>Module</th>
+<th>Description</th>
+</tr>
+
+<tr>
+<td><code>/managerdefault</code></td>
+<td>Includes the generic interfaces and behavior implementation common to all
+applications. Similar to <code>AudioPolicyManager.cpp</code> with engine
+functionality and common concepts abstracted away.</td>
+</tr>
+
+<tr>
+<td><code>/common</code></td>
+<td>Defines base classes (e.g., data structures for input/output audio stream
+profiles, audio device descriptors, audio patches, audio ports, etc.). Previously
+defined inside <code>AudioPolicyManager.cpp</code>.</td>
+</tr>
+
+<tr>
+<td><code>/engine</code></td>
+<td><p>Implements the rules that define which device and volume should be used for
+a given use case. It implements a standard interface with the generic part, such
+as getting the appropriate device for a given playback or capture use case, or
+setting the connected devices or external state (i.e. a call state or forced
+usage) that can alter the routing decision.</p>
+<p>Available in two versions, customized and default; use build option
+<code>USE_CONFIGURABLE_AUDIO_POLICY</code> to select.</p></td>
+</tr>
+
+<tr>
+<td><code>/engineconfigurable</code></td>
+<td>Policy engine implementation that relies on the parameter framework (see
+below). Configuration is based on the parameter framework, and the policy is
+defined by XML files.</td>
+</tr>
+
+<tr>
+<td><code>/enginedefault</code></td>
+<td>Policy engine implementation based on previous Android Audio Policy Manager
+implementations. This is the default and includes hardcoded rules that
+correspond to current Nexus and AOSP implementations.</td>
+</tr>
+
+<tr>
+<td><code>/service</code></td>
+<td>Includes the binder interfaces and the threading and locking implementation,
+along with the interface to the rest of the framework.</td>
+</tr>
+
+</table>
+
+<h2 id=policy_config>Configuration using parameter-framework</h2>
+<p>Android 7.0 reorganizes audio policy code to make it easier to understand and
+maintain while also supporting an audio policy defined entirely by configuration
+files. The reorganization and audio policy design is based on Intel's parameter
+framework, a plugin-based and rule-based framework for handling parameters.</p>
+
+<p>Using the new configurable audio policy enables vendors and OEMs to:</p>
+<ul>
+<li>Describe a system's structure and its parameters in XML.</li>
+<li>Write (in C++) or reuse a backend (plugin) for accessing described
+parameters.</li>
+<li>Define (in XML or in a domain-specific language) conditions/rules upon which
+a given parameter must take a given value.</li>
+</ul>
+
+<p>AOSP includes an example of an audio policy configuration file that uses the parameter-framework at <code>frameworks/av/services/audiopolicy/engineconfigurable/parameter-framework/example/Settings/PolicyConfigurableDomains.xml</code>. For
+details, refer to Intel documentation on the
+<a href="https://github.com/01org/parameter-framework">parameter-framework</a>
+and
+<a href="http://01org.github.io/parameter-framework/hosting/Android_M_Configurable_Audio_Policy.pdf">Android
+Configurable Audio Policy</a>.</p>
+
+<h2 id=policy_routing_apis>Audio policy routing APIs</h2>
+<p>Android 6.0 introduced a public Enumeration and Selection API that sits on
+top of the audio patch/audio port infrastructure and allows application
+developers to indicate a preference for a specific device output or input for
+connected audio records or tracks.</p>
+<p>In Android 7.0, the Enumeration and Selection API is verified by CTS tests
+and is extended to include routing for native C/C++ (OpenSL ES) audio streams.
+The routing of native streams continues to be done in Java, with the addition of
+an <code>AudioRouting</code> interface that supersedes, combines, and deprecates
+the explicit routing methods that were specific to <code>AudioTrack</code> and
+<code>AudioRecord</code> classes.</p>
+
+<p>For details on the Enumeration and Selection API, refer to
+<a href="https://developer.android.com/ndk/guides/audio/opensl-for-android.html?hl=fi#configuration-interface">Android
+configuration interfaces</a> and <code>OpenSLES_AndroidConfiguration.h</code>.
+For details on audio routing, refer to
+<a href="https://developer.android.com/reference/android/media/AudioRouting.html">AudioRouting</a>.
+</p>
+
+<h2 id=multichannel>Multi-channel support</h2>
+
+<p>If your hardware and driver support multichannel audio via HDMI, you can
+output the audio stream directly to the audio hardware (this bypasses the
+AudioFlinger mixer so the stream doesn't get downmixed to two channels). The audio HAL
+must expose whether an output stream profile supports multichannel audio
+capabilities. If the HAL exposes its capabilities, the default policy manager
+allows multichannel playback over HDMI. For implementation details, see
+<code>device/samsung/tuna/audio/audio_hw.c</code>.</p>
+
+<p>To specify that your product contains a multichannel audio output, edit the
+audio policy configuration file to describe the multichannel output for your
+product. The following example, from a Galaxy Nexus and in the legacy
+<code>audio_policy.conf</code> format, shows a <em>dynamic</em> channel mask,
+which means the audio policy manager queries the actual channel masks supported
+by the HDMI sink after connection.</p>
+
+<pre>
+audio_hw_modules {
+ primary {
+ outputs {
+ ...
+ hdmi {
+ sampling_rates 44100|48000
+ channel_masks dynamic
+ formats AUDIO_FORMAT_PCM_16_BIT
+ devices AUDIO_DEVICE_OUT_AUX_DIGITAL
+ flags AUDIO_OUTPUT_FLAG_DIRECT
+ }
+ ...
+ }
+ ...
+ }
+ ...
+}
+</pre>
+
+<p>You can also specify a static channel mask such as
+<code>AUDIO_CHANNEL_OUT_5POINT1</code>. AudioFlinger's mixer downmixes the
+content to stereo automatically when sent to an audio device that does not
+support multichannel audio.</p>
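+
+<p>If you use the XML format described earlier on this page, a comparable
+multichannel output can be declared by pairing a direct-output mix port with an
+HDMI device port and a route. The following minimal sketch uses a static 5.1
+channel mask; the names, rates, and masks are illustrative assumptions:</p>
+<pre>
+&lt;mixPort name=&quot;hdmi output&quot; role=&quot;source&quot; flags=&quot;AUDIO_OUTPUT_FLAG_DIRECT&quot;&gt;
+    &lt;profile name=&quot;&quot; format=&quot;AUDIO_FORMAT_PCM_16_BIT&quot;
+             samplingRates=&quot;44100,48000&quot; channelMasks=&quot;AUDIO_CHANNEL_OUT_5POINT1&quot;/&gt;
+&lt;/mixPort&gt;
+...
+&lt;devicePort tagName=&quot;HDMI&quot; type=&quot;AUDIO_DEVICE_OUT_HDMI&quot; role=&quot;sink&quot;&gt;
+    &lt;profile name=&quot;&quot; format=&quot;AUDIO_FORMAT_PCM_16_BIT&quot;
+             samplingRates=&quot;44100,48000&quot; channelMasks=&quot;AUDIO_CHANNEL_OUT_5POINT1&quot;/&gt;
+&lt;/devicePort&gt;
+...
+&lt;route type=&quot;mix&quot; sink=&quot;HDMI&quot; sources=&quot;hdmi output&quot;/&gt;
+</pre>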
+
+<h2 id=codecs>Media codecs</h2>
+
+<p>Ensure the audio codecs your hardware and drivers support are properly
+declared for your product. For details, see
+<a href="{@docRoot}devices/media/index.html#expose">Exposing Codecs to the
+Framework</a>.</p>
diff --git a/en/devices/audio/implement-pre-processing.html b/en/devices/audio/implement-pre-processing.html
new file mode 100644
index 00000000..ab6cfa9b
--- /dev/null
+++ b/en/devices/audio/implement-pre-processing.html
@@ -0,0 +1,154 @@
+page.title=Configuring Pre-Processing Effects
+@jd:body
+
+<!--
+ Copyright 2016 The Android Open Source Project
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
+-->
+<div id="qv-wrapper">
+ <div id="qv">
+ <h2>In this document</h2>
+ <ol id="auto-toc">
+ </ol>
+ </div>
+</div>
+
+<p>The Android platform provides audio effects on supported devices in the
+<a href="http://developer.android.com/reference/android/media/audiofx/package-summary.html">audiofx</a>
+package, which is available for developers to access. For example, the Nexus 10
+supports the following pre-processing effects:</p>
+
+<ul>
+<li>
+<a href="http://developer.android.com/reference/android/media/audiofx/AcousticEchoCanceler.html">Acoustic
+Echo Cancellation</a></li>
+<li>
+<a href="http://developer.android.com/reference/android/media/audiofx/AutomaticGainControl.html">Automatic Gain Control</a></li>
+<li>
+<a href="http://developer.android.com/reference/android/media/audiofx/NoiseSuppressor.html">Noise
+Suppression</a></li>
+</ul>
+
+<h2 id=audiosources>Pairing with AudioSources</h2>
+<p>Pre-processing effects are paired with the use case mode in which the
+pre-processing is requested. In Android app development, a use case is referred
+to as an <code>AudioSource</code>, and app developers request the
+<code>AudioSource</code> abstraction instead of the actual audio hardware
+device. The Android Audio Policy Manager maps an <code>AudioSource</code> to a
+given capture path configuration (device, gain, pre-processing, etc.) according
+to product-specific rules. The following sources are exposed to developers:</p>
+
+<ul>
+<li><code>android.media.MediaRecorder.AudioSource.CAMCORDER</code></li>
+<li><code>android.media.MediaRecorder.AudioSource.VOICE_COMMUNICATION</code></li>
+<li><code>android.media.MediaRecorder.AudioSource.VOICE_CALL</code></li>
+<li><code>android.media.MediaRecorder.AudioSource.VOICE_DOWNLINK</code></li>
+<li><code>android.media.MediaRecorder.AudioSource.VOICE_UPLINK</code></li>
+<li><code>android.media.MediaRecorder.AudioSource.VOICE_RECOGNITION</code></li>
+<li><code>android.media.MediaRecorder.AudioSource.MIC</code></li>
+<li><code>android.media.MediaRecorder.AudioSource.DEFAULT</code></li>
+</ul>
+
+<p>The default pre-processing effects applied for each <code>AudioSource</code>
+are specified in the <code>/system/etc/audio_effects.conf</code> file. To
+specify your own default effects for every <code>AudioSource</code>, create a
+<code>/system/vendor/etc/audio_effects.conf</code> file and specify the
+pre-processing effects to turn on. For an example, see the implementation for
+the Nexus 10 in <code>device/samsung/manta/audio_effects.conf</code>.
+AudioEffect instances acquire and release a session when created and destroyed,
+enabling the effects (such as the Loudness Enhancer) to persist throughout the
+duration of the session.</p>
+
+<p class="warning"><strong>Warning:</strong> For the
+<code>VOICE_RECOGNITION</code> use case, do not enable the noise suppression
+pre-processing effect. It should not be turned on by default when recording from
+this audio source, and you should not enable it in your own audio_effects.conf
+file. Turning on the effect by default will cause the device to fail the
+<a href="{@docRoot}compatibility/index.html">compatibility requirement</a>,
+regardless of whether it was enabled by default through the configuration file or
+through the audio HAL implementation's default behavior.</p>
+
+<p>The following example enables pre-processing for the VoIP
+<code>AudioSource</code> and Camcorder <code>AudioSource</code>. When you declare
+the <code>AudioSource</code> configuration in this manner, the framework
+automatically requests the use of those effects from the audio HAL.</p>
+
+<p><pre>
+pre_processing {
+ voice_communication {
+ aec {}
+ ns {}
+ }
+ camcorder {
+ agc {}
+ }
+}
+</pre></p>
+
+<h2 id=tuning>Source tuning</h2>
+
+<p><code>AudioSource</code> tuning does not have explicit requirements on audio
+gain or audio processing, with the exception of voice recognition
+(<code>VOICE_RECOGNITION</code>). Requirements for voice recognition include:</p>
+
+<ul>
+<li>Flat frequency response (+/- 3dB) from 100Hz to 4kHz</li>
+<li>Close-talk config: 90dB SPL reads RMS of 2500 (16-bit samples)</li>
+<li>Level tracks linearly from -18dB to +12dB relative to 90dB SPL</li>
+<li>THD &lt; 1% (90dB SPL in 100 to 4000Hz range)</li>
+<li>Near-ultrasound requirements (for testing, see
+<a href="{@docRoot}compatibility/cts/near-ultrasound.html">Near Ultrasound
+Tests</a>):
+<ul>
+<li>Support for <code>PROPERTY_SUPPORT_MIC_NEAR_ULTRASOUND</code> as defined in
+section 7.8.3 of the CDD.</li>
+<li>Support one or both of 44100 or 48000 sampling rates with no band-pass or
+anti-aliasing filters.</li>
+</ul></li>
+<li>Effects/pre-processing must be disabled by default</li>
+</ul>
+
+<p>Examples of tuning different effects for different sources are:</p>
+
+<ul>
+<li>Noise Suppressor
+<ul>
+<li>Tuned for wind noise suppression for <code>CAMCORDER</code></li>
+<li>Tuned for stationary noise suppression for <code>VOICE_COMMUNICATION</code></li>
+</ul>
+</li>
+<li>Automatic Gain Control
+<ul>
+<li>Tuned for close-talk for <code>VOICE_COMMUNICATION</code> and main phone
+mic</li>
+<li>Tuned for far-talk for <code>CAMCORDER</code></li>
+</ul>
+</li>
+</ul>
+
+<h2 id="resources">Resources</h2>
+
+<p>For more information, refer to the following resources:</p>
+
+<ul>
+<li>Android documentation for
+<a href="http://developer.android.com/reference/android/media/audiofx/package-summary.html">audiofx
+package</a></li>
+
+<li>Android documentation for
+<a href="http://developer.android.com/reference/android/media/audiofx/NoiseSuppressor.html">Noise
+Suppression audio effect</a></li>
+
+<li><code>device/samsung/manta/audio_effects.conf</code> file for the Nexus 10</li>
+</ul>
diff --git a/en/devices/audio/implement-shared-library.html b/en/devices/audio/implement-shared-library.html
new file mode 100644
index 00000000..f9539a9e
--- /dev/null
+++ b/en/devices/audio/implement-shared-library.html
@@ -0,0 +1,95 @@
+page.title=Configuring a Shared Library
+@jd:body
+
+<!--
+ Copyright 2016 The Android Open Source Project
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
+-->
+
+<p>After creating an
+<a href="{@docRoot}devices/audio/implement-policy.html">audio policy
+configuration</a>, you must package the HAL implementation into a shared library
+and copy it to the appropriate location:</p>
+
+<ol>
+<li>Create a <code>device/&lt;company&gt;/&lt;device&gt;/audio</code>
+directory to contain your library's source files.</li>
+<li>Create an <code>Android.mk</code> file to build the shared library. Ensure
+the Makefile contains the following line:
+<br>
+<pre>
+LOCAL_MODULE := audio.primary.&lt;device&gt;
+</pre>
+<br>
+<p>Your library must be named <code>audio.primary.&lt;device&gt;.so</code>
+so Android can correctly load the library. The <code>primary</code> portion of
+this filename indicates that this shared library is for the primary audio
+hardware located on the device. The module names
+<code>audio.a2dp.&lt;device&gt;</code> and
+<code>audio.usb.&lt;device&gt;</code> are also available for Bluetooth and
+USB audio interfaces. Here is an example of an <code>Android.mk</code> from the
+Galaxy Nexus audio hardware:</p>
+<p><pre>
+LOCAL_PATH := $(call my-dir)
+
+include $(CLEAR_VARS)
+
+LOCAL_MODULE := audio.primary.tuna
+LOCAL_MODULE_RELATIVE_PATH := hw
+LOCAL_SRC_FILES := audio_hw.c ril_interface.c
+LOCAL_C_INCLUDES += \
+ external/tinyalsa/include \
+ $(call include-path-for, audio-utils) \
+ $(call include-path-for, audio-effects)
+LOCAL_SHARED_LIBRARIES := liblog libcutils libtinyalsa libaudioutils libdl
+LOCAL_MODULE_TAGS := optional
+
+include $(BUILD_SHARED_LIBRARY)
+</pre></p>
+</li>
+<br>
+<li>If your product supports low latency audio as specified by the Android CDD,
+copy the corresponding XML feature file into your product. For example, in your
+product's <code>device/&lt;company&gt;/&lt;device&gt;/device.mk</code>
+Makefile:
+<p><pre>
+PRODUCT_COPY_FILES := ...
+
+PRODUCT_COPY_FILES += \
+frameworks/native/data/etc/android.hardware.audio.low_latency.xml:system/etc/permissions/android.hardware.audio.low_latency.xml
+</pre></p>
+</li>
+<br>
+<li>Copy the audio policy configuration file you created earlier to the
+<code>system/etc/</code> directory by adding a copy rule to your product's
+<code>device/&lt;company&gt;/&lt;device&gt;/device.mk</code> Makefile.
+For example (for the XML format, see the sketch after this procedure):
+<p><pre>
+PRODUCT_COPY_FILES += \
+ device/samsung/tuna/audio/audio_policy.conf:system/etc/audio_policy.conf
+</pre></p>
+</li>
+<br>
+<li>Declare the shared modules of your audio HAL that are required by your
+product in the product's
+<code>device/&lt;company&gt;/&lt;device&gt;/device.mk</code> Makefile.
+For example, the Galaxy Nexus requires the primary and Bluetooth audio HAL
+modules:
+<pre>
+PRODUCT_PACKAGES += \
+ audio.primary.tuna \
+ audio.a2dp.default
+</pre>
+</li>
+</ol>
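+
+<p>If your product uses the XML audio policy format described in
+<a href="{@docRoot}devices/audio/implement-policy.html">Configuring Audio
+Policies</a>, the copy rule in step 4 references the XML file instead. This is a
+sketch only; the source path is an illustrative assumption:</p>
+<pre>
+PRODUCT_COPY_FILES += \
+    device/&lt;company&gt;/&lt;device&gt;/audio/audio_policy_configuration.xml:system/etc/audio_policy_configuration.xml
+</pre>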
diff --git a/en/devices/audio/implement.html b/en/devices/audio/implement.html
new file mode 100644
index 00000000..31e795b0
--- /dev/null
+++ b/en/devices/audio/implement.html
@@ -0,0 +1,69 @@
+page.title=Audio Implementation
+@jd:body
+
+<!--
+ Copyright 2015 The Android Open Source Project
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
+-->
+<div id="qv-wrapper">
+ <div id="qv">
+ <h2>In this document</h2>
+ <ol id="auto-toc">
+ </ol>
+ </div>
+</div>
+
+<p>This section explains how to implement the audio Hardware Abstraction Layer
+(HAL), provides details about configuring an audio policy (file formats, code
+organization, pre-processing effects), and describes how to configure the shared
+library (creating the <code>Android.mk</code> file).</p>
+
+<h2 id=implementing>Implementing the audio HAL</h2>
+
+<p>The audio HAL is composed of the following interfaces:</p>
+
+<ul>
+<li><code>hardware/libhardware/include/hardware/audio.h</code>. Represents the
+main functions of an audio device.</li>
+<li><code>hardware/libhardware/include/hardware/audio_effect.h</code>.
+Represents effects that can be applied to audio such as downmixing, echo, or
+noise suppression.</li>
+</ul>
+
+<p>You must implement all interfaces.</p>
+
+<h2 id=headers>Audio header files</h2>
+<p>For a reference of the properties you can define, refer to the audio header
+files:</p>
+
+<ul>
+<li>In Android 6.0 and higher, see
+<code>system/media/audio/include/system/audio.h</code>.</li>
+<li>In Android 5.1 and lower, see
+<code>system/core/include/system/audio.h</code>.</li>
+</ul>
+
+<p>For an example, refer to the implementation for the Galaxy Nexus at
+<code>device/samsung/tuna/audio</code>.</p>
+
+<h2 id=next-steps>Next steps</h2>
+
+<p>In addition to implementing the audio HAL, you must also create an
+<a href="{@docRoot}devices/audio/implement-policy.html">audio policy
+configuration file</a> that describes your audio topology and package the HAL
+implementation into a
+<a href="{@docRoot}devices/audio/implement-shared-library.html">shared
+library</a>. You can also configure
+<a href="{@docRoot}devices/audio/implement-pre-processing.html">pre-processing
+effects</a> such as automatic gain control and noise suppression.</p>
diff --git a/en/devices/audio/index.html b/en/devices/audio/index.html
new file mode 100644
index 00000000..82a3886d
--- /dev/null
+++ b/en/devices/audio/index.html
@@ -0,0 +1,122 @@
+page.title=Audio
+@jd:body
+
+<!--
+ Copyright 2015 The Android Open Source Project
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
+-->
+
+<img style="float: right; margin: 0px 15px 15px 15px;"
+src="images/ape_fwk_hal_audio.png" alt="Android Audio HAL icon"/>
+
+<p>
+Android's audio Hardware Abstraction Layer (HAL) connects the higher-level,
+audio-specific framework APIs in <a href="http://developer.android.com/reference/android/media/package-summary.html">android.media</a> to the underlying audio driver and
+hardware. This section includes implementation instructions and tips for
+improving performance.
+</p>
+
+<h2 id="Architecture">Audio Architecture</h2>
+<p>
+Android audio architecture defines how audio functionality is implemented and
+points to the relevant source code involved in the implementation.
+</p>
+
+<img src="images/ape_fwk_audio.png" alt="Audio architecture" id="figure1" />
+
+<p class="img-caption">
+<strong>Figure 1.</strong> Android audio architecture
+</p>
+
+<dl>
+
+<dt>
+Application framework
+</dt>
+<dd>
+The application framework includes the app code, which uses the <a href="http://developer.android.com/reference/android/media/package-summary.html">android.media</a> APIs to
+interact with audio hardware. Internally, this code calls corresponding JNI glue
+classes to access the native code that interacts with audio hardware.
+</dd>
+
+<dt>
+JNI
+</dt>
+<dd>
+The JNI code associated with <a href="http://developer.android.com/reference/android/media/package-summary.html">android.media</a> calls lower level native code to access audio
+hardware. JNI is located in <code>frameworks/base/core/jni/</code> and
+<code>frameworks/base/media/jni</code>.
+</dd>
+
+<dt>
+Native framework
+</dt>
+<dd>
+The native framework provides a native equivalent to the <a href="http://developer.android.com/reference/android/media/package-summary.html">android.media</a> package, calling
+Binder IPC proxies to access the audio-specific services of the media server.
+Native framework code is located in <code>frameworks/av/media/libmedia</code>.
+</dd>
+
+<dt>
+Binder IPC
+</dt>
+<dd>
+Binder IPC proxies facilitate communication over process boundaries. Proxies are
+located in <code>frameworks/av/media/libmedia</code> and begin with the letter
+"I".
+</dd>
+
+<dt>
+Media server
+</dt>
+<dd>
+The media server contains audio services, which are the actual code that
+interacts with your HAL implementations. The media server is located in
+<code>frameworks/av/services/audioflinger</code>.
+</dd>
+
+<dt>
+HAL
+</dt>
+<dd>
+The HAL defines the standard interface that audio services call into and that
+you must implement for your audio hardware to function correctly. The audio HAL
+interfaces are located in <code>hardware/libhardware/include/hardware</code>.
+For details, see <a
+href="{@docRoot}devices/halref/audio_8h_source.html">hardware/audio.h</a>.
+</dd>
+
+<dt>
+Kernel driver
+</dt>
+<dd>
+The audio driver interacts with your hardware and HAL implementation. You can
+use Advanced Linux Sound Architecture (ALSA), Open Sound System (OSS), or a
+custom driver (HAL is driver-agnostic).
+<p class="note"><strong>Note</strong>: If you use ALSA, we recommend
+<code>external/tinyalsa</code> for the user portion of the driver because of its
+compatible licensing (the standard user-mode library is GPL-licensed).</p>
+</dd>
+
+<dt>
+Android native audio based on OpenSL ES <em>(not shown)</em>
+</dt>
+<dd>
+This API is exposed as part of
+<a href="https://developer.android.com/tools/sdk/ndk/index.html">Android NDK</a>
+and is at the same architecture level as
+<a href="http://developer.android.com/reference/android/media/package-summary.html">android.media</a>.
+</dd>
+
+</dl>
diff --git a/en/devices/audio/latency.html b/en/devices/audio/latency.html
new file mode 100644
index 00000000..a45bf203
--- /dev/null
+++ b/en/devices/audio/latency.html
@@ -0,0 +1,57 @@
+page.title=Audio Latency
+@jd:body
+
+<!--
+ Copyright 2013 The Android Open Source Project
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
+-->
+<p>Audio latency is the time delay as an audio signal passes through a system.
+</p>
+
+<h3 id="resources">Resources</h3>
+
+<table>
+<tr>
+ <th>Topic</th>
+ <th>Links</th>
+</tr>
+<tr>
+ <td>Description of audio latency for purposes of Android compatibility</td>
+ <td><a href="{@docRoot}compatibility/android-cdd.pdf">Android CDD</a><br /><em>section 5.5 Audio Latency</em></td>
+</tr>
+<tr>
+ <td>Common causes of audio latency</td>
+ <td><a href="latency_contrib.html">Contributors to Audio Latency</a></td>
+</tr>
+<tr>
+ <td>Android's audio latency-reduction efforts</td>
+ <td><a href="latency_design.html">Design For Reduced Latency</a></td>
+</tr>
+<tr>
+ <td>Techniques to measure audio latency</td>
+ <td>
+ <a href="latency_measure.html">Measuring Audio Latency</a><br />
+ <a href="testing_circuit.html">Light Testing Circuit</a><br />
+ <a href="loopback.html">Audio Loopback Dongle</a>
+ </td>
+</tr>
+<tr>
+ <td>Round-trip audio latency results</td>
+ <td><a href="latency_measurements.html">Audio Latency Measurements</a></td>
+</tr>
+<tr>
+ <td>Applications</td>
+ <td><a href="latency_app.html">Audio Latency for App Developers</a></td>
+</tr>
+</table>
diff --git a/en/devices/audio/latency_app.html b/en/devices/audio/latency_app.html
new file mode 100644
index 00000000..9505f9b0
--- /dev/null
+++ b/en/devices/audio/latency_app.html
@@ -0,0 +1,180 @@
+page.title=Audio Latency for App Developers
+@jd:body
+
+<!--
+ Copyright 2015 The Android Open Source Project
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
+-->
+
+<div id="qv-wrapper">
+ <div id="qv">
+ <h2>In this document</h2>
+ <ol id="auto-toc">
+ </ol>
+ </div>
+</div>
+
+<p>For the lowest audio latency possible, we recommend you use Android native audio
+based on OpenSL ES 1.0.1.</p>
+
+<h2 id="implementation">Implementation checklist</h2>
+
+<p>To use Android native audio:</p>
+
+<ol>
+
+<li>
+Download and install the
+<a href="https://developer.android.com/tools/sdk/ndk/index.html">Android NDK</a>.
+In the rest of this document, we'll assume <code>NDKroot</code> is the
+directory where you installed NDK.
+</li>
+
+<li>
+Read the <a href="#supporting">supporting documentation.</a>
+</li>
+
+<li>
+Check for API level 9 or higher.
+</li>
+
+<li>
+Check for feature
+<a href="http://developer.android.com/guide/topics/manifest/uses-feature-element.html#hw-features">android.hardware.audio.low_latency.</a>
+</li>
+
+<li>
+Use the recommended native buffer size and sample rate returned by
+<a href="http://developer.android.com/reference/android/media/AudioManager.html#getProperty(java.lang.String)">android.media.AudioManager.getProperty(java.lang.String)</a>
+<p> <strong>Note</strong>: the same buffer size and sample rate should also be used for input.</p>
+</li>
+
+<li>
+Usually an OpenSL ES buffer count of 1 is sufficient.
+</li>
+
+<li>
+Keep your callback handlers short, without bursty CPU usage or unbounded blocking. Avoid
+<a href="avoiding_pi.html">priority inversion.</a>
+</li>
+
+<li>
+Consider using
+<a href="avoiding_pi.html#nonBlockingAlgorithms">non-blocking algorithms</a>
+to communicate between input and output callback handlers,
+and between the callback handlers and the rest of your application.
+</li>
+
+</ol>
+
+<h2 id="supporting">Supporting documentation</h2>
+
+<h3 id="opensl_es_1_0_1">OpenSL ES 1.0.1</h3>
+
+<p>
+Use a PDF viewer to review the
+<a href="https://www.khronos.org/registry/sles/specs/OpenSL_ES_Specification_1.0.1.pdf">OpenSL 1.0.1 Specification.</a>
+This is a rather long reference, and not all of it will be relevant to you, but
+you will need to consult it for details on the API.
+</p>
+
+<p class="note">
+<strong>Note</strong>: this document describes the full OpenSL ES 1.0.1, but Android
+native audio is actually based on a subset of OpenSL ES 1.0.1 with some Android-specific extensions.
+</p>
+
+<p>
+Documents describing later versions of OpenSL ES, such as 1.1,
+are not relevant to Android.
+</p>
+
+<h3 id="opensl_es_for_android">OpenSL ES for Android</h3>
+
+<p>
+The document "OpenSL ES for Android" is provided in the NDK installation,
+and is not currently available online. Open this link in a browser:
+</p>
+
+<pre>
+NDKroot/docs/Additional_library_docs/opensles/index.html
+</pre>
+
+<p>
+You'll want to skim the whole
+document, but pay special attention to the "Performance" subsection of the
+"Programming notes" section.
+</p>
+
+<p>
+Section "Supported features from OpenSL ES 1.0.1"
+describes the subset supported by Android.
+</p>
+
+<p>
+Section "Android extensions" describes Android-specific extensions
+that aren't included in base OpenSL ES 1.0.1.
+</p>
+
+<h3 id="relationship">Relationship with OpenSL ES 1.0.1</h3>
+
+<p>
+This Venn diagram shows the relationship between
+Android native audio and OpenSL ES 1.0.1.
+</p>
+
+<img src="images/venn.png" alt="Venn diagram" id="figure1" />
+<p class="img-caption">
+ <strong>Figure 1.</strong> Venn diagram
+</p>
+
+<h2 id="resources">Other resources</h2>
+
+<h3 id="source_android_com">source.android.com</h3>
+
+<p>
+The site <a href="{@docRoot}">source.android.com</a>
+is primarily designed for OEMs building Android
+devices, and the SoC vendors who supply components to these OEMs.
+</p>
+
+<p>
+However, there is a wealth of useful information about latency at this site, so
+you may want to review it. See the articles at
+<a href="latency.html">Audio Latency.</a>
+</p>
+
+<h3 id="android_ndk">android-ndk</h3>
+
+<p>
+If you have questions about how to use Android native audio, you can ask at the discussion group
+<a href="https://groups.google.com/forum/#!forum/android-ndk">android-ndk.</a>
+</p>
+
+<h3 id="videos">Videos</h3>
+
+<dl>
+
+<dt><a href="https://youtu.be/d3kfEeMZ65c">High performance audio on Android</a>
+(Google I/O 2013)</dt>
+<dd>The whole video is about latency.</dd>
+
+<dt><a href="https://youtu.be/92fgcUNCHic">Building great multi-media experiences on Android</a>
+(Google I/O 2014)</dt>
+<dd>The first 14 minutes are about audio in general and input latency in particular.</dd>
+
+<dt><a href="https://youtu.be/PnDK17zP9BI">Audio latency: buffer sizes</a>
+(100 Days of Google Dev)</dt>
+<dd>Describes the relationship between audio latency, buffer sizes, and task scheduling.</dd>
+
+</dl>
diff --git a/en/devices/audio/latency_contrib.html b/en/devices/audio/latency_contrib.html
new file mode 100644
index 00000000..2969ba20
--- /dev/null
+++ b/en/devices/audio/latency_contrib.html
@@ -0,0 +1,220 @@
+page.title=Contributors to Audio Latency
+@jd:body
+
+<!--
+ Copyright 2013 The Android Open Source Project
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
+-->
+<div id="qv-wrapper">
+ <div id="qv">
+ <h2>In this document</h2>
+ <ol id="auto-toc">
+ </ol>
+ </div>
+</div>
+
+<p>
+ This page focuses on the contributors to output latency,
+ but a similar discussion applies to input latency.
+</p>
+<p>
+ Assuming the analog circuitry does not contribute significantly, the major
+ surface-level contributors to audio latency are the following:
+</p>
+
+<ul>
+ <li>Application</li>
+ <li>Total number of buffers in pipeline</li>
+ <li>Size of each buffer, in frames</li>
+ <li>Additional latency after the app processor, such as from a DSP</li>
+</ul>
+
+<p>
+ As accurate as the above list of contributors may be, it is also misleading.
+ The reason is that buffer count and buffer size are more of an
+ <em>effect</em> than a <em>cause</em>. What usually happens is that
+ a given buffer scheme is implemented and tested, but during testing, an audio
+ underrun or overrun is heard as a "click" or "pop." To compensate, the
+ system designer then increases buffer sizes or buffer counts.
+ This has the desired result of eliminating the underruns or overruns, but it also
+ has the undesired side effect of increasing latency.
+ For more information about buffer sizes, see the video
+ <a href="https://youtu.be/PnDK17zP9BI">Audio latency: buffer sizes</a>.
+
+</p>
+
+<p>
+ A better approach is to understand the causes of the
+ underruns and overruns, and then correct those. This eliminates the
+ audible artifacts and may permit even smaller or fewer buffers
+ and thus reduce latency.
+</p>
+
+<p>
+ In our experience, the most common causes of underruns and overruns include:
+</p>
+<ul>
+ <li>Linux CFS (Completely Fair Scheduler)</li>
+ <li>high-priority threads with SCHED_FIFO scheduling</li>
+ <li>priority inversion</li>
+ <li>long scheduling latency</li>
+ <li>long-running interrupt handlers</li>
+ <li>long interrupt disable time</li>
+ <li>power management</li>
+ <li>security kernels</li>
+</ul>
+
+<h3 id="linuxCfs">Linux CFS and SCHED_FIFO scheduling</h3>
+<p>
+ The Linux CFS is designed to be fair to competing workloads sharing a common CPU
+ resource. This fairness is represented by a per-thread <em>nice</em> parameter.
+ The nice value ranges from -20 (least nice, or most CPU time allocated)
+ to 19 (nicest, or least CPU time allocated). In general, all threads with a given
+ nice value receive approximately equal CPU time and threads with a
+ numerically lower nice value should expect to
+ receive more CPU time. However, CFS is "fair" only over relatively long
+ periods of observation. Over short-term observation windows,
+ CFS may allocate the CPU resource in unexpected ways. For example, it
+ may take the CPU away from a thread with numerically low niceness
+ and give it to a thread with numerically high niceness. In the case of audio,
+ this can result in an underrun or overrun.
+</p>
+
+<p>
+ The obvious solution is to avoid CFS for high-performance audio
+ threads. Beginning with Android 4.1, such threads now use the
+ <code>SCHED_FIFO</code> scheduling policy rather than the <code>SCHED_NORMAL</code> (also called
+ <code>SCHED_OTHER</code>) scheduling policy implemented by CFS.
+</p>
+
+<h3 id="schedFifo">SCHED_FIFO priorities</h3>
+<p>
+ Though the high-performance audio threads now use <code>SCHED_FIFO</code>, they
+ are still susceptible to other higher priority <code>SCHED_FIFO</code> threads.
+ These are typically kernel worker threads, but there may also be a few
+ non-audio user threads with policy <code>SCHED_FIFO</code>. The available <code>SCHED_FIFO</code>
+ priorities range from 1 to 99. The audio threads run at priority
+ 2 or 3. This leaves priority 1 available for lower priority threads,
+ and priorities 4 to 99 for higher priority threads. We recommend
+ you use priority 1 whenever possible, and reserve priorities 4 to 99 for
+ those threads that are guaranteed to complete within a bounded amount
+ of time, execute with a period shorter than the period of audio threads,
+ and are known to not interfere with scheduling of audio threads.
+</p>
+
+<h3 id="rms">Rate-monotonic scheduling</h3>
+<p>
+ For more information on the theory of assignment of fixed priorities,
+ see the Wikipedia article
+ <a href="http://en.wikipedia.org/wiki/Rate-monotonic_scheduling">Rate-monotonic scheduling</a> (RMS).
+ A key point is that fixed priorities should be allocated strictly based on period,
+ with higher priorities assigned to threads of shorter periods, not based on perceived "importance."
+ Non-periodic threads may be modeled as periodic threads, using the maximum frequency of execution
+ and maximum computation per execution. If a non-periodic thread cannot be modeled as
+ a periodic thread (for example it could execute with unbounded frequency or unbounded computation
+ per execution), then it should not be assigned a fixed priority as that would be incompatible
+ with the scheduling of true periodic threads.
+</p>
+
+<h3 id="priorityInversion">Priority inversion</h3>
+<p>
+ <a href="http://en.wikipedia.org/wiki/Priority_inversion">Priority inversion</a>
+ is a classic failure mode of real-time systems,
+ where a higher-priority task is blocked for an unbounded time waiting
+ for a lower-priority task to release a resource such as (shared
+ state protected by) a
+ <a href="http://en.wikipedia.org/wiki/Mutual_exclusion">mutex</a>.
+ See the article "<a href="avoiding_pi.html">Avoiding priority inversion</a>" for techniques to
+ mitigate it.
+</p>
+
+<h3 id="schedLatency">Scheduling latency</h3>
+<p>
+ Scheduling latency is the time between when a thread becomes
+ ready to run and when the resulting context switch completes so that the
+ thread actually runs on a CPU. The shorter the latency the better, and
+ anything over two milliseconds causes problems for audio. Long scheduling
+ latency is most likely to occur during mode transitions, such as
+ bringing up or shutting down a CPU, switching between a security kernel
+ and the normal kernel, switching from full power to low-power mode,
+ or adjusting the CPU clock frequency and voltage.
+</p>
+
+<h3 id="interrupts">Interrupts</h3>
+<p>
+ In many designs, CPU 0 services all external interrupts. So a
+ long-running interrupt handler may delay other interrupts, in particular
+ audio direct memory access (DMA) completion interrupts. Design interrupt handlers
+ to finish quickly and defer lengthy work to a thread (preferably
+ a CFS thread or <code>SCHED_FIFO</code> thread of priority 1).
+</p>
+
+<p>
+ Similarly, disabling interrupts on CPU 0 for a long period
+ has the same effect of delaying the servicing of audio interrupts.
+ Long interrupt disable times typically happen while waiting for a kernel
+ <i>spin lock</i>. Review these spin locks to ensure they are bounded.
+</p>
+
+<h3 id="power">Power, performance, and thermal management</h3>
+<p>
+ <a href="http://en.wikipedia.org/wiki/Power_management">Power management</a>
+ is a broad term that encompasses efforts to monitor
+ and reduce power consumption while optimizing performance.
+ <a href="http://en.wikipedia.org/wiki/Thermal_management_of_electronic_devices_and_systems">Thermal management</a>
+ and <a href="http://en.wikipedia.org/wiki/Computer_cooling">computer cooling</a>
+ are similar but seek to measure and control heat to avoid damage due to excess heat.
+ In the Linux kernel, the CPU
+ <a href="http://en.wikipedia.org/wiki/Governor_%28device%29">governor</a>
+ is responsible for low-level policy, while user mode configures high-level policy.
+ Techniques used include:
+</p>
+
+<ul>
+ <li>dynamic voltage scaling</li>
+ <li>dynamic frequency scaling</li>
+ <li>dynamic core enabling</li>
+ <li>cluster switching</li>
+ <li>power gating</li>
+ <li>hotplug (hotswap)</li>
+ <li>various sleep modes (halt, stop, idle, suspend, etc.)</li>
+ <li>process migration</li>
+ <li><a href="http://en.wikipedia.org/wiki/Processor_affinity">processor affinity</a></li>
+</ul>
+
+<p>
+ Some management operations can result in "work stoppages" or
+ times during which there is no useful work performed by the application processor.
+ These work stoppages can interfere with audio, so such management should be designed
+ for an acceptable worst-case work stoppage while audio is active.
+ Of course, when thermal runaway is imminent, avoiding permanent damage
+ is more important than audio!
+</p>
+
+<h3 id="security">Security kernels</h3>
+<p>
+ A <a href="http://en.wikipedia.org/wiki/Security_kernel">security kernel</a> for
+ <a href="http://en.wikipedia.org/wiki/Digital_rights_management">Digital rights management</a>
+ (DRM) may run on the same application processor core(s) as those used
+ for the main operating system kernel and application code. Any time
+ during which a security kernel operation is active on a core is effectively a
+ stoppage of ordinary work that would normally run on that core.
+ In particular, this may include audio work. By its nature, the internal
+ behavior of a security kernel is inscrutable from higher-level layers, and thus
+ any performance anomalies caused by a security kernel are especially
+ pernicious. For example, security kernel operations do not typically appear in
+ context switch traces. We call this "dark time" &mdash; time that elapses
+ yet cannot be observed. Security kernels should be designed for an
+ acceptable worst-case work stoppage while audio is active.
+</p>
diff --git a/en/devices/audio/latency_design.html b/en/devices/audio/latency_design.html
new file mode 100644
index 00000000..c931fba2
--- /dev/null
+++ b/en/devices/audio/latency_design.html
@@ -0,0 +1,236 @@
+page.title=Design For Reduced Latency
+@jd:body
+
+<!--
+ Copyright 2013 The Android Open Source Project
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
+-->
+<div id="qv-wrapper">
+ <div id="qv">
+ <h2>In this document</h2>
+ <ol id="auto-toc">
+ </ol>
+ </div>
+</div>
+
+<p>
+The Android 4.1 release introduced internal framework changes for
+a <a href="http://en.wikipedia.org/wiki/Low_latency">lower latency</a>
+audio output path. There were minimal public client API
+or HAL API changes. This document describes the initial design,
+which has continued to evolve over time.
+Having a good understanding of this design should help device OEMs and
+SoC vendors implement the design correctly on their particular devices
+and chipsets. This article is not intended for application developers.
+</p>
+
+<h2 id="trackCreation">Track Creation</h2>
+
+<p>
+The client can optionally set the <code>AUDIO_OUTPUT_FLAG_FAST</code> bit in the
+<code>audio_output_flags_t</code> parameter of the AudioTrack C++ constructor or
+<code>AudioTrack::set()</code>. Currently the only clients that do so are:
+</p>
+
+<ul>
+<li>Android native audio based on OpenSL ES</li>
+<li><a href="http://developer.android.com/reference/android/media/SoundPool.html">android.media.SoundPool</a></li>
+<li><a href="http://developer.android.com/reference/android/media/ToneGenerator.html">android.media.ToneGenerator</a></li>
+</ul>
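+
+<p>
+For illustration, a native client might request the fast path as sketched below.
+This is not taken from the platform sources; the exact AudioTrack constructor
+parameter list varies by release (for example,
+<code>frameworks/av/include/media/AudioTrack.h</code> in older branches), and the
+callback and sample rate here are assumptions.
+</p>
+
+<pre>
+sp&lt;AudioTrack&gt; track = new AudioTrack(
+        AUDIO_STREAM_MUSIC,         // stream type
+        44100,                      // sample rate; a fast track must match the sink rate
+        AUDIO_FORMAT_PCM_16_BIT,    // sample format
+        AUDIO_CHANNEL_OUT_STEREO,   // channel mask
+        0,                          // frame count: 0 lets the server choose
+        AUDIO_OUTPUT_FLAG_FAST,     // the hint described above
+        myAudioCallback,            // callback that supplies audio data
+        nullptr);                   // cookie passed to the callback
+// If the request is denied at the client or server level, the track still
+// plays, but as a "normal track" with higher latency.
+</pre>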
+
+<p>
+The AudioTrack C++ implementation reviews the <code>AUDIO_OUTPUT_FLAG_FAST</code>
+request and may deny it at the client level. If it
+decides to pass the request on, it does so using the <code>TRACK_FAST</code> bit of
+the <code>track_flags_t</code> parameter of the <code>IAudioTrack</code> factory method
+<code>IAudioFlinger::createTrack()</code>.
+</p>
+
+<p>
+The AudioFlinger audio server reviews the <code>TRACK_FAST</code> request and may
+deny it at the server level. It informs the client
+whether the request was accepted via the <code>CBLK_FAST</code> bit of the
+shared memory control block.
+</p>
+
+<p>
+The factors that impact the decision include:
+</p>
+
+<ul>
+<li>Presence of a fast mixer thread for this output (see below)</li>
+<li>Track sample rate</li>
+<li>Presence of a client thread to execute callback handlers for this track</li>
+<li>Track buffer size</li>
+<li>Available fast track slots (see below)</li>
+</ul>
+
+<p>
+If the client's request was accepted, it is called a "fast track."
+Otherwise it's called a "normal track."
+</p>
+
+<h2 id="mixerThreads">Mixer Threads</h2>
+
+<p>
+At the time AudioFlinger creates a normal mixer thread, it decides
+whether or not to also create a fast mixer thread. Neither the normal
+mixer nor the fast mixer is associated with a particular track;
+each operates on a set of tracks. There is always a normal mixer
+thread. The fast mixer thread, if it exists, is subservient to the
+normal mixer thread and acts under its control.
+</p>
+
+<h3 id="fastMixer">Fast mixer</h3>
+
+<h4>Features</h4>
+
+<p>
+The fast mixer thread provides these features:
+</p>
+
+<ul>
+<li>Mixing of the normal mixer's sub-mix and up to 7 client fast tracks</li>
+<li>Per track attenuation</li>
+</ul>
+
+<p>
+Omitted features:
+</p>
+
+<ul>
+<li>Per track sample rate conversion</li>
+<li>Per track effects</li>
+<li>Per mix effects</li>
+</ul>
+
+<h4>Period</h4>
+
+<p>
+The fast mixer runs periodically, with a recommended period of two
+to three milliseconds (ms), or a slightly higher period of five ms if needed for scheduling stability.
+This number was chosen so that, accounting for the complete
+buffer pipeline, the total latency is on the order of 10 ms. Smaller
+values are possible but may result in increased power consumption
+and chance of glitches depending on CPU scheduling predictability.
+Larger values are possible, up to 20 ms, but result in degraded
+total latency and so should be avoided.
+</p>
+
+<h4>Scheduling</h4>
+
+<p>
+The fast mixer runs at elevated <code>SCHED_FIFO</code> priority. It needs very
+little CPU time, but must run often and with low scheduling jitter.
+<a href="http://en.wikipedia.org/wiki/Jitter">Jitter</a>
+expresses the variation in cycle time: it is the difference between
+the actual and expected cycle times.
+Running too late will result in glitches due to underrun. Running
+too early will result in glitches due to pulling from a fast track
+before the track has provided data.
+</p>
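+
+<p>
+A minimal sketch of how a periodic thread might track its own jitter against
+<code>CLOCK_MONOTONIC</code>; the expected period and what is done with the result
+are assumptions for illustration.
+</p>
+
+<pre>
+#include &lt;time.h&gt;
+
+static double nowMs() {
+    timespec ts;
+    clock_gettime(CLOCK_MONOTONIC, &amp;ts);
+    return ts.tv_sec * 1000.0 + ts.tv_nsec / 1.0e6;
+}
+
+void onMixerCycle(double expectedPeriodMs) {
+    static double lastMs = nowMs();
+    double t = nowMs();
+    double jitterMs = (t - lastMs) - expectedPeriodMs;  // positive = late, negative = early
+    lastMs = t;
+    // Record jitterMs in a histogram; both tails can cause glitches.
+}
+</pre>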
+
+<h4>Blocking</h4>
+
+<p>
+Ideally the fast mixer thread never blocks, other than at HAL
+<code>write()</code>. Other occurrences of blocking within the fast mixer are
+considered bugs. In particular, mutexes are avoided.
+Instead, <a href="http://en.wikipedia.org/wiki/Non-blocking_algorithm">non-blocking algorithms</a>
+(also known as lock-free algorithms) are used.
+See <a href="avoiding_pi.html">Avoiding Priority Inversion</a> for more on this topic.
+</p>
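+
+<p>
+The following is a minimal single-writer, single-reader lock-free FIFO sketch,
+in the same spirit as (but much simpler than) the non-blocking FIFO in the
+<code>audio_utils</code> library; it is illustrative, not the platform implementation.
+</p>
+
+<pre>
+#include &lt;atomic&gt;
+#include &lt;cstddef&gt;
+
+class SpscFifo {
+  public:
+    bool push(float item) {                 // called only by the writer thread
+        size_t w = mWrite.load(std::memory_order_relaxed);
+        if (w - mRead.load(std::memory_order_acquire) == kCapacity) {
+            return false;                   // full; the writer does not block
+        }
+        mBuffer[w &amp; (kCapacity - 1)] = item;
+        mWrite.store(w + 1, std::memory_order_release);
+        return true;
+    }
+    bool pop(float* item) {                 // called only by the reader thread
+        size_t r = mRead.load(std::memory_order_relaxed);
+        if (r == mWrite.load(std::memory_order_acquire)) {
+            return false;                   // empty; the reader does not block
+        }
+        *item = mBuffer[r &amp; (kCapacity - 1)];
+        mRead.store(r + 1, std::memory_order_release);
+        return true;
+    }
+  private:
+    static const size_t kCapacity = 256;    // must be a power of two
+    float mBuffer[kCapacity];
+    std::atomic&lt;size_t&gt; mWrite{0};
+    std::atomic&lt;size_t&gt; mRead{0};
+};
+</pre>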
+
+<h4>Relationship to other components</h4>
+
+<p>
+The fast mixer has little direct interaction with clients. In
+particular, it does not see binder-level operations, but it does
+access the client's shared memory control block.
+</p>
+
+<p>
+The fast mixer receives commands from the normal mixer via a state queue.
+</p>
+
+<p>
+Other than pulling track data, interaction with clients is via the normal mixer.
+</p>
+
+<p>
+The fast mixer's primary sink is the audio HAL.
+</p>
+
+<h3 id="normalMixer">Normal mixer</h3>
+
+<h4>Features</h4>
+
+<p>
+All features are enabled:
+</p>
+
+<ul>
+<li>Up to 32 tracks</li>
+<li>Per track attenuation</li>
+<li>Per track sample rate conversion</li>
+<li>Effects processing</li>
+</ul>
+
+<h4>Period</h4>
+
+<p>
+The period is computed to be the first integral multiple of the
+fast mixer period that is at least 20 ms.
+</p>
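+
+<p>
+A small sketch of that rule; the function name is ours, not the platform's.
+</p>
+
+<pre>
+// Returns the first integral multiple of the fast mixer period that is
+// at least 20 ms. A 3 ms fast mixer period yields 21 ms; 5 ms yields 20 ms.
+int normalMixerPeriodMs(int fastMixerPeriodMs) {
+    int period = fastMixerPeriodMs;
+    while (period &lt; 20) {
+        period += fastMixerPeriodMs;
+    }
+    return period;
+}
+</pre>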
+
+<h4>Scheduling</h4>
+
+<p>
+The normal mixer runs at elevated <code>SCHED_OTHER</code> priority.
+</p>
+
+<h4>Blocking</h4>
+
+<p>
+The normal mixer is permitted to block, and often does so at various
+mutexes as well as at a blocking pipe to write its sub-mix.
+</p>
+
+<h4>Relationship to other components</h4>
+
+<p>
+The normal mixer interacts extensively with the outside world,
+including binder threads, audio policy manager, fast mixer thread,
+and client tracks.
+</p>
+
+<p>
+The normal mixer's sink is a blocking pipe to the fast mixer's track 0.
+</p>
+
+<h2 id="flags">Flags</h2>
+
+<p>
+The <code>AUDIO_OUTPUT_FLAG_FAST</code> bit is a hint. There is no guarantee the
+request will be fulfilled.
+</p>
+
+<p>
+<code>AUDIO_OUTPUT_FLAG_FAST</code> is a client-level concept. It does not appear
+in the server.
+</p>
+
+<p>
+<code>TRACK_FAST</code> is a client -&gt; server concept.
+</p>
diff --git a/en/devices/audio/latency_measure.html b/en/devices/audio/latency_measure.html
new file mode 100644
index 00000000..cf974bd2
--- /dev/null
+++ b/en/devices/audio/latency_measure.html
@@ -0,0 +1,239 @@
+page.title=Measuring Audio Latency
+@jd:body
+
+<!--
+ Copyright 2013 The Android Open Source Project
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
+-->
+<div id="qv-wrapper">
+ <div id="qv">
+ <h2>In this document</h2>
+ <ol id="auto-toc">
+ </ol>
+ </div>
+</div>
+
+<p>
+ This page describes common methods for measuring input and output latency.
+</p>
+
+
+
+<h2 id="measuringOutput">Measuring Output Latency</h2>
+
+<p>
+ There are several techniques available to measure output latency,
+ with varying degrees of accuracy and ease of running, described below. Also
+see the <a href="testing_circuit.html">Testing circuit</a> for an example test environment.
+</p>
+
+<h3 id="ledTest">LED and oscilloscope test</h3>
+<p>
+This test measures latency in relation to the device's LED indicator.
+If your production device does not have an LED, you can install an
+ LED on a prototype form factor device. For even better accuracy
+ on prototype devices with exposed circuitry, connect one
+ oscilloscope probe to the LED directly to bypass the light
+ sensor latency.
+ </p>
+
+<p>
+ If you cannot install an LED on either your production or prototype device,
+ try the following workarounds:
+</p>
+
+<ul>
+ <li>Use a General Purpose Input/Output (GPIO) pin for the same purpose.</li>
+ <li>Use JTAG or another debugging port.</li>
+ <li>Use the screen backlight. This might be risky as the
+ backlight may have a non-negligible latency, and can contribute to
+ an inaccurate latency reading.
+ </li>
+</ul>
+
+<p>To conduct this test:</p>
+
+<ol>
+ <li>Run an app that periodically pulses the LED at
+ the same time it outputs audio.
+ <p class="note"><strong>Note:</strong> To get useful results, it is crucial to use the correct
+ APIs in the test app so that you're exercising the fast audio output path.
+ See <a href="latency_design.html">Design For Reduced Latency</a> for
+ background.</p>
+ </li>
+ <li>Place a light sensor next to the LED.</li>
+ <li>Connect the probes of a dual-channel oscilloscope to both the wired headphone
+ jack (line output) and light sensor.</li>
+ <li>Use the oscilloscope to measure
+ the time difference between observing the line output signal versus the light
+ sensor signal.</li>
+</ol>
+
+ <p>The difference in time is the approximate audio output latency,
+ assuming that the LED latency and light sensor latency are both zero.
+ Typically, the LED and light sensor each have a latency
+ on the order of one millisecond or less, which is low enough
+ to ignore.</p>
+
+<h2 id="measuringRoundTrip">Measuring Round-Trip Latency</h2>
+
+<p>
+ <a href="http://en.wikipedia.org/wiki/Round-trip_delay_time">Round-trip latency</a>
+ is the sum of output latency and input latency.
+</p>
+
+<h3 id="larsenTest">Larsen test</h3>
+<p>
+ One of the easiest latency tests is an audio feedback
+ (Larsen effect) test. This provides a crude measure of combined output
+ and input latency by timing an impulse response loop. By its nature, this test
+ is not very useful for detailed analysis, but it can be useful for
+ calibrating other tests and for establishing an upper bound.</p>
+
+<p>To conduct this test:</p>
+<ol>
+ <li>Run an app that captures audio from the microphone and immediately plays the
+ captured data back over the speaker.</li>
+ <li>Create a sound externally,
+ such as tapping a pencil by the microphone. This noise generates a feedback loop.
+ Alternatively, one can inject an impulse into the loop using software.</li>
+ <li>Measure the time between feedback pulses to get the sum of the output latency, input latency, and application overhead.</li>
+</ol>
+
+ <p>This method does not break down the
+ component times, which is important when the output latency
+ and input latency are independent. So this method is not recommended for measuring
+ precise output latency or input latency values in isolation, but might be useful
+ for establishing rough estimates.</p>
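+
+<p>
+ The following is an illustrative sketch of the measurement step: estimate
+ round-trip latency from a captured mono buffer by averaging the spacing
+ between successive feedback pulses. The detection threshold and the source
+ of the buffer are assumptions; real tools such as <code>slesTestFeedback.cpp</code>
+ (referenced below) are more robust.
+</p>
+
+<pre>
+double estimateRoundTripMs(const float* samples, int count, int sampleRate) {
+    const float threshold = 0.5f;        // assumed pulse-detection threshold
+    int previousPulse = -1;
+    int intervals = 0;
+    double sumFrames = 0.0;
+    for (int i = 1; i &lt; count; ++i) {
+        if (samples[i - 1] &lt; threshold) {
+            if (samples[i] &gt;= threshold) {      // rising edge: start of a pulse
+                if (previousPulse &gt;= 0) {
+                    sumFrames += i - previousPulse;
+                    ++intervals;
+                }
+                previousPulse = i;
+            }
+        }
+    }
+    if (intervals == 0) {
+        return -1.0;                      // not enough pulses detected
+    }
+    return 1000.0 * (sumFrames / intervals) / sampleRate;
+}
+</pre>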
+
+ <p>
+ Output latency to on-device speaker can be significantly larger than
+ output latency to headset connector. This is due to speaker correction and protection.
+ </p>
+
+<p>
+We have published an example implementation at
+<a href="https://android.googlesource.com/platform/frameworks/wilhelm/+/master/tests/examples/slesTestFeedback.cpp">slesTestFeedback.cpp</a>.
+This is a command-line app and is built using the platform build environment;
+however it should be straightforward to adapt the code to other environments.
+You will also need the <a href="avoiding_pi.html#nonBlockingAlgorithms">non-blocking</a> FIFO code
+located in the <code>audio_utils</code> library.
+</p>
+
+<h3 id="loopback">Audio Loopback Dongle</h3>
+
+<p>
+ The <a href="loopback.html">Dr. Rick O'Rang audio loopback dongle</a> is handy for
+ measuring round-trip latency over the headset connector.
+ The image below demonstrates the result of injecting an impulse
+ into the loop once, and then allowing the feedback loop to oscillate.
+ The period of the oscillations is the round-trip latency.
+ The specific device, software release, and
+ test conditions are not specified here. The results shown
+ should not be extrapolated.
+</p>
+
+<img src="images/round_trip.png" alt="round-trip measurement" id="figure1" />
+<p class="img-caption">
+ <strong>Figure 1.</strong> Round-trip measurement
+</p>
+
+<p>You may need to remove the USB cable to reduce noise,
+and adjust the volume level to get a stable oscillation.
+</p>
+
+<h2 id="measuringInput">Measuring Input Latency</h2>
+
+<p>
+ Input latency is more difficult to measure than output latency. The following
+ tests might help.
+</p>
+
+<p>
+One approach is to first determine the output latency
+ using the LED and oscilloscope method and then use
+ the audio feedback (Larsen) test to determine the sum of output
+ latency and input latency. The difference between these two
+ measurements is the input latency.
+</p>
+
+<p>
+ Another technique is to use a GPIO pin on a prototype device.
+ Externally, pulse a GPIO input at the same time that you present
+ an audio signal to the device. Run an app that compares the
+ difference in arrival times of the GPIO signal and audio data.
+</p>
+
+<h2 id="reducing">Reducing Latency</h2>
+
+<p>To achieve low audio latency, pay special attention throughout the
+system to scheduling, interrupt handling, power management, and device
+driver design. Your goal is to prevent any part of the platform from
+blocking a <code>SCHED_FIFO</code> audio thread for more than a couple
+of milliseconds. By adopting such a systematic approach, you can reduce
+audio latency and get the side benefit of more predictable performance
+overall.
+</p>
+
+
+ <p>
+ Audio underruns, when they do occur, are often detectable only under certain
+ conditions or only at the transitions. Try stressing the system by launching
+ new apps and scrolling quickly through various displays. But be aware
+ that some test conditions are so stressful as to be beyond the design
+ goals. For example, taking a bugreport puts such an enormous load on the
+ system that it may be acceptable to have an underrun in that case.
+</p>
+
+<p>
+ When testing for underruns:
+</p>
+ <ul>
+ <li>Configure any DSP after the app processor so that it adds
+ minimal latency.</li>
+ <li>Run tests under different conditions
+ such as having the screen on or off, USB plugged in or unplugged,
+ WiFi on or off, Bluetooth on or off, and telephony and data radios
+ on or off.</li>
+ <li>Select relatively quiet music that you're very familiar with, and in which
+ underruns are easy to hear.</li>
+ <li>Use wired headphones for extra sensitivity.</li>
+ <li>Give yourself breaks so that you don't experience "ear fatigue."</li>
+ </ul>
+
+<p>
+ Once you find and fix the underlying causes of underruns, reduce
+ the buffer counts and sizes to take advantage of the improvements.
+ Reducing buffer counts and sizes <i>before</i>
+ analyzing and fixing the causes of underruns only
+ results in frustration.
+</p>
+
+<h3 id="tools">Tools</h3>
+<p>
+ <code>systrace</code> is an excellent general-purpose tool
+ for diagnosing system-level performance glitches.
+</p>
+
+<p>
+ The output of <code>dumpsys media.audio_flinger</code> also contains a
+ useful section called "simple moving statistics." This has a summary
+ of the variability of elapsed times for each audio mix and I/O cycle.
+ Ideally, all the time measurements should be about equal to the mean or
+ nominal cycle time. A very low minimum or a very high maximum indicates
+ a problem, likely high scheduling latency or a long interrupt
+ disable time. The <i>tail</i> part of the output is especially helpful,
+ as it highlights the variability beyond +/- 3 standard deviations.
+</p>
diff --git a/en/devices/audio/latency_measurements.html b/en/devices/audio/latency_measurements.html
new file mode 100644
index 00000000..2811ae00
--- /dev/null
+++ b/en/devices/audio/latency_measurements.html
@@ -0,0 +1,474 @@
+page.title=Audio Latency Measurements
+@jd:body
+
+<!--
+ Copyright 2015 The Android Open Source Project
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
+-->
+<div id="qv-wrapper">
+ <div id="qv">
+ <h2>In this document</h2>
+ <ol id="auto-toc">
+ </ol>
+ </div>
+</div>
+
+<p>
+This article gives round-trip audio latency measurements for select devices and
+platform versions.
+</p>
+
+<h2 id="definition">Definition</h2>
+
+<p>
+<a href="http://en.wikipedia.org/wiki/Latency_%28engineering%29">Latency</a>
+is an important system performance metric. There are many kinds
+of <a href="latency.html">audio latency</a>
+metrics. One useful and well-understood metric is
+<a href="latency_measure.html#measuringRoundTrip">round-trip latency</a>.
+Round-trip latency is defined as the time it takes for
+an audio signal to enter the input of a mobile device, be processed
+by an app running on the application processor, and exit the output.
+</p>
+
+<img src="images/round_trip_on_device.png" alt="Round-trip audio latency on device" id="figure1" />
+<p class="img-caption">
+ <strong>Figure 1.</strong> Round-trip audio latency on device: T<sub>output</sub> - T<sub>input</sub>
+</p>
+
+<h2 id="why">Why we measure latency</h2>
+
+<p>
+We measure and report latency so Android
+app developers will have the data they need to make informed decisions about available
+latency on actual devices. By sharing these numbers for select Nexus devices, we also hope to
+encourage the entire Android community to measure, publish, and reduce
+latency on <i>all</i> devices.
+Please join us in our commitment to reducing audio latency.
+</p>
+
+<h2 id="app">Application impact on latency</h2>
+
+<p>
+There are two kinds of delays that a signal processing stage can add to latency:
+algorithmic delay and computational delay.
+Algorithmic delay is inherent and does not vary with the CPU.
+An example is the delay added by a
+<a href="http://en.wikipedia.org/wiki/Finite_impulse_response">finite impulse response</a>
+(FIR) filter.
+Computational delay is related to the number of CPU cycles required.
+For example, attenuation of a signal is usually done by a multiplication operation,
+and this multiplication will take a varying number of cycles depending on the CPU.
+</p>
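+
+<p>
+The contrast can be illustrated with two small sketches (not platform code):
+attenuation by a per-sample multiply adds computational delay only, while a
+linear-phase FIR filter also adds an algorithmic delay of roughly
+(taps - 1) / 2 samples regardless of CPU speed.
+</p>
+
+<pre>
+// Computational delay only: cost in CPU cycles, no inherent signal delay.
+void attenuate(float* buffer, int frames, float gain) {
+    for (int i = 0; i &lt; frames; ++i) {
+        buffer[i] *= gain;
+    }
+}
+
+// FIR filter: in addition to its CPU cost, a linear-phase filter of length
+// "taps" has a group delay of (taps - 1) / 2 samples.
+float firSample(const float* history, const float* coeffs, int taps) {
+    float acc = 0.0f;
+    for (int k = 0; k &lt; taps; ++k) {
+        acc += coeffs[k] * history[k];
+    }
+    return acc;
+}
+</pre>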
+
+<h2 id="how">How we measure</h2>
+
+<p>
+The measurements below were taken with the
+<a href="loopback.html">Dr. Rick O'Rang audio loopback dongle</a>
+and an
+<a href="latency_measure.html#larsenTest">audio feedback (Larsen effect) test</a>.
+</p>
+
+<p>
+For our measurements, we assume the application signal processing
+adds zero algorithmic delay and near zero computational delay.
+</p>
+
+<p>
+We measure round-trip latency via the headset connector for several reasons:
+</p>
+<ul>
+ <li>
+ There are important music applications, such as guitar and voice processing,
+ that use the headset connector.
+ </li>
+ <li>
+ Measuring round-trip latency of the on-device microphone and speaker can
+ be cumbersome, as it is difficult to keep a feedback loop in open air from entering
+ uncontrolled oscillation.
+ </li>
+ <li>
+ The on-device transducers are small and sacrifice frequency response
+ to achieve their small size. To compensate, digital signal processing is
+ applied, but this increases algorithmic delay on the on-device path.
+ </li>
+</ul>
+
+<p>
+There are cases where on-device microphone and speaker latencies
+<i>do</i>
+matter, but they are usually for one direction, not round-trip.
+Techniques for measuring unidirectional latency are described at
+<a href="latency_measure.html#measuringOutput">Measuring Output Latency</a>
+and
+<a href="latency_measure.html#measuringInput">Measuring Input Latency</a>.
+</p>
+
+<img src="images/round_trip_via_headset_connector.png" alt="Round-trip latency via headset connector" id="figure2" />
+<p class="img-caption">
+ <strong>Figure 2.</strong> Round-trip latency via headset connector: T<sub>output</sub> - T<sub>input</sub>
+</p>
+
+<h2 id="measurements">Example measurements</h2>
+
+<p>
+The measurements shown are specific to a
+<a href="{@docRoot}source/build-numbers.html">build number</a>.
+Devices are listed in approximate order of initial release and within device by platform version.
+The test application uses the Android native audio API based on OpenSL ES.
+</p>
+
+<table>
+<tr>
+ <th>Model</th>
+ <th>Platform<br />version</th>
+ <th>Build<br />number</th>
+ <th>Sample rate<br />(Hz)</th>
+ <th>Buffer size<br />(frames)</th>
+ <th>Buffer size<br />(ms)</th>
+ <th>Round-trip<br />latency (ms)<br />&plusmn; one buffer</th>
+</tr>
+
+<tr>
+ <td>Nexus One</td>
+ <td>2.3.6</td>
+ <td>GRK39F</td>
+ <td>44100</td>
+ <td>768</td>
+ <td>17.4</td>
+ <td>345</td>
+</tr>
+
+<tr>
+ <td>Nexus S</td>
+ <td>2.3.6</td>
+ <td>GRK39F</td>
+ <td>44100</td>
+ <td>1024</td>
+ <td>23.2</td>
+ <td>260</td>
+</tr>
+
+<tr>
+ <td>Nexus S</td>
+ <td>4.0.4</td>
+ <td>IMM76D</td>
+ <td>44100</td>
+ <td>1024</td>
+ <td>23.2</td>
+ <td>260</td>
+</tr>
+
+<tr>
+ <td>Nexus S</td>
+ <td>4.1.2</td>
+ <td>JZO54K</td>
+ <td>44100</td>
+ <td>880</td>
+ <td>20</td>
+ <td>210</td>
+</tr>
+
+<tr>
+ <td>Galaxy Nexus</td>
+ <td>4.0.1</td>
+ <td>ITL41D</td>
+ <td>44100</td>
+ <td>976</td>
+ <td>22.1</td>
+ <td>270</td>
+</tr>
+
+<tr>
+ <td>Galaxy Nexus</td>
+ <td>4.3</td>
+ <td>JWR66Y</td>
+ <td>44100</td>
+ <td>144</td>
+ <td>3.3</td>
+ <td>130</td>
+</tr>
+
+<tr>
+ <td>Nexus 4</td>
+ <td>4.2.2</td>
+ <td>JDQ39E</td>
+ <td>48000</td>
+ <td>240</td>
+ <td>5</td>
+ <td>195</td>
+</tr>
+
+<tr>
+ <td>Nexus 4</td>
+ <td>5.1</td>
+ <td>LMY47O</td>
+ <td>48000</td>
+ <td>240</td>
+ <td>5</td>
+ <td>58</td>
+</tr>
+
+<tr>
+ <td>Nexus 10</td>
+ <td>5.0.2</td>
+ <td>LRX22G</td>
+ <td>44100</td>
+ <td>256</td>
+ <td>5.8</td>
+ <td>36</td>
+</tr>
+
+<tr>
+ <td>Nexus 10</td>
+ <td>5.1</td>
+ <td>LMY47D</td>
+ <td>44100</td>
+ <td>256</td>
+ <td>5.8</td>
+ <td>35</td>
+</tr>
+
+<tr>
+ <td>Nexus 7<br />2013</td>
+ <td>4.3</td>
+ <td>JSR78D</td>
+ <td>48000</td>
+ <td>240</td>
+ <td>5</td>
+ <td>149</td>
+</tr>
+
+<tr>
+ <td>Nexus 7<br />2013</td>
+ <td>4.4</td>
+ <td>KRT16S</td>
+ <td>48000</td>
+ <td>240</td>
+ <td>5</td>
+ <td>85</td>
+</tr>
+
+<tr>
+ <td>Nexus 7<br />2013</td>
+ <td>5.0.2</td>
+ <td>LRX22G</td>
+ <td>48000</td>
+ <td>240</td>
+ <td>5</td>
+ <td>64</td>
+</tr>
+
+<tr>
+ <td>Nexus 7<br />2013</td>
+ <td>5.1</td>
+ <td>LMY47O</td>
+ <td>48000</td>
+ <td>240</td>
+ <td>5</td>
+ <td>55</td>
+</tr>
+
+<tr>
+ <td>Nexus 7<br />2013</td>
+ <td>6.0</td>
+ <td>MRA58K</td>
+ <td>48000</td>
+ <td>240</td>
+ <td>5</td>
+ <td>55</td>
+</tr>
+
+<tr>
+ <td>Nexus 5</td>
+ <td>4.4.4</td>
+ <td>KTU84P</td>
+ <td>48000</td>
+ <td>240</td>
+ <td>5</td>
+ <td>95</td>
+</tr>
+
+<tr>
+ <td>Nexus 5</td>
+ <td>5.0.0</td>
+ <td>LRX21O</td>
+ <td>48000</td>
+ <td>240</td>
+ <td>5</td>
+ <td>47</td>
+</tr>
+
+<tr>
+ <td>Nexus 5</td>
+ <td>5.1</td>
+ <td>LMY47I</td>
+ <td>48000</td>
+ <td>240</td>
+ <td>5</td>
+ <td>42</td>
+</tr>
+
+<tr>
+ <td>Nexus 5</td>
+ <td>6.0</td>
+ <td>MRA58K</td>
+ <td>48000</td>
+ <td>192</td>
+ <td>4</td>
+ <td>38</td>
+</tr>
+
+<tr>
+ <td>Nexus 9</td>
+ <td>5.0.0</td>
+ <td>LRX21L</td>
+ <td>48000</td>
+ <td>256</td>
+ <td>5.3</td>
+ <td>35</td>
+</tr>
+
+<tr>
+ <td>Nexus 9</td>
+ <td>5.0.1</td>
+ <td>LRX22C</td>
+ <td>48000</td>
+ <td>256</td>
+ <td>5.3</td>
+ <td>38</td>
+</tr>
+
+<tr>
+ <td>Nexus 9</td>
+ <td>5.1.1</td>
+ <td>LMY47X</td>
+ <td>48000</td>
+ <td>256</td>
+ <td>5.3</td>
+ <td>32</td>
+</tr>
+
+<tr>
+ <td>Nexus 9</td>
+ <td>6.0</td>
+ <td>MRA58K</td>
+ <td>48000</td>
+ <td>128</td>
+ <td>2.6</td>
+ <td>15</td>
+</tr>
+
+<tr>
+ <td>Nexus 6</td>
+ <td>5.0.1</td>
+ <td>LRX22C</td>
+ <td>48000</td>
+ <td>240</td>
+ <td>5</td>
+ <td>65</td>
+</tr>
+
+<tr>
+ <td>Nexus 6</td>
+ <td>5.1</td>
+ <td>LMY47I</td>
+ <td>48000</td>
+ <td>240</td>
+ <td>5</td>
+ <td>42</td>
+</tr>
+
+<tr>
+ <td>Nexus 6</td>
+ <td>6.0</td>
+ <td>MRA58K</td>
+ <td>48000</td>
+ <td>192</td>
+ <td>4</td>
+ <td>33</td>
+</tr>
+
+<tr>
+ <td>Nexus 5X</td>
+ <td>6.0</td>
+ <td>MDA89E</td>
+ <td>48000</td>
+ <td>192</td>
+ <td>4</td>
+ <td>18</td>
+</tr>
+
+<tr>
+ <td>Nexus 6P</td>
+ <td>6.0</td>
+ <td>MDA89D</td>
+ <td>48000</td>
+ <td>192</td>
+ <td>4</td>
+ <td>18</td>
+</tr>
+
+</table>
+
+<p></p>
+<p></p>
+
+<script type="text/javascript" src="https://www.google.com/jsapi?autoload={'modules':[{'name':'visualization','version':'1.1','packages':['bar']}]}"></script>
+ <script type="text/javascript">
+
+
+google.setOnLoadCallback(drawChart);
+ function drawChart() {
+ var data = google.visualization.arrayToDataTable([
+ ['Device', '2.3', '4.0', '4.1', '4.2', '4.3', '4.4', '5.0', '5.1', '6.0'],
+ ['Nexus One', 345, null, null, null, null, null, null, null, null,],
+ ['Nexus S', 260, 260, 210, null, null, null, null, null, null,],
+ ['Galaxy Nexus', null, 270, null, null, 130, null, null, null, null,],
+ ['Nexus 4', null, null, null, 195, null, null, null, 58, null,],
+ ['Nexus 10', null, null, null, null, null, null, 36, 35, null,],
+ ['Nexus 7 2013', null, null, null, null, 149, 85, 64, 55, 55,],
+ ['Nexus 5', null, null, null, null, null, 95, 47, 42, 38,],
+ ['Nexus 9', null, null, null, null, null, null, 38, 32, 15,],
+ ['Nexus 6', null, null, null, null, null, null, 65, 42, 33,],
+ ['Nexus 5X', null, null, null, null, null, null, null, null, 18,],
+ ['Nexus 6P', null, null, null, null, null, null, null, null, 18,]
+ ]);
+
+ var options = {
+ chart: {
+ title: 'Round Trip Audio Latency',
+ subtitle: 'Over headset, using native APIs',
+ },
+ bars: 'horizontal', // Required for Material Bar Charts.
+ bar: {groupWidth: '100%'},
+ hAxis: {
+ title: 'Milliseconds'
+ },
+ height: 800,
+ width: 600
+ };
+
+ var chart = new google.charts.Bar(document.getElementById('chart_div'));
+
+ chart.draw(data, google.charts.Bar.convertOptions(options));
+
+ }
+</script>
+
+ <div id="chart_div"></div>
+ <p></p>
+ <p class="img-caption">
+<strong>Figure 3.</strong> Round trip latencies.</p> \ No newline at end of file
diff --git a/en/devices/audio/loopback.html b/en/devices/audio/loopback.html
new file mode 100644
index 00000000..933972f1
--- /dev/null
+++ b/en/devices/audio/loopback.html
@@ -0,0 +1,58 @@
+page.title=Audio Loopback Dongle
+@jd:body
+
+<!--
+ Copyright 2014 The Android Open Source Project
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
+-->
+<div id="qv-wrapper">
+ <div id="qv">
+ <h2>In this document</h2>
+ <ol id="auto-toc">
+ </ol>
+ </div>
+</div>
+
+<p>
+The diagram and photo below show an audio loopback
+<a href="http://en.wikipedia.org/wiki/Dongle">dongle</a>
+for the
+<a href="http://en.wikipedia.org/wiki/Phone_connector_(audio)">headset connector</a>
+that we call the "Dr. Rick O'Rang audio loopback dongle."
+The Chrome hardware team designed this circuit and plug for functional testing;
+however it has many other uses too. The Android audio team uses it to measure
+<a href="latency_measure.html#measuringRoundTrip">round-trip audio latency</a>,
+via the Larsen effect (feedback loop).
+</p>
+
+<h2 id="loopbackCircuit">Circuit</h2>
+
+<img src="images/loopback_circuit.png" alt="circuit" id="figure1" />
+<p class="img-caption">
+ <strong>Figure 1.</strong> circuit diagram
+</p>
+
+<p>
+To ensure that the output signal does not overload the microphone input,
+we attenuate it by about 20 dB.
+The resistor loads tell the microphone polarity switch that
+the audio loopback dongle is a US/CTIA pinout Tip Ring Ring Sleeve (TRRS) plug.
+</p>
+
+<h2 id="loopbackAssembled">Assembled</h2>
+
+<img src="images/loopback_assembled.jpg" alt="fully assembled" id="figure2" />
+<p class="img-caption">
+ <strong>Figure 2.</strong> Assembled
+</p>
diff --git a/en/devices/audio/midi.html b/en/devices/audio/midi.html
new file mode 100644
index 00000000..94dbee21
--- /dev/null
+++ b/en/devices/audio/midi.html
@@ -0,0 +1,178 @@
+page.title=MIDI
+@jd:body
+
+<!--
+ Copyright 2015 The Android Open Source Project
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
+-->
+
+<div id="qv-wrapper">
+ <div id="qv">
+ <h2>In this document</h2>
+ <ol id="auto-toc">
+ </ol>
+ </div>
+</div>
+
+<p>
+<a href="http://en.wikipedia.org/wiki/MIDI">MIDI</a> (Musical Instrument Digital Interface)
+is a standard protocol for interconnecting computers with musical instruments, stage lighting,
+and other time-oriented media.
+</p>
+
+<p>
+Strictly speaking, MIDI is unrelated to audio. But since MIDI is commonly used with
+music, this article is placed in the audio section.
+</p>
+
+<h2 id="transports">Transports</h2>
+
+<p>
+The physical <a href="http://en.wikipedia.org/wiki/Transport_layer">transport layer</a>
+specified in the original MIDI 1.0 standard is a current loop with a
+<a href="http://en.wikipedia.org/wiki/DIN_connector">5-pin DIN</a> connector.
+</p>
+
+<p>
+Since MIDI 1.0, additional transports have been defined, including MIDI over USB
+and a proposed draft for MIDI over
+<a href="http://en.wikipedia.org/wiki/Bluetooth_low_energy">Bluetooth Low Energy</a> (BLE).
+</p>
+
+<h2 id="for-android">MIDI for Android</h2>
+
+<p>
+Android 3.1 and later support
+<a href="http://en.wikipedia.org/wiki/USB_On-The-Go">USB On-The-Go</a>,
+which permits an Android device to act as USB host to drive USB
+peripherals. The USB host mode APIs introduced in Android 3.1 permit
+developers to implement MIDI over USB at the application level, but until
+recently there have been no built-in platform APIs for MIDI.
+</p>
+
+<p>
+Beginning with the Android 6.0 (Marshmallow) release, device makers can enable optional MIDI support in the platform.
+Android directly supports USB, draft BLE, and virtual (inter-app) transports.
+Android indirectly supports MIDI 1.0 via an external adapter.
+</p>
+
+<p>
+For details on application programming with the new MIDI APIs, see the
+<a href="https://developer.android.com/reference/android/media/midi/package-summary.html"><code>android.media.midi</code></a>
+package.
+</p>
+
+<p>
+The remainder of this article discusses how an Android device maker can
+enable MIDI support in the platform.
+</p>
+
+<h2 id="transport">Enabling transports</h2>
+
+<p>
+The implementation depends on ALSA for USB host mode and USB peripheral mode transports.
+ALSA is not used for the BLE and virtual transports.
+</p>
+
+<h3 id="usb-host">USB host mode</h3>
+
+<p>
+To enable MIDI for USB host mode, first support USB host mode in general, and
+then enable <code>CONFIG_SND_RAWMIDI</code> and <code>CONFIG_SND_USB_MIDI</code> in your kernel configuration.
+See <a href="{@docRoot}devices/tech/config/kernel.html">Android Kernel Configuration</a>.
+</p>
+
+<p>
+The MIDI over USB transport is formally defined by the
+<a href="http://www.usb.org/developers/docs/devclass_docs/midi10.pdf">
+Universal Serial Bus Device Class Definition for MIDI Devices Release 1.0 Nov 1, 1999</a>
+standard published by the
+<a href="http://www.usb.org/">USB Implementers Forum, Inc</a>.
+</p>
+
+<h3 id="usb-peripheral">USB peripheral mode</h3>
+
+<p>
+To enable MIDI for USB peripheral mode, you may need to apply patches
+to your Linux kernel to integrate
+<code>drivers/usb/gadget/f_midi.c</code> into the USB gadget
+driver. As of this writing, these patches are available for Linux kernel version
+3.10. These patches have not yet been updated for
+<a href="http://en.wikipedia.org/wiki/Configfs">ConfigFs</a>
+(a new architecture
+for USB gadget drivers), nor are they merged at upstream
+<a href="http://kernel.org">kernel.org</a>.
+</p>
+
+<p>
+The patches are shown in commit order for the kernel tree at project <code>kernel/common</code>
+branch <code>android-3.10</code>:
+</p>
+<ol>
+<li><a href="https://android-review.googlesource.com/#/c/127450/">https://android-review.googlesource.com/#/c/127450/</a></li>
+<li><a href="https://android-review.googlesource.com/#/c/127452/">https://android-review.googlesource.com/#/c/127452/</a></li>
+<li><a href="https://android-review.googlesource.com/#/c/143714/">https://android-review.googlesource.com/#/c/143714/</a></li>
+</ol>
+
+<p>
+In addition, the end user must check the box for MIDI
+in the <em>Settings / Developer options / Networking / Select USB Configuration</em> dialog,
+or pull down from the top of the screen while attached
+to the USB host, select the "USB for ..." entry, and then choose <strong>MIDI</strong>.
+</p>
+
+<h3 id="ble">BLE</h3>
+
+<p>
+MIDI over BLE is always enabled, provided the device supports BLE.
+As this transport is in draft status, it is subject to change.
+</p>
+
+<h3 id="virtual">Virtual (inter-app)</h3>
+
+<p>
+The virtual (inter-app) transport is always enabled.
+</p>
+
+<h2 id="claim-feature">Claiming the feature</h2>
+
+<p>
+Applications can screen for the presence of MIDI support using the
+<code>android.software.midi</code> feature.
+</p>
+
+<p>
+To claim MIDI support, add this line to your <code>device.mk</code>:
+</p>
+<pre>
+PRODUCT_COPY_FILES += \
+frameworks/native/data/etc/android.software.midi.xml:system/etc/permissions/android.software.midi.xml
+</pre>
+
+<p>
+See the
+<a href="{@docRoot}compatibility/android-cdd.pdf">Android Compatibility Definition Document (CDD)</a>
+for information
+on requirements to claim the feature.
+</p>
+
+<h2 id="hostDebugging">Debugging while in host mode</h2>
+
+<p>
+While in USB host mode, Android Debug Bridge (adb) debugging over USB is unavailable.
+See section <a href="http://developer.android.com/tools/help/adb.html#wireless">Wireless usage</a>
+of
+<a href="http://developer.android.com/tools/help/adb.html">Android Debug Bridge</a>
+for an alternative.
+</p>
diff --git a/en/devices/audio/midi_arch.html b/en/devices/audio/midi_arch.html
new file mode 100644
index 00000000..816449d6
--- /dev/null
+++ b/en/devices/audio/midi_arch.html
@@ -0,0 +1,231 @@
+page.title=MIDI Architecture
+@jd:body
+
+<!--
+ Copyright 2015 The Android Open Source Project
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
+-->
+
+<div id="qv-wrapper">
+ <div id="qv">
+ <h2>In this document</h2>
+ <ol id="auto-toc">
+ </ol>
+ </div>
+</div>
+
+<p>
+This article describes the generic MIDI architecture, independent of
+any platform implementation, API, or platform-specific features.
+</p>
+
+<h2 id="keyConcepts">Key concepts</h2>
+
+<h3 id="events">Events</h3>
+
+<p>
+The MIDI protocol is designed for event-based communication.
+An <a href="https://en.wikipedia.org/wiki/Event_(computing)">event</a>
+is an indication that something happened or will happen at a specified
+time. MIDI events are represented by <em>messages</em>, atomic
+bundles of information.
+</p>
+
+<h3 id="transport">Transport</h3>
+
+<p>
+MIDI messages are encoded and delivered via a
+<a href="https://en.wikipedia.org/wiki/Transport_layer">transport layer</a>,
+abbreviated <em>transport</em>, which sends the raw MIDI data
+to the recipient who then decodes the data into messages.
+</p>
+
+<p>
+Hardware-based MIDI transports include:
+</p>
+<ul>
+<li>MIDI 1.0 current loop with
+<a href="https://en.wikipedia.org/wiki/DIN_connector">5-pin DIN</a> connector</li>
+<li>USB</li>
+<li>Bluetooth Low Energy (BLE)</li>
+</ul>
+
+<h3 id="messageRepresentation">Message representation</h3>
+
+<p>
+A MIDI transport specification describes how to convey messages.
+Although the packaging of messages is transport-specific at the
+lowest level, at a higher level applications can consider a
+time-ordered sequence of messages to be a demarcated
+<a href="https://en.wikipedia.org/wiki/Bytestream">byte stream</a>.
+This is possible because each message contains
+enough information to determine the total length of the message,
+provided the start of the message boundary is known.
+</p>
+
+<p>
+Most MIDI messages are short (one to three bytes), yet there is the
+capability for longer messages via <em>SysEx</em>.
+</p>
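+
+<p>
+As an illustration (ours, not specification text), the length of a channel or
+system common message can be derived from its status byte as sketched below;
+System Exclusive is variable length and terminated by the End of Exclusive
+byte (0xF7).
+</p>
+
+<pre>
+// Returns the total length in bytes of the message that begins with the given
+// status byte, -1 for SysEx (variable length), or 0 if the byte is not a
+// status byte.
+int midiMessageLength(unsigned char status) {
+    switch (status &amp; 0xF0) {
+        case 0x80:   // note off
+        case 0x90:   // note on
+        case 0xA0:   // polyphonic key pressure
+        case 0xB0:   // control change
+        case 0xE0:   // pitch bend
+            return 3;
+        case 0xC0:   // program change
+        case 0xD0:   // channel pressure
+            return 2;
+        case 0xF0:   // system messages
+            switch (status) {
+                case 0xF0: return -1;  // SysEx start: variable, ends with 0xF7
+                case 0xF1: return 2;   // MIDI time code quarter frame
+                case 0xF2: return 3;   // song position pointer
+                case 0xF3: return 2;   // song select
+                default:   return 1;   // tune request, real-time, etc.
+            }
+        default:
+            return 0;    // data byte, not a status byte
+    }
+}
+</pre>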
+
+<h3 id="timestamps">Timestamps</h3>
+
+<p>
+A <a href="https://en.wikipedia.org/wiki/Timestamp">timestamp</a>
+is an optional label attached to a message at origination or upon receipt,
+depending on the transport. The timestamp is expressed in time units
+such as seconds or
+<a href="https://en.wikipedia.org/wiki/Jiffy_(time)">ticks</a>.
+</p>
+
+<p>
+In the absence of an explicit timestamp, the system must substitute
+the timestamp of the immediately preceding message or the current
+time. The accuracy of these timestamps, whether explicit or implicit,
+is an important aspect of the reliability of a MIDI-based system.
+</p>
+
+<p>
+Timestamps are not part of the MIDI 1.0 protocol. They are often added
+as part of a platform-specific API. The BLE transport has timestamps
+to indicate the timing of the multiple individual messages sent within
+one BLE packet.
+</p>
+
+<h3 id="devices">Devices</h3>
+
+<p>
+A <a href="https://en.wikipedia.org/wiki/Peripheral">peripheral</a>
+provides input/output (I/O) capability for a computer. The terms
+<em>MIDI peripheral</em> and <em>MIDI device</em> commonly
+refer to any hardware or software module that supports the MIDI protocol.
+Within this document, <em>MIDI peripheral</em> refers to the
+physical entity and <em>MIDI device</em> describes the module that
+actually implements MIDI.
+</p>
+
+<h3 id="ports">Ports</h3>
+
+<p>
+A <a href="https://en.wikipedia.org/wiki/Computer_port_(hardware)">port</a>
+is an interface point between computers and peripherals.
+</p>
+
+<p>
+MIDI 1.0 uses a female 5-pin DIN socket as the port.
+Each port is either <em>OUT</em> (source of MIDI data), <em>IN</em> (sink for MIDI data),
+or <em>THRU</em> (meaning an <em>IN</em> which is directly routed to an <em>OUT</em>).
+</p>
+
+<p>
+Other transports such as USB and BLE extend the
+<a href="https://en.wikipedia.org/wiki/Computer_port_(software)">port concept</a>.
+</p>
+
+<p>
+A MIDI device has at least one <em>OUT</em> port, <em>IN</em> port, or both.
+</p>
+
+<p>
+The MIDI device supplies stream(s) of messages originating at each <em>OUT</em> port,
+and receives stream(s) of messages arriving at each <em>IN</em> port.
+The terms <em>IN</em> and <em>OUT</em> are of course relative to one port;
+from the perspective of the other port the reverse term applies.
+</p>
+
+<h3 id="connection">Connection</h3>
+
+<p>
+In the MIDI 1.0 transport, an <em>OUT</em> port connects to at most
+one <em>IN</em> or <em>THRU</em> port due to the nature of the current loop.
+In USB and BLE transports, the same is true at the lowest layer, though
+an implementation may re-condition the message stream so that it can
+be broadcast to multiple <em>IN</em> ports.
+</p>
+
+<h3 id="cable">Cables</h3>
+
+<p>
+A MIDI 1.0 <a href="https://en.wikipedia.org/wiki/Cable">cable</a> is the
+physical bundle of wires that connects an <em>OUT</em> port to an <em>IN</em> or <em>THRU</em> port.
+The cable carries data only.
+</p>
+
+<p class="note">
+<strong>Note:</strong>
+There are non-standard modifications to MIDI that supply power over the
+two unused pins. This is called <em>phantom power</em>.
+</p>
+
+<p>
+A <a href="https://en.wikipedia.org/wiki/USB#Cabling">USB cable</a>
+is similar, except there is a wide variety of connector types,
+and the <em>IN</em>/<em>OUT</em>/<em>THRU</em> concept is replaced by the host/peripheral role.
+</p>
+
+<p>
+When operating in USB host mode, the host device supplies power to the
+MIDI peripheral. Most small MIDI peripherals take one USB unit load (100
+mA) or less. However some larger peripherals, or peripherals with audio
+output or lights, require more power than the host device can supply.
+If you experience problems, try another MIDI peripheral or a powered
+USB hub.
+</p>
+
+<h3 id="channel">Channel</h3>
+
+<p>
+Each MIDI message stream is multiplexed among 16 <em>channels</em>.
+Most messages are directed at a specific channel,
+but there are message types that aren't channel-specific.
+Conventionally the channels are numbered one to 16, though
+represented by channel values of zero to 15.
+</p>
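+
+<p>
+For example (our illustration, not specification text), a channel message
+carries its channel in the low nibble of the status byte:
+</p>
+
+<pre>
+// Convert the zero-based channel value in a status byte to the conventional
+// one-based channel number.
+int midiChannelOneBased(unsigned char status) {
+    return (status &amp; 0x0F) + 1;
+}
+</pre>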
+
+<p>
+If the application needs more than 16 channels or a higher throughput
+than one message stream can support, then multiple ports
+must be used.
+</p>
+
+<p>
+In MIDI 1.0, this is accomplished by multiple cables connecting pairs of ports.
+</p>
+
+<p>
+In the MIDI over USB transport, a single USB endpoint can support multiple
+ports, each identified by a <em>cable number</em> [sic].
+According to the USB MIDI specification,
+the <em>cable number</em> identifies the virtual port within the endpoint.
+</p>
+
+<p class="note">
+<strong>Note:</strong>
+<em>port number</em> would have been a more accurate term,
+given that it identifies a port.
+</p>
+
+<p>
+Thus a single USB physical cable can carry more than one set of 16 channels.
+</p>
+
+<h2 id="platformImplementation">Platform implementation</h2>
+
+<p>
+As noted in the introduction, these generic MIDI concepts apply to all
+implementations. For the interpretation of the concepts on the Android
+platform, see the
+<a href="http://developer.android.com/reference/android/media/midi/package-summary.html">
+Android MIDI User Guide for <code>android.media.midi</code></a>.
+</p>
diff --git a/en/devices/audio/midi_test.html b/en/devices/audio/midi_test.html
new file mode 100644
index 00000000..e5188b9e
--- /dev/null
+++ b/en/devices/audio/midi_test.html
@@ -0,0 +1,267 @@
+page.title=MIDI Test Procedure
+@jd:body
+
+<!--
+ Copyright 2015 The Android Open Source Project
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
+-->
+
+<div id="qv-wrapper">
+ <div id="qv">
+ <h2>In this document</h2>
+ <ol id="auto-toc">
+ </ol>
+ </div>
+</div>
+
+<p>These tests may be used to validate the MIDI feature on Android devices.
+Successful execution of these tests is a prerequisite to
+<a href="midi.html#claim-feature">claim the MIDI feature</a>.
+</p>
+
+<h2 id="preparation">Preparation</h2>
+
+
+<h3 id="hardware">Hardware</h3>
+
+<p>
+The following hardware is needed for the tests.
+</p>
+
+<ul>
+ <li> MIDI keyboard with USB connector, e.g. the <a href="http://www.akaipro.com/product/lpk25">Akai LPK25</a></li>
+ <li> MIDI keyboard with Bluetooth Low Energy (BLE) support, e.g. the <a href="http://miselu.com/">Miselu C.24</a></li>
+ <li> USB cables</li>
+ <li> USB On-The-Go (OTG) adapter to convert a female USB-A to male micro-USB or USB-C</li>
+ <li> Android device running Android 6.0 Marshmallow or later release</li>
+ <li> Optional: desktop computer</li>
+</ul>
+
+<h3 id="apps">Apps</h3>
+
+<p>
+Several apps are used by this test procedure.
+The apps are available in source code on GitHub project
+<a href="https://github.com/philburk/android-midisuite">android-midisuite</a>,
+and via <em>Google Play</em>&trade; at links in the following table.
+</p>
+
+<table>
+<tr>
+ <th>Source code</th>
+ <th>Google&nbsp;Play&trade;</th>
+ <th>Description</th>
+</tr>
+<tr>
+ <td><a href="https://github.com/philburk/android-midisuite/tree/master/MidiScope">MidiScope</a> or
+ <a href="https://github.com/googlesamples/android-MidiScope">MidiScope</a></td>
+ <td><a href="https://play.google.com/store/apps/details?id=com.mobileer.example.midiscope">MIDI Scope</a></td>
+ <td>displays MIDI messages on-screen</td>
+</tr>
+<tr>
+ <td><a href="https://github.com/philburk/android-midisuite/tree/master/MidiKeyboard">MidiKeyboard</a></td>
+ <td><a href="https://play.google.com/store/apps/details?id=com.mobileer.midikeyboard">MIDI Keyboard</a></td>
+ <td>sends MIDI messages by pressing an on-screen music keyboard</td>
+</tr>
+<tr>
+ <td><a href="https://github.com/philburk/android-midisuite/tree/master/MidiSynthExample">MidiSynthExample</a> or
+ <br /><a href="https://github.com/googlesamples/android-MidiSynth">MidiSynth</a></td>
+ <td><a href="https://play.google.com/store/apps/details?id=com.mobileer.midisynthexample">MIDI Synth Ex</a></td>
+ <td>simple MIDI synthesizer that uses sawtooth oscillators</td>
+</tr>
+<tr>
+ <td><a href="https://github.com/philburk/android-midisuite/tree/master/MidiBtlePairing">MidiBtlePairing</a></td>
+ <td><a href="https://play.google.com/store/apps/details?id=com.mobileer.example.midibtlepairing">MIDI BLE Connect</a></td>
+ <td>pairs an Android device with a BLE peripheral</td>
+</tr>
+<tr>
+ <td><a href="https://github.com/philburk/android-midisuite/tree/master/MidiTools">MidiTools</a></td>
+ <td></td>
+ <td>library dependency of the above apps</td>
+</tr>
+</table>
+
+<p>
+If you choose to work from source code rather than install via <em>Google Play</em>&trade;,
+first build the app using the supplied <em>Android.mk</em>.
+Then install the app using
+<a href="http://developer.android.com/tools/help/adb.html">Android Debug Bridge</a> (ADB).
+For example, to install the <em>MidiScope</em> app:</p>
+
+<ol>
+ <li> Use a workstation with ADB installed.</li>
+ <li> Connect a USB cable from the workstation to the Android device.</li>
+ <li> You may need to allow the USB connection on the Android device; see <a href="midi.html#usb-peripheral">USB peripheral mode</a></li>
+ <li> On the workstation, enter:</li>
+</ol>
+
+<pre>
+cd <em>&lt;this-folder&gt;</em>
+adb install -r MidiScope.apk
+</pre>
+
+
+<h2 id="virtual_synth_tests">Virtual synth tests</h2>
+
+
+<p>Note that a MIDI input port can have only one connection. So if another app is
+already using an input port, that port will not be available. If you cannot connect to
+an input port then try closing other apps.</p>
+
+<p>Hardware needed: Android device under test</p>
+
+<h3 id="simple_connection">Simple connection</h3>
+
+
+<p>Apps needed: <em>MidiKeyboard</em>, <em>MidiSynthExample</em></p>
+
+<p>This tests device enumeration, virtual devices, port connections, and message
+sending.</p>
+
+<ol>
+ <li> Adjust volume on Android device to about halfway.</li>
+ <li> Orient phone in landscape mode.</li>
+ <li> Launch <em>MidiKeyboard</em> app.</li>
+ <li> Select <strong>SynthExample</strong> from the spinner menu.</li>
+ <li> Play keys. You should hear notes being played in the <em>SynthExample</em> app.</li>
+ <li> Exit the application by pressing the <strong>Back</strong> button so that the port will be
+closed.</li>
+</ol>
+
+<h2 id="host_mode">USB test: host mode</h2>
+
+
+<p>Hardware needed: USB MIDI keyboard, USB cable, OTG adapter</p>
+
+<p>Repeat these tests several times. We have seen the USB stack crash hard on some
+prototype devices if devices were plugged in and unplugged a few times.</p>
+
+<h3 id="keyboard_already_plugged_in">Keyboard already plugged in</h3>
+
+
+<p>Apps needed: <em>MidiSynthExample</em> or <em>MidiScope</em></p>
+
+<p>This tests USB MIDI in host mode.</p>
+
+<ol>
+ <li> Adjust volume on Android device to about halfway.</li>
+ <li> Plug in USB keyboard using the OTG adapter.</li>
+ <li> Launch <em>SynthExample</em> app or the <em>MidiScope</em> app.</li>
+ <li> From the menu select the USB keyboard. It will display the brand.</li>
+ <li> Play notes on the keyboard. If you ran <em>SynthExample</em> then you should hear notes
+being played on the phone. If you ran <em>MidiScope</em> then you should see <em>NoteOn</em> and
+<em>NoteOff</em> messages on-screen.</li>
+ <li> Unplug the keyboard. The <em>Sender for Synth</em> menu should display <em>- - - - -</em>.</li>
+ <li> Exit the application by pressing the <strong>Back</strong> button.</li>
+</ol>
+
+<h3 id="hot_plug_usb_keyboard">Hot-plug USB keyboard</h3>
+
+
+<p>Apps needed: <em>MidiSynthExample</em> or <em>MidiScope</em></p>
+
+<p>This tests USB MIDI in host mode.</p>
+
+<ol>
+ <li> Adjust volume on Android device to about halfway.</li>
+ <li> Make sure there is not a USB MIDI keyboard plugged in.</li>
+ <li> Launch <em>SynthExample</em> app.</li>
+ <li> At middle, next to <em>Sender for Synth</em>, look in menu. You should not see the USB
+keyboard listed.</li>
+ <li> Plug in USB keyboard using the OTG adapter.</li>
+ <li> At middle, next to <em>Sender for Synth</em>, select the USB keyboard. It will display
+the brand.</li>
+ <li> Play notes on the keyboard. You should hear notes being played on the phone.</li>
+ <li> At middle, next to <em>Sender for Synth</em>, select <strong>- - - - -</strong>.</li>
+ <li> Play notes on the keyboard. You should hear nothing.</li>
+ <li> At middle, next to <em>Sender for Synth</em>, select the USB keyboard. It will display
+the brand.</li>
+ <li> Play notes on the keyboard. You should hear notes being played on the phone.</li>
+ <li> Unplug the synthesizer. The <em>Sender for Synth</em> menu should display <em>- - - - -</em>.</li>
+ <li> Exit the application by pressing the <strong>Back</strong> button.</li>
+</ol>
+
+<h2 id="peripheral_mode">USB test: peripheral mode</h2>
+
+
+<p>Hardware needed: USB cable, OTG adapter</p>
+
+<h3 id="android_to_android">Android-to-Android</h3>
+
+
+<p>Apps needed: <em>MidiKeyboard</em> on Android device under test, <em>MidiScope</em> on another
+Android device.</p>
+
+<p>Use an Android device as a peripheral controller for another Android device. To help test
+this mode, use a second Android device running in host mode. Note that
+you could modify the test to work with a desktop computer running Digital Audio Workstation (DAW)
+software such as
+GarageBand.</p>
+
+<ol>
+ <li> Connect the USB cable to the Android device under test (Android device <strong>A</strong>).</li>
+ <li> Use an OTG adapter to connect the other end of the cable to a second Android
+device <strong>B</strong> that operates in host mode.</li>
+ <li> On Android device A:
+ <ol>
+ <li> Drag finger down from top of screen.</li>
+ <li> Select <strong>USB for Charging</strong> icon.</li>
+ <li> Select <strong>MIDI</strong>.</li>
+ <li> Launch <em>MidiKeyboard</em> app.</li>
+ <li> Select <strong>Android USB Peripheral Port</strong> from <em>Receiver for Keys</em> menu at top.</li>
+ </ol>
+ </li>
+ <li> On Android device B:
+ <ol>
+ <li> Launch <em>MidiScope</em> app.</li>
+ <li> Select the other Android device as the source.</li>
+ </ol>
+ </li>
+ <li> On Android device A:
+ <ol>
+ <li> Play notes on the keyboard and look for <em>NoteOn</em> and <em>NoteOff</em> on Android device B.</li>
+ </ol>
+ </li>
+ </ol>
+
+<h2 id="bluetooth_le_test">BLE test</h2>
+
+
+<p>Hardware needed: MIDI keyboard supporting BLE</p>
+
+<h3 id="basic_pairing_and_playing">Basic pairing and playing</h3>
+
+
+<p>Apps needed: <em>MidiBtlePairing</em>, <em>MidiSynthExample</em></p>
+
+<p>Test a keyboard connected to Android over BLE.</p>
+
+<ol>
+ <li> Reboot the Android device.</li>
+ <li> Power on the BLE keyboard.<br />
+ (The Miselu C.24 keyboard is powered on by pushing the button near the back so
+that it pops open. The power button on the C.24 pulses blue when in pairing
+mode.)</li>
+ <li> Launch the <em>MidiBtlePairing</em> app. It has a <em>MIDI+BTLE</em> icon.</li>
+ <li> Press the <strong>Bluetooth Scan</strong> button.</li>
+ <li> Select desired BLE peripheral.</li>
+ <li> The app should return to the main page, and you should see the peripheral listed. If
+you are using a C.24, then you will notice that the light should turn green on
+the C.24 to indicate paired mode.</li>
+ <li> Exit the app by pressing the <strong>Home</strong> button, not the <strong>Back</strong> button.</li>
+  <li> Launch the <em>SynthExample</em> app.</li>
+ <li> Select the BLE keyboard as the sender from the menu.</li>
+ <li> You should be able to press keys on the BLE keyboard and hear notes on
+Android.</li>
+</ol>
diff --git a/en/devices/audio/src.html b/en/devices/audio/src.html
new file mode 100644
index 00000000..ab70fee5
--- /dev/null
+++ b/en/devices/audio/src.html
@@ -0,0 +1,118 @@
+page.title=Sample Rate Conversion
+@jd:body
+
+<!--
+ Copyright 2013 The Android Open Source Project
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
+-->
+<div id="qv-wrapper">
+ <div id="qv">
+ <h2>In this document</h2>
+ <ol id="auto-toc">
+ </ol>
+ </div>
+</div>
+
+<h2 id="srcIntro">Introduction</h2>
+
+<p>
+See the Wikipedia article
+<a href="http://en.wikipedia.org/wiki/Resampling_(audio)">Resampling (audio)</a>
+for a generic definition of sample rate conversion, also known as "resampling."
+The remainder of this article describes resampling within Android.
+See <a href="terminology.html#srcTerms">Sample Rate Conversion</a> for related terminology.
+</p>
+
+<p>
+Sample rate conversion is the process of converting a
+stream of discrete samples at one sample rate to another stream at a
+different sample rate. A sample rate converter, or resampler, is a module
+that implements sample rate conversion. With respect to the resampler,
+the original stream is called the source signal, and the resampled stream is
+the sink signal.
+</p>
+
+<p>
+Resamplers are used in several places in Android.
+For example, an MP3 file may be encoded at a 44.1 kHz sample rate but
+needs to be played back on an Android device that supports 48 kHz audio
+internally. In that case, a resampler upsamples the decoded MP3
+audio from the 44.1 kHz source sample rate to the 48 kHz sink sample rate
+used within the Android device.
+</p>
+
+<p>
+The characteristics of a resampler can be expressed using metrics, including:
+</p>
+
+<ul>
+<li>degree of preservation of the overall amplitude of the signal</li>
+<li>degree of preservation of the frequency bandwidth of the signal,
+ subject to limitations of the sink sample rate</li>
+<li>overall latency through the resampler</li>
+<li>consistent phase and group delay with respect to frequency</li>
+<li>computational complexity, expressed in CPU cycles or power draw</li>
+<li>permitted ratios of source and sink sample rates</li>
+<li>ability to dynamically change sample rate ratios</li>
+<li>which digital audio sample formats are supported</li>
+</ul>
+
+<p>
+The ideal resampler would exactly preserve the source signal's amplitude
+and frequency bandwidth (subject to limitations of the sink
+sample rate), have minimal and consistent delay, have minimal
+computational complexity, permit arbitrary and dynamic conversion ratios,
+and support all common digital audio sample formats.
+In practice, ideal resamplers do not exist, and actual resamplers are
+a compromise among these characteristics.
+For example, the goals of ideal quality conflict with short delay and low complexity.
+</p>
+
+<p>
+Android includes a variety of audio resamplers, so that appropriate
+compromises can be made depending on the application use case and load.
+Section <a href="#srcResamplers">Resampler implementations</a>
+below lists the available resamplers, summarizes their characteristics,
+and identifies where they should typically be used.
+</p>
+
+<h2 id="srcResamplers">Resampler implementations</h2>
+
+<p>
+Available resampler implementations change frequently,
+and may be customized by OEMs.
+As of Android 4.4, the default resamplers, in descending order of
+signal distortion and ascending order of computational complexity,
+are:
+</p>
+
+<ul>
+<li>linear</li>
+<li>cubic</li>
+<li>sinc with original coefficients</li>
+<li>sinc with revised coefficients</li>
+</ul>
+
+<p>
+In general, the sinc resamplers are more appropriate for higher-quality
+music playback, and the other resamplers should be reserved for cases
+where quality is less important (an example might be "key clicks" or similar).
+</p>
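+
+<p>
+As a rough illustration of the quality tradeoff, the following sketch shows the simplest
+approach in the list above, linear interpolation, for mono 16-bit PCM. This is an
+illustrative example only; it is not the AudioFlinger implementation, and the class and
+method names here are hypothetical.
+</p>
+
+<pre>
+// Hypothetical sketch of linear-interpolation resampling (the lowest-quality,
+// lowest-complexity resampler listed above).
+public class LinearResampler {
+    /** Resamples mono 16-bit PCM from srcRate to sinkRate by linear interpolation. */
+    public static short[] resample(short[] source, int srcRate, int sinkRate) {
+        int sinkLength = (int) ((long) source.length * sinkRate / srcRate);
+        short[] sink = new short[sinkLength];
+        double step = (double) srcRate / sinkRate;  // source frames advanced per sink frame
+        for (int i = 0; i &lt; sinkLength; i++) {
+            double pos = i * step;                  // fractional position in the source
+            int index = (int) pos;
+            double frac = pos - index;
+            short a = source[Math.min(index, source.length - 1)];
+            short b = source[Math.min(index + 1, source.length - 1)];
+            sink[i] = (short) Math.round(a + frac * (b - a));
+        }
+        return sink;
+    }
+}
+</pre>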
+
+<p>
+The specific resampler implementation selected depends on
+the use case, load, and the value of the system property
+<code>af.resampler.quality</code>. For details,
+consult the audio resampler source code in AudioFlinger.
+</p>
diff --git a/en/devices/audio/terminology.html b/en/devices/audio/terminology.html
new file mode 100644
index 00000000..ae07d0d4
--- /dev/null
+++ b/en/devices/audio/terminology.html
@@ -0,0 +1,803 @@
+page.title=Audio Terminology
+@jd:body
+
+<!--
+ Copyright 2015 The Android Open Source Project
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
+-->
+<div id="qv-wrapper">
+ <div id="qv">
+ <h2>In this document</h2>
+ <ol id="auto-toc">
+ </ol>
+ </div>
+</div>
+
+<p>
+This glossary of audio-related terminology includes widely-used generic terms
+and Android-specific terms.
+</p>
+
+<h2 id="genericTerm">Generic Terms</h2>
+
+<p>
+Generic audio-related terms have conventional meanings.
+</p>
+
+<h3 id="digitalAudioTerms">Digital Audio</h3>
+<p>
+Digital audio terms relate to handling sound using audio signals encoded
+in digital form. For details, refer to
+<a href="http://en.wikipedia.org/wiki/Digital_audio">Digital Audio</a>.
+</p>
+
+<dl>
+
+<dt>acoustics</dt>
+<dd>
+Study of the mechanical properties of sound, such as how the physical
+placement of transducers (speakers, microphones, etc.) on a device affects
+perceived audio quality.
+</dd>
+
+<dt>attenuation</dt>
+<dd>
+Multiplicative factor less than or equal to 1.0, applied to an audio signal
+to decrease the signal level. Compare to <em>gain</em>.
+</dd>
+
+<dt>audiophile</dt>
+<dd>
+Person concerned with a superior music reproduction experience, especially
+willing to make substantial tradeoffs (expense, component size, room design,
+etc.) for sound quality. For details, refer to
+<a href="http://en.wikipedia.org/wiki/Audiophile">audiophile</a>.
+</dd>
+
+<dt>bits per sample or bit depth</dt>
+<dd>
+Number of bits of information per sample.
+</dd>
+
+<dt>channel</dt>
+<dd>
+Single stream of audio information, usually corresponding to one location of
+recording or playback.
+</dd>
+
+<dt>downmixing</dt>
+<dd>
+Decrease the number of channels, such as from stereo to mono or from 5.1 to
+stereo. Accomplished by dropping channels, mixing channels, or more advanced
+signal processing. Simple mixing without attenuation or limiting has the
+potential for overflow and clipping. Compare to <em>upmixing</em>.
+</dd>
+
+<dt>DSD</dt>
+<dd>
+Direct Stream Digital. Proprietary audio encoding based on
+<a href="http://en.wikipedia.org/wiki/Pulse-density_modulation">pulse-density
+modulation</a>. While Pulse Code Modulation (PCM) encodes a waveform as a
+sequence of individual audio samples of multiple bits, DSD encodes a waveform as
+a sequence of bits at a very high sample rate (without the concept of samples).
+Both PCM and DSD represent multiple channels by independent sequences. DSD is
+better suited to content distribution than as an internal representation for
+processing as it can be difficult to apply traditional digital signal processing
+(DSP) algorithms to DSD. DSD is used in <a href="http://en.wikipedia.org/wiki/Super_Audio_CD">Super Audio CD (SACD)</a> and in DSD over PCM (DoP) for USB. For details, refer
+to <a href="http://en.wikipedia.org/wiki/Direct_Stream_Digital">Direct Stream
+Digital</a>.
+</dd>
+
+<dt>duck</dt>
+<dd>
+Temporarily reduce the volume of a stream when another stream becomes active.
+For example, if music is playing when a notification arrives, the music ducks
+while the notification plays. Compare to <em>mute</em>.
+</dd>
+
+<dt>FIFO</dt>
+<dd>
+First In, First Out. Hardware module or software data structure that implements
+<a href="http://en.wikipedia.org/wiki/FIFO">First In, First Out</a>
+queueing of data. In an audio context, the data stored in the queue are
+typically audio frames. FIFO can be implemented by a
+<a href="http://en.wikipedia.org/wiki/Circular_buffer">circular buffer</a>.
+</dd>
+
+<dt>frame</dt>
+<dd>
+Set of samples, one per channel, at a point in time.
+</dd>
+
+<dt>frames per buffer</dt>
+<dd>
+Number of frames handed from one module to the next at one time. The audio HAL
+interface uses the concept of frames per buffer.
+</dd>
+
+<dt>gain</dt>
+<dd>
+Multiplicative factor greater than or equal to 1.0, applied to an audio signal
+to increase the signal level. Compare to <em>attenuation</em>.
+</dd>
+
+<dt>HD audio</dt>
+<dd>
+High-Definition audio. Synonym for high-resolution audio (but different from
+Intel High Definition Audio).
+</dd>
+
+<dt>Hz</dt>
+<dd>
+Units for sample rate or frame rate.
+</dd>
+
+<dt>high-resolution audio</dt>
+<dd>
+Representation with greater bit-depth and sample rate than CDs (stereo 16-bit
+PCM at 44.1 kHz) and without lossy data compression. Equivalent to HD audio.
+For details, refer to
+<a href="http://en.wikipedia.org/wiki/High-resolution_audio">high-resolution
+audio</a>.
+</dd>
+
+<dt>latency</dt>
+<dd>
+Time delay as a signal passes through a system.
+</dd>
+
+<dt>lossless</dt>
+<dd>
+A <a href="http://en.wikipedia.org/wiki/Lossless_compression">lossless data
+compression algorithm</a> that preserves bit accuracy across encoding and
+decoding, where the result of decoding previously encoded data is equivalent
+to the original data. Examples of lossless audio content distribution formats
+include <a href="http://en.wikipedia.org/wiki/Compact_disc">CDs</a>, PCM within
+<a href="http://en.wikipedia.org/wiki/WAV">WAV</a>, and
+<a href="http://en.wikipedia.org/wiki/FLAC">FLAC</a>.
+The authoring process may reduce the bit depth or sample rate from that of the
+<a href="http://en.wikipedia.org/wiki/Audio_mastering">masters</a>; distribution
+formats that preserve the resolution and bit accuracy of masters are the subject
+of high-resolution audio.
+</dd>
+
+<dt>lossy</dt>
+<dd>
+A <a href="http://en.wikipedia.org/wiki/Lossy_compression">lossy data
+compression algorithm</a> that attempts to preserve the most important features
+of media across encoding and decoding where the result of decoding previously
+encoded data is perceptually similar to the original data but not identical.
+Examples of lossy audio compression algorithms include MP3 and AAC. As analog
+values are from a continuous domain and digital values are discrete, ADC and DAC
+are lossy conversions with respect to amplitude. See also <em>transparency</em>.
+</dd>
+
+<dt>mono</dt>
+<dd>
+One channel.
+</dd>
+
+<dt>multichannel</dt>
+<dd>
+See <em>surround sound</em>. In strict terms, <em>stereo</em> is more than one
+channel and could be considered multichannel; however, such usage is confusing
+and thus avoided.
+</dd>
+
+<dt>mute</dt>
+<dd>
+Temporarily force volume to be zero, independent from the usual volume controls.
+</dd>
+
+<dt>overrun</dt>
+<dd>
+Audible <a href="http://en.wikipedia.org/wiki/Glitch">glitch</a> caused by
+failure to accept supplied data in sufficient time. For details, refer to
+<a href="http://en.wikipedia.org/wiki/Buffer_underrun">buffer underrun</a>.
+Compare to <em>underrun</em>.
+</dd>
+
+<dt>panning</dt>
+<dd>
+Direct a signal to a desired position within a stereo or multichannel field.
+</dd>
+
+<dt>PCM</dt>
+<dd>
+Pulse Code Modulation. Most common low-level encoding of digital audio. The
+audio signal is sampled at a regular interval, called the sample rate, then
+quantized to discrete values within a particular range depending on the bit
+depth. For example, for 16-bit PCM the sample values are integers between
+-32768 and +32767.
+</dd>
+
+<dt>ramp</dt>
+<dd>
+Gradually increase or decrease the level of a particular audio parameter, such
+as the volume or the strength of an effect. A volume ramp is commonly applied
+when pausing and resuming music to avoid a hard audible transition.
+</dd>
+
+<dt>sample</dt>
+<dd>
+Number representing the audio value for a single channel at a point in time.
+</dd>
+
+<dt>sample rate or frame rate</dt>
+<dd>
+Number of frames per second. While <em>frame rate</em> is more accurate,
+<em>sample rate</em> is conventionally used to mean frame rate.
+</dd>
+
+<dt>sonification</dt>
+<dd>
+Use of sound to express feedback or information, such as touch sounds and
+keyboard sounds.
+</dd>
+
+<dt>stereo</dt>
+<dd>
+Two channels.
+</dd>
+
+<dt>stereo widening</dt>
+<dd>
+Effect applied to a stereo signal to make another stereo signal that sounds
+fuller and richer. The effect can also be applied to a mono signal, where it is
+a type of upmixing.
+</dd>
+
+<dt>surround sound</dt>
+<dd>
+Techniques for increasing the ability of a listener to perceive sound position
+beyond stereo left and right.
+</dd>
+
+<dt>transparency</dt>
+<dd>
+Ideal result of lossy data compression. Lossy data conversion is transparent if
+it is perceptually indistinguishable from the original by a human subject. For
+details, refer to
+<a href="http://en.wikipedia.org/wiki/Transparency_%28data_compression%29">Transparency</a>.
+
+</dd>
+
+<dt>underrun</dt>
+<dd>
+Audible <a href="http://en.wikipedia.org/wiki/Glitch">glitch</a> caused by
+failure to supply needed data in sufficient time. For details, refer to
+<a href="http://en.wikipedia.org/wiki/Buffer_underrun">buffer underrun</a>.
+Compare to <em>overrun</em>.
+</dd>
+
+<dt>upmixing</dt>
+<dd>
+Increase the number of channels, such as from mono to stereo or from stereo to
+surround sound. Accomplished by duplication, panning, or more advanced signal
+processing. Compare to <em>downmixing</em>.
+</dd>
+
+<dt>virtualizer</dt>
+<dd>
+Effect that attempts to spatialize audio channels, such as trying to simulate
+more speakers or give the illusion that sound sources have position.
+</dd>
+
+<dt>volume</dt>
+<dd>
+Loudness, the subjective strength of an audio signal.
+</dd>
+
+</dl>
+
+<h3 id="interDeviceTerms">Inter-device interconnect</h3>
+
+<p>
+Inter-device interconnection technologies connect audio and video components
+between devices and are readily visible at the external connectors. The HAL
+implementer and end user should be aware of these terms.
+</p>
+
+<dl>
+
+<dt>Bluetooth</dt>
+<dd>
+Short range wireless technology. For details on the audio-related
+<a href="http://en.wikipedia.org/wiki/Bluetooth_profile">Bluetooth profiles</a>
+and
+<a href="http://en.wikipedia.org/wiki/Bluetooth_protocols">Bluetooth protocols</a>,
+refer to <a href="http://en.wikipedia.org/wiki/Bluetooth_profile#Advanced_Audio_Distribution_Profile_.28A2DP.29">A2DP</a> for
+music, <a href="http://en.wikipedia.org/wiki/Bluetooth_protocols#Synchronous_connection-oriented_.28SCO.29_link">SCO</a> for telephony, and <a href="http://en.wikipedia.org/wiki/List_of_Bluetooth_profiles#Audio.2FVideo_Remote_Control_Profile_.28AVRCP.29">Audio/Video Remote Control Profile (AVRCP)</a>.
+</dd>
+
+<dt>DisplayPort</dt>
+<dd>
+Digital display interface by the Video Electronics Standards Association (VESA).
+</dd>
+
+<dt>dongle</dt>
+<dd>
+A <a href="https://en.wikipedia.org/wiki/Dongle">dongle</a>
+is a small gadget, especially one that hangs off another device.
+</dd>
+
+<dt>HDMI</dt>
+<dd>
+High-Definition Multimedia Interface. Interface for transferring audio and
+video data. For mobile devices, a micro-HDMI (type D) or MHL connector is used.
+</dd>
+
+<dt>Intel HDA</dt>
+<dd>
+Intel High Definition Audio (do not confuse with generic <em>high-definition
+audio</em> or <em>high-resolution audio</em>). Specification for a front-panel
+connector. For details, refer to
+<a href="http://en.wikipedia.org/wiki/Intel_High_Definition_Audio">Intel High
+Definition Audio</a>.
+</dd>
+
+<dt>interface</dt>
+<dd>
+An <a href="https://en.wikipedia.org/wiki/Interface_(computing)">interface</a>
+converts a signal from one representation to another. Common interfaces
+include a USB audio interface and MIDI interface.
+</dd>
+
+<dt>line level</dt>
+<dd>
+<a href="http://en.wikipedia.org/wiki/Line_level">Line level</a> is the strength
+of an analog audio signal that passes between audio components, not transducers.
+</dd>
+
+<dt>MHL</dt>
+<dd>
+Mobile High-Definition Link. Mobile audio/video interface, often over micro-USB
+connector.
+</dd>
+
+<dt>phone connector</dt>
+<dd>
+Mini or sub-mini component that connects a device to wired headphones, headset,
+or line-level amplifier.
+</dd>
+
+<dt>SlimPort</dt>
+<dd>
+Adapter from micro-USB to HDMI.
+</dd>
+
+<dt>S/PDIF</dt>
+<dd>
+Sony/Philips Digital Interface Format. Interconnect for uncompressed PCM. For
+details, refer to <a href="http://en.wikipedia.org/wiki/S/PDIF">S/PDIF</a>.
+S/PDIF is the consumer grade variant of <a href="https://en.wikipedia.org/wiki/AES3">AES3</a>.
+</dd>
+
+<dt>Thunderbolt</dt>
+<dd>
+Multimedia interface that competes with USB and HDMI for connecting to high-end
+peripherals. For details, refer to <a href="http://en.wikipedia.org/wiki/Thunderbolt_%28interface%29">Thunderbolt</a>.
+</dd>
+
+<dt>TOSLINK</dt>
+<dd>
+<a href="https://en.wikipedia.org/wiki/TOSLINK">TOSLINK</a> is an optical audio cable
+used with <em>S/PDIF</em>.
+</dd>
+
+<dt>USB</dt>
+<dd>
+Universal Serial Bus. For details, refer to
+<a href="http://en.wikipedia.org/wiki/USB">USB</a>.
+</dd>
+
+</dl>
+
+<h3 id="intraDeviceTerms">Intra-device interconnect</h3>
+
+<p>
+Intra-device interconnection technologies connect internal audio components
+within a given device and are not visible without disassembling the device. The
+HAL implementer may need to be aware of these, but not the end user. For details
+on intra-device interconnections, refer to the following articles:
+</p>
+<ul>
+<li><a href="http://en.wikipedia.org/wiki/General-purpose_input/output">GPIO</a></li>
+<li><a href="http://en.wikipedia.org/wiki/I%C2%B2C">I²C</a>, for control channel</li>
+<li><a href="http://en.wikipedia.org/wiki/I%C2%B2S">I²S</a>, for audio data, simpler than SLIMbus</li>
+<li><a href="http://en.wikipedia.org/wiki/McASP">McASP</a></li>
+<li><a href="http://en.wikipedia.org/wiki/SLIMbus">SLIMbus</a></li>
+<li><a href="http://en.wikipedia.org/wiki/Serial_Peripheral_Interface_Bus">SPI</a></li>
+<li><a href="http://en.wikipedia.org/wiki/AC%2797">AC'97</a></li>
+<li><a href="http://en.wikipedia.org/wiki/Intel_High_Definition_Audio">Intel HDA</a></li>
+<li><a href="http://mipi.org/specifications/soundwire">SoundWire</a></li>
+</ul>
+
+<p>
+In
+<a href="http://www.alsa-project.org/main/index.php/ASoC">ALSA System on Chip (ASoC)</a>,
+these are collectively called
+<a href="https://www.kernel.org/doc/Documentation/sound/alsa/soc/DAI.txt">Digital Audio Interfaces</a>
+(DAI).
+</p>
+
+<h3 id="signalTerms">Audio Signal Path</h3>
+
+<p>
+Audio signal path terms relate to the signal path that audio data follows from
+an application to the transducer or vice-versa.
+</p>
+
+<dl>
+
+<dt>ADC</dt>
+<dd>
+Analog-to-digital converter. Module that converts an analog signal (continuous
+in time and amplitude) to a digital signal (discrete in time and amplitude).
+Conceptually, an ADC consists of a periodic sample-and-hold followed by a
+quantizer, although it does not have to be implemented that way. An ADC is
+usually preceded by a low-pass filter to remove any high frequency components
+that are not representable using the desired sample rate. For details, refer to
+<a href="http://en.wikipedia.org/wiki/Analog-to-digital_converter">Analog-to-digital
+converter</a>.
+</dd>
+
+<dt>AP</dt>
+<dd>
+Application processor. Main general-purpose computer on a mobile device.
+</dd>
+
+<dt>codec</dt>
+<dd>
+Coder-decoder. Module that encodes and/or decodes an audio signal from one
+representation to another (typically analog to PCM or PCM to analog). In strict
+terms, <em>codec</em> is reserved for modules that both encode and decode but
+can be used loosely to refer to only one of these. For details, refer to
+<a href="http://en.wikipedia.org/wiki/Audio_codec">Audio codec</a>.
+</dd>
+
+<dt>DAC</dt>
+<dd>
+Digital-to-analog converter. Module that converts a digital signal (discrete in
+time and amplitude) to an analog signal (continuous in time and amplitude).
+Often followed by a low-pass filter to remove high-frequency components
+introduced by digital quantization. For details, refer to
+<a href="http://en.wikipedia.org/wiki/Digital-to-analog_converter">Digital-to-analog
+converter</a>.
+</dd>
+
+<dt>DSP</dt>
+<dd>
+Digital Signal Processor. Optional component typically located after the
+application processor (for output) or before the application processor (for
+input). Primary purpose is to off-load the application processor and provide
+signal processing features at a lower power cost.
+</dd>
+
+<dt>PDM</dt>
+<dd>
+Pulse-density modulation. Form of modulation used to represent an analog signal
+by a digital signal, where the relative density of 1s versus 0s indicates the
+signal level. Commonly used by digital to analog converters. For details, refer
+to <a href="http://en.wikipedia.org/wiki/Pulse-density_modulation">Pulse-density
+modulation</a>.
+</dd>
+
+<dt>PWM</dt>
+<dd>
+Pulse-width modulation. Form of modulation used to represent an analog signal by
+a digital signal, where the relative width of a digital pulse indicates the
+signal level. Commonly used by analog-to-digital converters. For details, refer
+to <a href="http://en.wikipedia.org/wiki/Pulse-width_modulation">Pulse-width
+modulation</a>.
+</dd>
+
+<dt>transducer</dt>
+<dd>
+Converts variations in physical real-world quantities to electrical signals. In
+audio, the physical quantity is sound pressure, and the transducers are the
+loudspeaker and microphone. For details, refer to
+<a href="http://en.wikipedia.org/wiki/Transducer">Transducer</a>.
+</dd>
+
+</dl>
+
+<h3 id="srcTerms">Sample Rate Conversion</h3>
+<p>
+Sample rate conversion terms relate to the process of converting from one
+sampling rate to another.
+</p>
+
+<dl>
+
+<dt>downsample</dt>
+<dd>Resample, where sink sample rate &lt; source sample rate.</dd>
+
+<dt>Nyquist frequency</dt>
+<dd>
+Maximum frequency component that can be represented by a discretized signal at
+1/2 of a given sample rate. For example, the human hearing range extends to
+approximately 20 kHz, so a digital audio signal must have a sample rate of at
+least 40 kHz to represent that range. In practice, sample rates of 44.1 kHz and
+48 kHz are commonly used, with Nyquist frequencies of 22.05 kHz and 24 kHz
+respectively. For details, refer to
+<a href="http://en.wikipedia.org/wiki/Nyquist_frequency">Nyquist frequency</a>
+and
+<a href="http://en.wikipedia.org/wiki/Hearing_range">Hearing range</a>.
+</dd>
+
+<dt>resampler</dt>
+<dd>Synonym for sample rate converter.</dd>
+
+<dt>resampling</dt>
+<dd>Process of converting sample rate.</dd>
+
+<dt>sample rate converter</dt>
+<dd>Module that resamples.</dd>
+
+<dt>sink</dt>
+<dd>Output of a resampler.</dd>
+
+<dt>source</dt>
+<dd>Input to a resampler.</dd>
+
+<dt>upsample</dt>
+<dd>Resample, where sink sample rate &gt; source sample rate.</dd>
+
+</dl>
+
+<h2 id="androidSpecificTerms">Android-Specific Terms</h2>
+
+<p>
+Android-specific terms include terms used only in the Android audio framework
+and generic terms that have special meaning within Android.
+</p>
+
+<dl>
+
+<dt>ALSA</dt>
+<dd>
+Advanced Linux Sound Architecture. An audio framework for Linux that has also
+influenced other systems. For a generic definition, refer to
+<a href="http://en.wikipedia.org/wiki/Advanced_Linux_Sound_Architecture">ALSA</a>.
+In Android, ALSA refers to the kernel audio framework and drivers and not to the
+user-mode API. See also <em>tinyalsa</em>.
+</dd>
+
+<dt>audio device</dt>
+<dd>
+Audio I/O endpoint backed by a HAL implementation.
+</dd>
+
+<dt>AudioEffect</dt>
+<dd>
+API and implementation framework for output (post-processing) effects and input
+(pre-processing) effects. The API is defined at
+<a href="http://developer.android.com/reference/android/media/audiofx/AudioEffect.html">android.media.audiofx.AudioEffect</a>.
+</dd>
+
+<dt>AudioFlinger</dt>
+<dd>
+Android sound server implementation. AudioFlinger runs within the mediaserver
+process. For a generic definition, refer to
+<a href="http://en.wikipedia.org/wiki/Sound_server">Sound server</a>.
+</dd>
+
+<dt>audio focus</dt>
+<dd>
+Set of APIs for managing audio interactions across multiple independent apps.
+For details, see <a href="http://developer.android.com/training/managing-audio/audio-focus.html">Managing Audio Focus</a> and the focus-related methods and constants of
+<a href="http://developer.android.com/reference/android/media/AudioManager.html">android.media.AudioManager</a>.
+</dd>
+
+<dt>AudioMixer</dt>
+<dd>
+Module in AudioFlinger responsible for combining multiple tracks and applying
+attenuation (volume) and effects. For a generic definition, refer to
+<a href="http://en.wikipedia.org/wiki/Audio_mixing_(recorded_music)">Audio mixing (recorded music)</a> (discusses a mixer as a hardware device or software application, rather
+than a software module within a system).
+</dd>
+
+<dt>audio policy</dt>
+<dd>
+Service responsible for all actions that require a policy decision to be made
+first, such as opening a new I/O stream, re-routing after a change, and stream
+volume management.
+</dd>
+
+<dt>AudioRecord</dt>
+<dd>
+Primary low-level client API for receiving data from an audio input device such
+as a microphone. The data is usually PCM format. The API is defined at
+<a href="http://developer.android.com/reference/android/media/AudioRecord.html">android.media.AudioRecord</a>.
+</dd>
+
+<dt>AudioResampler</dt>
+<dd>
+Module in AudioFlinger responsible for <a href="src.html">sample rate conversion</a>.
+</dd>
+
+<dt>audio source</dt>
+<dd>
+An enumeration of constants that indicates the desired use case for capturing
+audio input. For details, see <a href="http://developer.android.com/reference/android/media/MediaRecorder.AudioSource.html">audio source</a>. At API level 21 and higher,
+<a href="attributes.html">audio attributes</a> are preferred.
+</dd>
+
+<dt>AudioTrack</dt>
+<dd>
+Primary low-level client API for sending data to an audio output device such as
+a speaker. The data is usually in PCM format. The API is defined at
+<a href="http://developer.android.com/reference/android/media/AudioTrack.html">android.media.AudioTrack</a>.
+</dd>
+
+<dt>audio_utils</dt>
+<dd>
+Audio utility library for features such as PCM format conversion, WAV file I/O,
+and
+<a href="avoiding_pi.html#nonBlockingAlgorithms">non-blocking FIFO</a>, which is
+largely independent of the Android platform.
+</dd>
+
+<dt>client</dt>
+<dd>
+Usually an application or app client. However, an AudioFlinger client can be a
+thread running within the mediaserver system process, such as when playing media
+decoded by a MediaPlayer object.
+</dd>
+
+<dt>HAL</dt>
+<dd>
+Hardware Abstraction Layer. HAL is a generic term in Android; in audio, it is a
+layer between AudioFlinger and the kernel device driver with a C API (which
+replaces the C++ libaudio).
+</dd>
+
+<dt>FastCapture</dt>
+<dd>
+Thread within AudioFlinger that sends audio data to lower latency fast tracks
+and drives the input device when configured for reduced latency.
+</dd>
+
+<dt>FastMixer</dt>
+<dd>
+Thread within AudioFlinger that receives and mixes audio data from lower latency
+fast tracks and drives the primary output device when configured for reduced
+latency.
+</dd>
+
+<dt>fast track</dt>
+<dd>
+AudioTrack or AudioRecord client with lower latency but fewer features on some
+devices and routes.
+</dd>
+
+<dt>MediaPlayer</dt>
+<dd>
+Higher-level client API than AudioTrack. Plays encoded content or content that
+includes multimedia audio and video tracks.
+</dd>
+
+<dt>media.log</dt>
+<dd>
+AudioFlinger debugging feature available in custom builds only. Used for logging
+audio events to a circular buffer where they can then be retroactively dumped
+when needed.
+</dd>
+
+<dt>mediaserver</dt>
+<dd>
+Android system process that contains media-related services, including
+AudioFlinger.
+</dd>
+
+<dt>NBAIO</dt>
+<dd>
+Non-blocking audio input/output. Abstraction for AudioFlinger ports. The term
+can be misleading as some implementations of the NBAIO API support blocking. The
+key implementations of NBAIO are for different types of pipes.
+</dd>
+
+<dt>normal mixer</dt>
+<dd>
+Thread within AudioFlinger that services most full-featured AudioTrack clients.
+Directly drives an output device or feeds its sub-mix into FastMixer via a pipe.
+</dd>
+
+<dt>OpenSL ES</dt>
+<dd>
+Audio API standard by
+<a href="http://www.khronos.org/">The Khronos Group</a>. Android versions since
+API level 9 support a native audio API that is based on a subset of
+<a href="http://www.khronos.org/opensles/">OpenSL ES 1.0.1</a>.
+</dd>
+
+<dt>silent mode</dt>
+<dd>
+User-settable feature to mute the phone ringer and notifications without
+affecting media playback (music, videos, games) or alarms.
+</dd>
+
+<dt>SoundPool</dt>
+<dd>
+Higher-level client API than AudioTrack. Plays sampled audio clips. Useful for
+triggering UI feedback, game sounds, etc. The API is defined at
+<a href="http://developer.android.com/reference/android/media/SoundPool.html">android.media.SoundPool</a>.
+</dd>
+
+<dt>Stagefright</dt>
+<dd>
+See <a href="{@docRoot}devices/media.html">Media</a>.
+</dd>
+
+<dt>StateQueue</dt>
+<dd>
+Module within AudioFlinger responsible for synchronizing state among threads.
+Whereas NBAIO is used to pass data, StateQueue is used to pass control
+information.
+</dd>
+
+<dt>strategy</dt>
+<dd>
+Group of stream types with similar behavior. Used by the audio policy service.
+</dd>
+
+<dt>stream type</dt>
+<dd>
+Enumeration that expresses a use case for audio output. The audio policy
+implementation uses the stream type, along with other parameters, to determine
+volume and routing decisions. For a list of stream types, see
+<a href="http://developer.android.com/reference/android/media/AudioManager.html">android.media.AudioManager</a>.
+</dd>
+
+<dt>tee sink</dt>
+<dd>
+See <a href="debugging.html#teeSink">Audio Debugging</a>.
+</dd>
+
+<dt>tinyalsa</dt>
+<dd>
+Small user-mode API above ALSA kernel with BSD license. Recommended for HAL
+implementations.
+</dd>
+
+<dt>ToneGenerator</dt>
+<dd>
+Higher-level client API than AudioTrack. Plays dual-tone multi-frequency (DTMF)
+signals. For details, refer to
+<a href="http://en.wikipedia.org/wiki/Dual-tone_multi-frequency_signaling">Dual-tone
+multi-frequency signaling</a> and the API definition at
+<a href="http://developer.android.com/reference/android/media/ToneGenerator.html">android.media.ToneGenerator</a>.
+</dd>
+
+<dt>track</dt>
+<dd>
+Audio stream. Controlled by the AudioTrack or AudioRecord API.
+</dd>
+
+<dt>volume attenuation curve</dt>
+<dd>
+Device-specific mapping from a generic volume index to a specific attenuation
+factor for a given output.
+</dd>
+
+<dt>volume index</dt>
+<dd>
+Unitless integer that expresses the desired relative volume of a stream. The
+volume-related APIs of
+<a href="http://developer.android.com/reference/android/media/AudioManager.html">android.media.AudioManager</a>
+operate in volume indices rather than absolute attenuation factors.
+</dd>
+
+</dl>
diff --git a/en/devices/audio/testing_circuit.html b/en/devices/audio/testing_circuit.html
new file mode 100644
index 00000000..1881e0c8
--- /dev/null
+++ b/en/devices/audio/testing_circuit.html
@@ -0,0 +1,94 @@
+page.title=Light Testing Circuit
+@jd:body
+
+<!--
+ Copyright 2014 The Android Open Source Project
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
+-->
+<div id="qv-wrapper">
+ <div id="qv">
+ <h2>In this document</h2>
+ <ol id="auto-toc">
+ </ol>
+ </div>
+</div>
+
+<p>
+The file <a href="http://developer.android.com/downloads/partner/audio/av_sync_board.zip">av_sync_board.zip</a>
+contains CAD files for an A/V sync and latency testing
+printed circuit board (PCB).
+The files include a fabrication drawing, EAGLE CAD, schematic, and BOM. See <a
+href="latency.html">Audio Latency</a> for recommended testing methods.
+</p>
+
+<p>
+This PCB
+can be used to help measure the time between flashing the device's
+notification LED or screen backlight and detecting an audio signal.
+When combined with a dual-channel oscilloscope and a suitable test app,
+it can show the difference in time between detecting the light and detecting the audio.
+This assumes that the response times of the LED or backlight and of the light detector
+are negligible relative to the audio latency being measured.
+</p>
+
+<p>
+This design is supplied "as is", and we are not responsible for any errors in the design.
+If you have any suggestions for improvement, please post to the <a
+href="https://groups.google.com/forum/#!forum/android-porting">android-porting</a> group.
+</p>
+
+<p>
+Of course, this is not the only (or necessarily best) way to measure A/V sync and latency,
+and we would like to hear about your alternative methods, also at the android-porting group.
+</p>
+
+<p>
+There are currently no compatibility requirements to use this particular PCB.
+We supply it to encourage your continued attention to audio performance.
+</p>
+
+<h2 id="images">Images</h2>
+
+<p>
+These photos show the circuit in action.
+</p>
+
+<img style="margin:1.5em auto" src="images/breadboard.jpg" alt="breadboard prototype" id="figure1" />
+<p class="img-caption">
+ <strong>Figure 1.</strong> Breadboard prototype
+</p>
+
+<img style="margin:1.5em auto" src="images/pcb.jpg" alt="an early run of the PCB" id="figure2" />
+<p class="img-caption">
+ <strong>Figure 2.</strong> An early run of the PCB
+</p>
+
+<img style="margin:1.5em auto" src="images/display.jpg" alt="example display" id="figure3" />
+<p class="img-caption">
+ <strong>Figure 3.</strong> Example display
+</p>
+
+<p>
+This image
+shows the scope display for an unspecified device, software release, and test conditions;
+the results are not typical and cannot be used to extrapolate to other situations.
+</p>
+
+<h2 id="video">Video</h2>
+
+<p>
+This <a href="http://www.youtube.com/watch?v=f95S2IILBJY">YouTube video</a>
+shows the breadboard version of the testing circuit in operation.
+Skip ahead to 1:00 to see the circuit.
+</p>
diff --git a/en/devices/audio/tv.html b/en/devices/audio/tv.html
new file mode 100644
index 00000000..9f7afc81
--- /dev/null
+++ b/en/devices/audio/tv.html
@@ -0,0 +1,302 @@
+page.title=TV Audio
+@jd:body
+
+<!--
+ Copyright 2015 The Android Open Source Project
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
+-->
+<div id="qv-wrapper">
+ <div id="qv">
+ <h2>In this document</h2>
+ <ol id="auto-toc">
+ </ol>
+ </div>
+</div>
+
+<p>The TV Input Framework (TIF) manager works with the audio routing API to support flexible audio
+path changes. When a System on Chip (SoC) implements the TV hardware abstraction layer (HAL), each
+TV input (HDMI IN, Tuner, etc.) provides <code>TvInputHardwareInfo</code> that specifies AudioPort information for audio type and address.</p>
+
+<ul>
+<li><b>Physical</b> audio input/output devices have a corresponding AudioPort.</li>
+<li><b>Software</b> audio output/input streams are represented as AudioMixPort (child class of
+AudioPort).</li>
+</ul>
+
+<p>The TIF then uses AudioPort information for the audio routing API.</p>
+
+<p><img src="images/ape_audio_tv_tif.png" alt="Android TV Input Framework (TIF)" /></p>
+<p class="img-caption"><strong>Figure 1.</strong> TV Input Framework (TIF)</p>
+
+<h2 id="requirements">Requirements</h2>
+
+<p>A SoC must implement the audio HAL with the following audio routing API support:</p>
+
+<table>
+<tbody>
+<tr>
+<th>Audio Ports</th>
+<td>
+<ul>
+<li>TV Audio Input has a corresponding audio source port implementation.</li>
+<li>TV Audio Output has a corresponding audio sink port implementation.</li>
+<li>Can create audio patch between any TV input audio port and any TV output audio port.</li>
+</ul>
+</td>
+</tr>
+<tr>
+<th>Default Input</th>
+<td>AudioRecord (created with DEFAULT input source) must seize <i>virtual null input source</i> for
+AUDIO_DEVICE_IN_DEFAULT acquisition on Android TV.</td>
+</tr>
+<tr>
+<th>Device Loopback</th>
+<td>Requires support for an AUDIO_DEVICE_IN_LOOPBACK input that is a complete mix of all TV audio
+output (11 kHz 16-bit mono or 48 kHz 16-bit mono). Used only for audio capture.
+</td>
+</tr>
+</tbody>
+</table>
+
+
+<h2 id="audioDevices">TV audio devices</h2>
+
+<p>Android supports the following audio devices for TV audio input/output.</p>
+
+<h4>system/media/audio/include/system/audio.h</h4>
+
+<p class="note"><strong>Note:</strong> In Android 5.1 and earlier, the path to
+this file is: <code>system/core/include/system/audio.h</code></p>
+
+<pre>
+/* output devices */
+AUDIO_DEVICE_OUT_AUX_DIGITAL = 0x400,
+AUDIO_DEVICE_OUT_HDMI = AUDIO_DEVICE_OUT_AUX_DIGITAL,
+/* HDMI Audio Return Channel */
+AUDIO_DEVICE_OUT_HDMI_ARC = 0x40000,
+/* S/PDIF out */
+AUDIO_DEVICE_OUT_SPDIF = 0x80000,
+/* input devices */
+AUDIO_DEVICE_IN_AUX_DIGITAL = AUDIO_DEVICE_BIT_IN | 0x20,
+AUDIO_DEVICE_IN_HDMI = AUDIO_DEVICE_IN_AUX_DIGITAL,
+/* TV tuner input */
+AUDIO_DEVICE_IN_TV_TUNER = AUDIO_DEVICE_BIT_IN | 0x4000,
+/* S/PDIF in */
+AUDIO_DEVICE_IN_SPDIF = AUDIO_DEVICE_BIT_IN | 0x10000,
+AUDIO_DEVICE_IN_LOOPBACK = AUDIO_DEVICE_BIT_IN | 0x40000,
+</pre>
+
+
+<h2 id="halExtension">Audio HAL extension</h2>
+
+<p>The Audio HAL extension for the audio routing API is defined by following:</p>
+
+<h4>system/media/audio/include/system/audio.h</h4>
+
+<p class="note"><strong>Note:</strong> In Android 5.1 and earlier, the path to
+this file is: <code>system/core/include/system/audio.h</code></p>
+
+<pre>
+/* audio port configuration structure used to specify a particular configuration of an audio port */
+struct audio_port_config {
+ audio_port_handle_t id; /* port unique ID */
+ audio_port_role_t role; /* sink or source */
+ audio_port_type_t type; /* device, mix ... */
+ unsigned int config_mask; /* e.g AUDIO_PORT_CONFIG_ALL */
+ unsigned int sample_rate; /* sampling rate in Hz */
+ audio_channel_mask_t channel_mask; /* channel mask if applicable */
+ audio_format_t format; /* format if applicable */
+ struct audio_gain_config gain; /* gain to apply if applicable */
+ union {
+ struct audio_port_config_device_ext device; /* device specific info */
+ struct audio_port_config_mix_ext mix; /* mix specific info */
+ struct audio_port_config_session_ext session; /* session specific info */
+ } ext;
+};
+struct audio_port {
+ audio_port_handle_t id; /* port unique ID */
+ audio_port_role_t role; /* sink or source */
+ audio_port_type_t type; /* device, mix ... */
+ unsigned int num_sample_rates; /* number of sampling rates in following array */
+ unsigned int sample_rates[AUDIO_PORT_MAX_SAMPLING_RATES];
+ unsigned int num_channel_masks; /* number of channel masks in following array */
+ audio_channel_mask_t channel_masks[AUDIO_PORT_MAX_CHANNEL_MASKS];
+ unsigned int num_formats; /* number of formats in following array */
+ audio_format_t formats[AUDIO_PORT_MAX_FORMATS];
+ unsigned int num_gains; /* number of gains in following array */
+ struct audio_gain gains[AUDIO_PORT_MAX_GAINS];
+ struct audio_port_config active_config; /* current audio port configuration */
+ union {
+ struct audio_port_device_ext device;
+ struct audio_port_mix_ext mix;
+ struct audio_port_session_ext session;
+ } ext;
+};
+</pre>
+
+<h4>hardware/libhardware/include/hardware/audio.h</h4>
+
+<pre>
+struct audio_hw_device {
+ :
+ /**
+ * Routing control
+ */
+
+ /* Creates an audio patch between several source and sink ports.
+ * The handle is allocated by the HAL and should be unique for this
+ * audio HAL module. */
+ int (*create_audio_patch)(struct audio_hw_device *dev,
+ unsigned int num_sources,
+ const struct audio_port_config *sources,
+ unsigned int num_sinks,
+ const struct audio_port_config *sinks,
+ audio_patch_handle_t *handle);
+
+ /* Release an audio patch */
+ int (*release_audio_patch)(struct audio_hw_device *dev,
+ audio_patch_handle_t handle);
+
+ /* Fills the list of supported attributes for a given audio port.
+ * As input, "port" contains the information (type, role, address etc...)
+ * needed by the HAL to identify the port.
+ * As output, "port" contains possible attributes (sampling rates, formats,
+ * channel masks, gain controllers...) for this port.
+ */
+ int (*get_audio_port)(struct audio_hw_device *dev,
+ struct audio_port *port);
+
+ /* Set audio port configuration */
+ int (*set_audio_port_config)(struct audio_hw_device *dev,
+ const struct audio_port_config *config);
+</pre>
+
+<h2 id="testing">Testing DEVICE_IN_LOOPBACK</h2>
+
+<p>To test DEVICE_IN_LOOPBACK for TV monitoring, use the following testing code. After running the
+test, the captured audio is saved to <code>/sdcard/record_loopback.raw</code>, and you can listen to
+it using <code>ffmpeg</code>.</p>
+
+<pre>
+&lt;uses-permission android:name="android.permission.MODIFY_AUDIO_ROUTING" /&gt;
+&lt;uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" /&gt;
+
+ AudioRecord mRecorder;
+ Handler mHandler = new Handler();
+ int mMinBufferSize = AudioRecord.getMinBufferSize(RECORD_SAMPLING_RATE,
+ AudioFormat.CHANNEL_IN_MONO,
+                                          AudioFormat.ENCODING_PCM_16BIT);
+ static final int RECORD_SAMPLING_RATE = 48000;
+ public void doCapture() {
+ mRecorder = new AudioRecord(MediaRecorder.AudioSource.DEFAULT, RECORD_SAMPLING_RATE,
+ AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, mMinBufferSize * 10);
+ AudioManager am = (AudioManager) getSystemService(Context.AUDIO_SERVICE);
+ ArrayList&lt;AudioPort&gt; audioPorts = new ArrayList&lt;AudioPort&gt;();
+ am.listAudioPorts(audioPorts);
+ AudioPortConfig srcPortConfig = null;
+ AudioPortConfig sinkPortConfig = null;
+ for (AudioPort audioPort : audioPorts) {
+ if (srcPortConfig == null
+ &amp;&amp; audioPort.role() == AudioPort.ROLE_SOURCE
+ &amp;&amp; audioPort instanceof AudioDevicePort) {
+ AudioDevicePort audioDevicePort = (AudioDevicePort) audioPort;
+ if (audioDevicePort.type() == AudioManager.DEVICE_IN_LOOPBACK) {
+ srcPortConfig = audioPort.buildConfig(48000, AudioFormat.CHANNEL_IN_DEFAULT,
+ AudioFormat.ENCODING_DEFAULT, null);
+ Log.d(LOG_TAG, "Found loopback audio source port : " + audioPort);
+ }
+ }
+ else if (sinkPortConfig == null
+ &amp;&amp; audioPort.role() == AudioPort.ROLE_SINK
+ &amp;&amp; audioPort instanceof AudioMixPort) {
+ sinkPortConfig = audioPort.buildConfig(48000, AudioFormat.CHANNEL_OUT_DEFAULT,
+ AudioFormat.ENCODING_DEFAULT, null);
+ Log.d(LOG_TAG, "Found recorder audio mix port : " + audioPort);
+ }
+ }
+ if (srcPortConfig != null &amp;&amp; sinkPortConfig != null) {
+ AudioPatch[] patches = new AudioPatch[] { null };
+ int status = am.createAudioPatch(
+ patches,
+ new AudioPortConfig[] { srcPortConfig },
+ new AudioPortConfig[] { sinkPortConfig });
+ Log.d(LOG_TAG, "Result of createAudioPatch(): " + status);
+ }
+ mRecorder.startRecording();
+ processAudioData();
+ mRecorder.stop();
+ mRecorder.release();
+ }
+ private void processAudioData() {
+ OutputStream rawFileStream = null;
+ byte data[] = new byte[mMinBufferSize];
+ try {
+ rawFileStream = new BufferedOutputStream(
+ new FileOutputStream(new File("/sdcard/record_loopback.raw")));
+ } catch (FileNotFoundException e) {
+ Log.d(LOG_TAG, "Can't open file.", e);
+ }
+ long startTimeMs = System.currentTimeMillis();
+ while (System.currentTimeMillis() - startTimeMs &lt; 5000) {
+ int nbytes = mRecorder.read(data, 0, mMinBufferSize);
+ if (nbytes &lt;= 0) {
+ continue;
+ }
+ try {
+ rawFileStream.write(data);
+ } catch (IOException e) {
+ Log.e(LOG_TAG, "Error on writing raw file.", e);
+ }
+ }
+ try {
+ rawFileStream.close();
+ } catch (IOException e) {
+ }
+ Log.d(LOG_TAG, "Exit audio recording.");
+ }
+</pre>
+
+<p>Locate the captured audio file in <code>/sdcard/record_loopback.raw</code> and listen to it using
+<code>ffmpeg</code>:</p>
+
+<pre>
+adb pull /sdcard/record_loopback.raw
+ffmpeg -f s16le -ar 48k -ac 1 -i record_loopback.raw record_loopback.wav
+ffplay record_loopback.wav
+</pre>
+
+<h2 id="useCases">Use cases</h2>
+
+<p>This section includes common use cases for TV audio.</p>
+
+<h3 id="tvSpeaker">TV tuner with speaker output</h3>
+
+<p>When a TV tuner becomes active, the audio routing API creates an audio patch between the tuner
+and the default output (e.g., the speaker). The tuner output does not require decoding, but the
+final audio output is mixed with the software output_stream.</p>
+
+<img src="images/ape_audio_tv_tuner.png" alt="Android TV Tuner Audio Patch" />
+<p class="img-caption">
+<strong>Figure 2.</strong> Audio Patch for TV tuner with speaker output.</p>
+
+
+<h3 id="hdmiOut">HDMI OUT during live TV</h3>
+
+<p>A user is watching live TV and then switches to the HDMI audio output
+(Intent.ACTION_HDMI_AUDIO_PLUG). The output device of all output_streams changes to the HDMI_OUT
+port, and the TIF manager changes the sink port of the existing tuner audio patch to the HDMI_OUT port.</p>
+
+<img src="images/ape_audio_tv_hdmi_tuner.png" alt="Android TV HDMI-OUT Audio Patch" />
+<p class="img-caption">
+<strong>Figure 3.</strong> Audio Patch for HDMI OUT from live TV.</p>
diff --git a/en/devices/audio/usb.html b/en/devices/audio/usb.html
new file mode 100644
index 00000000..bb0bb69e
--- /dev/null
+++ b/en/devices/audio/usb.html
@@ -0,0 +1,632 @@
+page.title=USB Digital Audio
+@jd:body
+
+<!--
+ Copyright 2014 The Android Open Source Project
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
+-->
+<div id="qv-wrapper">
+ <div id="qv">
+ <h2>In this document</h2>
+ <ol id="auto-toc">
+ </ol>
+ </div>
+</div>
+
+<p>
+This article reviews Android support for USB digital audio and related
+USB-based protocols.
+</p>
+
+<h3 id="audience">Audience</h3>
+
+<p>
+The target audience of this article is Android device OEMs, SoC vendors,
+USB audio peripheral suppliers, advanced audio application developers,
+and others seeking detailed understanding of USB digital audio internals on Android.
+</p>
+
+<p>
+End users of Nexus devices should see the article
+<a href="https://support.google.com/nexus/answer/6127700">Record and play back audio using USB host mode</a>
+at the
+<a href="https://support.google.com/nexus/">Nexus Help Center</a> instead.
+Though this article is not oriented towards end users,
+certain audiophile consumers may find portions of interest.
+</p>
+
+<h2 id="overview">Overview of USB</h2>
+
+<p>
+Universal Serial Bus (USB) is informally described in the Wikipedia article
+<a href="http://en.wikipedia.org/wiki/USB">USB</a>,
+and is formally defined by the standards published by the
+<a href="http://www.usb.org/">USB Implementers Forum, Inc</a>.
+For convenience, we summarize the key USB concepts here,
+but the standards are the authoritative reference.
+</p>
+
+<h3 id="terminology">Basic concepts and terminology</h3>
+
+<p>
+USB is a <a href="http://en.wikipedia.org/wiki/Bus_(computing)">bus</a>
+with a single initiator of data transfer operations, called the <i>host</i>.
+The host communicates with
+<a href="http://en.wikipedia.org/wiki/Peripheral">peripherals</a> via the bus.
+</p>
+
+<p class="note"><strong>Note:</strong> The terms <i>device</i> and <i>accessory</i> are common synonyms for
+<i>peripheral</i>. We avoid those terms here, as they could be confused with
+Android <a href="http://en.wikipedia.org/wiki/Mobile_device">device</a>
+or the Android-specific concept called
+<a href="http://developer.android.com/guide/topics/connectivity/usb/accessory.html">accessory mode</a>.
+</p>
+
+<p>
+A critical host role is <i>enumeration</i>:
+the process of detecting which peripherals are connected to the bus,
+and querying their properties expressed via <i>descriptors</i>.
+</p>
+
+<p>
+A peripheral may be one physical object
+but actually implement multiple logical <i>functions</i>.
+For example, a webcam peripheral could have both a camera function and a
+microphone audio function.
+</p>
+
+<p>
+Each peripheral function has an <i>interface</i> that
+defines the protocol to communicate with that function.
+</p>
+
+<p>
+The host communicates with a peripheral over a
+<a href="http://en.wikipedia.org/wiki/Stream_(computing)">pipe</a>
+to an <a href="http://en.wikipedia.org/wiki/Communication_endpoint">endpoint</a>,
+a data source or sink
+provided by one of the peripheral's functions.
+</p>
+
+<p>
+There are two kinds of pipes: <i>message</i> and <i>stream</i>.
+A message pipe is used for bi-directional control and status.
+A stream pipe is used for uni-directional data transfer.
+</p>
+
+<p>
+The host initiates all data transfers,
+hence the terms <i>input</i> and <i>output</i> are expressed relative to the host.
+An input operation transfers data from the peripheral to the host,
+while an output operation transfers data from the host to the peripheral.
+</p>
+
+<p>
+There are three major data transfer modes:
+<i>interrupt</i>, <i>bulk</i>, and <i>isochronous</i>.
+Isochronous mode will be discussed further in the context of audio.
+</p>
+
+<p>
+The peripheral may have <i>terminals</i> that connect to the outside world,
+beyond the peripheral itself. In this way, the peripheral serves
+to translate between USB protocol and "real world" signals.
+The terminals are logical objects of the function.
+</p>
+
+<h2 id="androidModes">Android USB modes</h2>
+
+<h3 id="developmentMode">Development mode</h3>
+
+<p>
+<i>Development mode</i> has been present since the initial release of Android.
+The Android device appears as a USB peripheral
+to a host PC running a desktop operating system such as Linux,
+Mac OS X, or Windows. The only visible peripheral function is either
+<a href="http://en.wikipedia.org/wiki/Android_software_development#Fastboot">Android fastboot</a>
+or
+<a href="http://developer.android.com/tools/help/adb.html">Android Debug Bridge (adb)</a>.
+The fastboot and adb protocols are layered over USB bulk data transfer mode.
+</p>
+
+<h3 id="hostMode">Host mode</h3>
+
+<p>
+<i>Host mode</i> was introduced in Android 3.1 (API level 12).
+</p>
+
+<p>
+As the Android device must act as host, and most Android devices include
+a micro-USB connector that does not directly permit host operation,
+an on-the-go (<a href="http://en.wikipedia.org/wiki/USB_On-The-Go">OTG</a>) adapter
+such as this is usually required:
+</p>
+
+<img src="images/otg.jpg" style="image-orientation: 90deg;" height="50%" width="50%" alt="OTG" id="figure1" />
+<p class="img-caption">
+ <strong>Figure 1.</strong> On-the-go (OTG) adapter
+</p>
+
+
+<p>
+An Android device might not provide sufficient power to operate a
+particular peripheral, depending on how much power the peripheral needs,
+and how much the Android device is capable of supplying. Even if
+adequate power is available, the Android device's battery charge may
+be depleted significantly faster. For these situations, use a powered
+<a href="http://en.wikipedia.org/wiki/USB_hub">hub</a> such as this:
+</p>
+
+<img src="images/hub.jpg" alt="Powered hub" id="figure2" />
+<p class="img-caption">
+ <strong>Figure 2.</strong> Powered hub
+</p>
+
+<h3 id="accessoryMode">Accessory mode</h3>
+
+<p>
+<i>Accessory mode</i> was introduced in Android 3.1 (API level 12) and back-ported to Android 2.3.4.
+In this mode, the Android device operates as a USB peripheral,
+under the control of another device such as a dock that serves as host.
+The difference between development mode and accessory mode
+is that additional USB functions are visible to the host, beyond adb.
+The Android device begins in development mode and then
+transitions to accessory mode via a re-negotiation process.
+</p>
+
+<p>
+Accessory mode was extended with additional features in Android 4.1,
+in particular audio described below.
+</p>
+
+<h2 id="usbAudio">USB audio</h2>
+
+<h3 id="class">USB classes</h3>
+
+<p>
+Each peripheral function has an associated <i>device class</i> document
+that specifies the standard protocol for that function.
+This enables <i>class compliant</i> hosts and peripheral functions
+to inter-operate, without detailed knowledge of each other's workings.
+Class compliance is critical if the host and peripheral are provided by
+different entities.
+</p>
+
+<p>
+The term <i>driverless</i> is a common synonym for <i>class compliant</i>,
+indicating that it is possible to use the standard features of such a
+peripheral without requiring an operating-system specific
+<a href="http://en.wikipedia.org/wiki/Device_driver">driver</a> to be installed.
+One can assume that a peripheral advertised as "no driver needed"
+for major desktop operating systems
+will be class compliant, though there may be exceptions.
+</p>
+
+<h3 id="audioClass">USB audio class</h3>
+
+<p>
+Here we concern ourselves only with peripherals that implement
+audio functions, and thus adhere to the audio device class. There are two
+editions of the USB audio class specification: class 1 (UAC1) and 2 (UAC2).
+</p>
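+
+<p>
+To make the relationship between enumeration and the audio class concrete, the following
+sketch uses the Android USB host API (<code>android.hardware.usb</code>) to list connected
+peripherals and report which ones expose an audio-class interface. This is an illustrative
+example only; the class name is hypothetical, and the code is not part of the USB audio class
+specification or of the Android audio HAL.
+</p>
+
+<pre>
+import android.content.Context;
+import android.hardware.usb.UsbConstants;
+import android.hardware.usb.UsbDevice;
+import android.hardware.usb.UsbManager;
+import android.util.Log;
+
+public class UsbAudioEnumerator {
+    /** Logs each connected USB peripheral and whether it exposes an audio-class interface. */
+    public static void logAudioPeripherals(Context context) {
+        UsbManager usbManager = (UsbManager) context.getSystemService(Context.USB_SERVICE);
+        for (UsbDevice device : usbManager.getDeviceList().values()) {
+            boolean hasAudioFunction = false;
+            for (int i = 0; i &lt; device.getInterfaceCount(); i++) {
+                if (device.getInterface(i).getInterfaceClass() == UsbConstants.USB_CLASS_AUDIO) {
+                    hasAudioFunction = true;
+                }
+            }
+            Log.d("UsbAudioEnumerator", device.getDeviceName()
+                    + (hasAudioFunction ? " has" : " has no") + " audio-class interface");
+        }
+    }
+}
+</pre>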
+
+<h3 id="otherClasses">Comparison with other classes</h3>
+
+<p>
+USB includes many other device classes, some of which may be confused
+with the audio class. The
+<a href="http://en.wikipedia.org/wiki/USB_mass_storage_device_class">mass storage class</a>
+(MSC) is used for
+sector-oriented access to media, while
+<a href="http://en.wikipedia.org/wiki/Media_Transfer_Protocol">Media Transfer Protocol</a>
+(MTP) is for full file access to media.
+Both MSC and MTP may be used for transferring audio files,
+but only USB audio class is suitable for real-time streaming.
+</p>
+
+<h3 id="audioTerminals">Audio terminals</h3>
+
+<p>
+The terminals of an audio peripheral are typically analog.
+The analog signal presented at the peripheral's input terminal is converted to digital by an
+<a href="http://en.wikipedia.org/wiki/Analog-to-digital_converter">analog-to-digital converter</a>
+(ADC),
+and is carried over USB protocol to be consumed by
+the host. The ADC is a data <i>source</i>
+for the host. Similarly, the host sends a
+digital audio signal over USB protocol to the peripheral, where a
+<a href="http://en.wikipedia.org/wiki/Digital-to-analog_converter">digital-to-analog converter</a>
+(DAC)
+converts it and presents it at an analog output terminal.
+The DAC is a <i>sink</i> for the host.
+</p>
+
+<h3 id="channels">Channels</h3>
+
+<p>
+A peripheral with audio function can include a source terminal, sink terminal, or both.
+Each direction may have one channel (<i>mono</i>), two channels
+(<i>stereo</i>), or more.
+Peripherals with more than two channels are called <i>multichannel</i>.
+It is common to interpret a stereo stream as consisting of
+<i>left</i> and <i>right</i> channels, and by extension to interpret a multichannel stream as having
+spatial locations corresponding to each channel. However, it is also quite appropriate
+(more so for USB audio than for
+<a href="http://en.wikipedia.org/wiki/HDMI">HDMI</a>)
+not to assign any particular
+standard spatial meaning to each channel. In this case, it is up to the
+application and user to define how each channel is used.
+For example, a four-channel USB input stream might have the first three
+channels attached to various microphones within a room, and the final
+channel receiving input from an AM radio.
+</p>
+
+<h3 id="isochronous">Isochronous transfer mode</h3>
+
+<p>
+USB audio uses isochronous transfer mode for its real-time characteristics,
+at the expense of error recovery.
+In isochronous mode, bandwidth is guaranteed, and data transmission
+errors are detected using a cyclic redundancy check (CRC). But there is
+no packet acknowledgement or re-transmission in the event of error.
+</p>
+
+<p>
+Isochronous transmissions occur once per Start Of Frame (SOF) period.
+The SOF period is one millisecond for full-speed, and 125 microseconds
+(one microframe) for high-speed. Each full-speed frame carries up to 1,023 bytes
+of payload, and each high-speed microframe carries up to 1,024 bytes.
+Putting these together, the maximum transfer rate is
+1,023 &times; 1,000 = 1,023,000 bytes per second for full-speed,
+or 1,024 &times; 8,000 = 8,192,000 bytes per second for high-speed.
+This sets a theoretical upper limit on the combined audio
+sample rate, channel count, and bit depth. The practical limit is lower.
+</p>
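+
+<p>
+As a rough illustration only (this is not part of the Android framework),
+the following sketch compares a hypothetical stream configuration against
+these theoretical budgets; practical budgets are lower due to protocol
+overhead and other traffic sharing the bus:
+</p>
+
+<pre>
+public class IsochronousBudget {
+    static final long FULL_SPEED_BYTES_PER_SECOND = 1023L * 1000; // 1,023,000
+    static final long HIGH_SPEED_BYTES_PER_SECOND = 1024L * 8000; // 8,192,000
+
+    public static void main(String[] args) {
+        // Hypothetical stream: 96 kHz, 8 channels, 32-bit (4-byte) samples
+        long required = 96000L * 8 * 4; // 3,072,000 bytes per second
+        System.out.println("Exceeds full-speed budget: "
+                + (required > FULL_SPEED_BYTES_PER_SECOND)); // true
+        System.out.println("Exceeds high-speed budget: "
+                + (required > HIGH_SPEED_BYTES_PER_SECOND)); // false
+    }
+}
+</pre>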
+
+<p>
+Within isochronous mode, there are three sub-modes:
+</p>
+
+<ul>
+<li>Adaptive</li>
+<li>Asynchronous</li>
+<li>Synchronous</li>
+</ul>
+
+<p>
+In adaptive sub-mode, the peripheral sink or source adapts to a potentially varying sample rate
+of the host.
+</p>
+
+<p>
+In asynchronous sub-mode (which relies on explicit or implicit feedback),
+the sink or source determines the sample rate, and the host accommodates.
+The primary theoretical advantage of asynchronous sub-mode is that the source
+or sink USB clock is physically and electrically closer to (and indeed may
+be the same as, or derived from) the clock that drives the DAC or ADC.
+This proximity means that asynchronous sub-mode should be less susceptible
+to clock jitter. In addition, the clock used by the DAC or ADC may be
+designed for higher accuracy and lower drift than the host clock.
+</p>
+
+<p>
+In synchronous sub-mode, a fixed number of bytes is transferred each SOF period.
+The audio sample rate is effectively derived from the USB clock.
+Synchronous sub-mode is not commonly used with audio because both
+host and peripheral are at the mercy of the USB clock.
+</p>
+
+<p>
+The table below summarizes the isochronous sub-modes:
+</p>
+
+<table>
+<tr>
+ <th>Sub-mode</th>
+ <th>Byte count<br />per packet</th>
+ <th>Sample rate<br />determined by</th>
+ <th>Used for audio</th>
+</tr>
+<tr>
+ <td>adaptive</td>
+ <td>variable</td>
+ <td>host</td>
+ <td>yes</td>
+</tr>
+<tr>
+ <td>asynchronous</td>
+ <td>variable</td>
+ <td>peripheral</td>
+ <td>yes</td>
+</tr>
+<tr>
+ <td>synchronous</td>
+ <td>fixed</td>
+ <td>USB clock</td>
+ <td>no</td>
+</tr>
+</table>
+
+<p>
+In practice, the sub-mode does matter, but other factors, such as the
+quality of the peripheral's clock, should also be considered.
+</p>
+
+<h2 id="androidSupport">Android support for USB audio class</h2>
+
+<h3 id="developmentAudio">Development mode</h3>
+
+<p>
+USB audio is not supported in development mode.
+</p>
+
+<h3 id="hostAudio">Host mode</h3>
+
+<p>
+Android 5.0 (API level 21) and later releases support a subset of USB audio class 1 (UAC1) features:
+</p>
+
+<ul>
+<li>The Android device must act as host</li>
+<li>The audio format must be PCM (interface type I)</li>
+<li>The bit depth must be 16, 24, or 32 bits, where for 32-bit depth the
+24 bits of useful audio data are left-justified within the most significant
+bits of the 32-bit word</li>
+<li>The sample rate must be either 48, 44.1, 32, 24, 22.05, 16, 12, 11.025, or 8 kHz</li>
+<li>The channel count must be 1 (mono) or 2 (stereo)</li>
+</ul>
+
+<p>
+The Android framework source code may contain additional code
+beyond the minimum needed to support these features, but this code
+has not been validated, so more advanced features are not yet claimed.
+</p>
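+
+<p>
+An application can confirm that a host-mode USB audio output is attached by
+enumerating audio devices. The following is only a sketch, and assumes
+API level 23 or higher for <code>AudioDeviceInfo</code>:
+</p>
+
+<pre>
+import android.content.Context;
+import android.media.AudioDeviceInfo;
+import android.media.AudioManager;
+
+public class UsbAudioCheck {
+    // Returns true if a host-mode USB audio output device is currently attached.
+    public static boolean hasUsbAudioOutput(Context context) {
+        AudioManager audioManager =
+                (AudioManager) context.getSystemService(Context.AUDIO_SERVICE);
+        for (AudioDeviceInfo device :
+                audioManager.getDevices(AudioManager.GET_DEVICES_OUTPUTS)) {
+            if (device.getType() == AudioDeviceInfo.TYPE_USB_DEVICE) {
+                return true;
+            }
+        }
+        return false;
+    }
+}
+</pre>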
+
+<h3 id="accessoryAudio">Accessory mode</h3>
+
+<p>
+Android 4.1 (API level 16) added limited support for audio playback to the host.
+While in accessory mode, Android automatically routes its audio output to USB.
+That is, the Android device serves as a data source to the host, for example a dock.
+</p>
+
+<p>
+Accessory mode audio has these features:
+</p>
+
+<ul>
+<li>
+The Android device must be controlled by a knowledgeable host that
+can first transition the Android device from development mode to accessory mode,
+and then the host must transfer audio data from the appropriate endpoint.
+Thus the Android device does not appear "driverless" to the host.
+</li>
+<li>The direction must be <i>input</i>, expressed relative to the host</li>
+<li>The audio format must be 16-bit PCM</li>
+<li>The sample rate must be 44.1 kHz</li>
+<li>The channel count must be 2 (stereo)</li>
+</ul>
+
+<p>
+Accessory mode audio has not been widely adopted,
+and is not currently recommended for new designs.
+</p>
+
+<h2 id="applications">Applications of USB digital audio</h2>
+
+<p>
+As the name indicates, the USB digital audio signal is represented
+by a <a href="http://en.wikipedia.org/wiki/Digital_data">digital</a> data stream
+rather than the <a href="http://en.wikipedia.org/wiki/Analog_signal">analog</a>
+signal used by the common TRS mini
+<a href="http://en.wikipedia.org/wiki/Phone_connector_(audio)">headset connector</a>.
+Eventually any digital signal must be converted to analog before it can be heard.
+There are tradeoffs in choosing where to place that conversion.
+</p>
+
+<h3 id="comparison">A tale of two DACs</h3>
+
+<p>
+In the example diagram below, we compare two designs. First, mobile device A
+has an application processor (AP), on-board DAC, and amplifier,
+with an analog TRS connector attached to headphones. Second, mobile
+device B connects over USB to external USB DAC and amplifier C,
+which also drives headphones.
+</p>
+
+<img src="images/dac.png" alt="DAC comparison" id="figure3" />
+<p class="img-caption">
+ <strong>Figure 3.</strong> Comparison of two DACs
+</p>
+
+<p>
+Which design is better? The answer depends on your needs.
+Each has advantages and disadvantages.
+</p>
+<p class="note"><strong>Note:</strong> This is an artificial comparison, since
+a real Android device would probably have both options available.
+</p>
+
+<p>
+The first design, A, is simpler, less expensive, uses less power,
+and will be more reliable assuming otherwise equally reliable components.
+However, audio quality is usually traded off against other requirements.
+For example, if this is a mass-market device, it may be designed to fit
+the needs of the general consumer, not the audiophile.
+</p>
+
+<p>
+In the second design, the external audio peripheral C can be designed for
+higher audio quality and greater power output without impacting the cost of
+the basic mass market Android device B. Yes, it is a more expensive design,
+but the cost is absorbed only by those who want it.
+</p>
+
+<p>
+Mobile devices are notorious for having high-density
+circuit boards, which can result in more opportunities for
+<a href="http://en.wikipedia.org/wiki/Crosstalk_(electronics)">crosstalk</a>
+that degrades adjacent analog signals. Digital communication is less susceptible to
+<a href="http://en.wikipedia.org/wiki/Noise_(electronics)">noise</a>,
+so moving the DAC from the Android device A to an external circuit board
+C allows the final analog stages to be physically and electrically
+isolated from the dense and noisy circuit board, resulting in higher fidelity audio.
+</p>
+
+<p>
+On the other hand,
+the second design is more complex, and with added complexity come more
+opportunities for things to fail. There is also additional latency
+from the USB controllers.
+</p>
+
+<h3 id="hostApplications">Host mode applications</h3>
+
+<p>
+Typical USB host mode audio applications include:
+</p>
+
+<ul>
+<li>music listening</li>
+<li>telephony</li>
+<li>instant messaging and voice chat</li>
+<li>recording</li>
+</ul>
+
+<p>
+For all of these applications, Android detects a compatible USB digital
+audio peripheral, and automatically routes audio playback and capture
+appropriately, based on the audio policy rules.
+Stereo content is played on the first two channels of the peripheral.
+</p>
+
+<p>
+There are no APIs specific to USB digital audio.
+For advanced usage, the automatic routing may interfere with applications
+that are USB-aware. For such applications, disable automatic routing
+via the corresponding control in the Media section of
+<a href="http://developer.android.com/tools/index.html">Settings / Developer Options</a>.
+</p>
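+
+<p>
+As an illustration of a USB-aware application, the sketch below (assuming
+API level 23 or higher) uses the general-purpose <code>setPreferredDevice()</code>
+routing API, which is not specific to USB, to direct an existing
+<code>AudioTrack</code> to an attached USB peripheral:
+</p>
+
+<pre>
+import android.media.AudioDeviceInfo;
+import android.media.AudioManager;
+import android.media.AudioTrack;
+
+public class UsbRouting {
+    // Attempts to route the given track to the first attached USB audio sink.
+    public static boolean routeToUsb(AudioManager audioManager, AudioTrack track) {
+        for (AudioDeviceInfo device :
+                audioManager.getDevices(AudioManager.GET_DEVICES_OUTPUTS)) {
+            if (device.getType() == AudioDeviceInfo.TYPE_USB_DEVICE) {
+                return track.setPreferredDevice(device);
+            }
+        }
+        return false; // No USB audio sink found; keep default routing
+    }
+}
+</pre>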
+
+<h3 id="hostDebugging">Debugging while in host mode</h3>
+
+<p>
+While in USB host mode, adb debugging over USB is unavailable.
+See section <a href="http://developer.android.com/tools/help/adb.html#wireless">Wireless usage</a>
+of
+<a href="http://developer.android.com/tools/help/adb.html">Android Debug Bridge</a>
+for an alternative.
+</p>
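+
+<p>
+For example, while the Android device is still attached over USB (before
+connecting the audio peripheral), you might switch adb to TCP/IP mode and then
+connect over the network; the port number and address below are placeholders:
+</p>
+
+<pre>
+adb tcpip 5555
+adb connect device_ip_address:5555
+</pre>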
+
+<h2 id="compatibility">Implementing USB audio</h2>
+
+<h3 id="recommendationsPeripheral">Recommendations for audio peripheral vendors</h3>
+
+<p>
+In order to inter-operate with Android devices, audio peripheral vendors should:
+</p>
+
+<ul>
+<li>design for audio class compliance;
+currently Android targets class 1, but it is wise to plan for class 2</li>
+<li>avoid <a href="http://en.wiktionary.org/wiki/quirk">quirks</a></li>
+<li>test for inter-operability with reference and popular Android devices</li>
+<li>clearly document supported features, audio class compliance, power requirements, etc.
+so that consumers can make informed decisions</li>
+</ul>
+
+<h3 id="recommendationsAndroid">Recommendations for Android device OEMs and SoC vendors</h3>
+
+<p>
+In order to support USB digital audio, device OEMs and SoC vendors should:
+</p>
+
+<ul>
+<li>design hardware to support USB host mode</li>
+<li>enable generic USB host support at the framework level
+via the <code>android.hardware.usb.host.xml</code> feature flag</li>
+<li>enable all kernel features needed: USB host mode, USB audio, isochronous transfer mode;
+see <a href="{@docRoot}devices/tech/kernel.html">Android Kernel Configuration</a></li>
+<li>keep up-to-date with recent kernel releases and patches;
+despite the noble goal of class compliance, there are extant audio peripherals
+with <a href="http://en.wiktionary.org/wiki/quirk">quirks</a>,
+and recent kernels have workarounds for such quirks
+</li>
+<li>enable USB audio policy as described below</li>
+<li>add <code>audio.usb.default</code> to <code>PRODUCT_PACKAGES</code> in <code>device.mk</code>
+(see the example after this list)</li>
+<li>test for inter-operability with common USB audio peripherals</li>
+</ul>
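+
+<p>
+For example, a <code>device.mk</code> fragment along these lines could enable the
+USB audio HAL and declare the host feature; the exact permission file path may
+vary by release:
+</p>
+
+<pre>
+PRODUCT_PACKAGES += audio.usb.default
+
+PRODUCT_COPY_FILES += \
+    frameworks/native/data/etc/android.hardware.usb.host.xml:system/etc/permissions/android.hardware.usb.host.xml
+</pre>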
+
+<h3 id="enable">How to enable USB audio policy</h3>
+
+<p>
+To enable USB audio, add an entry to the
+audio policy configuration file. This is typically
+located here:
+</p>
+<pre>device/oem/codename/audio_policy.conf</pre>
+<p>
+The pathname component <code>oem</code> should be replaced by the name
+of the OEM who manufactures the Android device,
+and <code>codename</code> should be replaced by the device code name.
+</p>
+
+<p>
+An example entry is shown here:
+</p>
+
+<pre>
+audio_hw_modules {
+ ...
+ usb {
+ outputs {
+ usb_accessory {
+ sampling_rates 44100
+ channel_masks AUDIO_CHANNEL_OUT_STEREO
+ formats AUDIO_FORMAT_PCM_16_BIT
+ devices AUDIO_DEVICE_OUT_USB_ACCESSORY
+ }
+ usb_device {
+ sampling_rates dynamic
+ channel_masks dynamic
+ formats dynamic
+ devices AUDIO_DEVICE_OUT_USB_DEVICE
+ }
+ }
+ inputs {
+ usb_device {
+ sampling_rates dynamic
+ channel_masks AUDIO_CHANNEL_IN_STEREO
+ formats AUDIO_FORMAT_PCM_16_BIT
+ devices AUDIO_DEVICE_IN_USB_DEVICE
+ }
+ }
+ }
+ ...
+}
+</pre>
+
+<h3 id="sourceCode">Source code</h3>
+
+<p>
+The audio Hardware Abstraction Layer (HAL)
+implementation for USB audio is located here:
+</p>
+<pre>hardware/libhardware/modules/usbaudio/</pre>
+<p>
+The USB audio HAL relies heavily on
+<i>tinyalsa</i>, described at <a href="terminology.html">Audio Terminology</a>.
+Though USB audio relies on isochronous transfers,
+this is abstracted away by the ALSA implementation,
+so the USB audio HAL and tinyalsa do not need to concern
+themselves with this part of the USB protocol.
+</p>
diff --git a/en/devices/audio/warmup.html b/en/devices/audio/warmup.html
new file mode 100644
index 00000000..1dec8342
--- /dev/null
+++ b/en/devices/audio/warmup.html
@@ -0,0 +1,114 @@
+page.title=Audio Warmup
+@jd:body
+
+<!--
+ Copyright 2013 The Android Open Source Project
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
+-->
+<div id="qv-wrapper">
+ <div id="qv">
+ <h2>In this document</h2>
+ <ol id="auto-toc">
+ </ol>
+ </div>
+</div>
+
+<p>Audio warmup is the time it takes for the audio amplifier circuit in your device to
+be fully powered and reach its normal operation state. The major contributors
+to audio warmup time are power management and any "de-pop" logic to stabilize
+the circuit.
+</p>
+
+<p>This document describes how to measure audio warmup time and possible ways to decrease
+warmup time.</p>
+
+<h2 id="measuringOutput">Measuring Output Warmup</h2>
+
+<p>
+ AudioFlinger's FastMixer thread automatically measures output warmup
+ and reports it as part of the output of the <code>dumpsys media.audio_flinger</code> command.
+ At warmup, FastMixer calls the Hardware Abstraction Layer (HAL) <code>write()</code>
+ repeatedly until the time between two <code>write()</code>s is the amount expected;
+ that is, it determines warmup by measuring how long <code>write()</code>
+ takes to stabilize.
+</p>
+
+<p>To measure audio warmup, follow these steps for both the built-in speaker and wired headphones,
+ and at different times after booting. Warmup times are usually different for each output device
+ and may also differ immediately after booting the device:</p>
+
+<ol>
+ <li>Ensure that FastMixer is enabled.</li>
+ <li>Enable touch sounds by selecting <b>Settings > Sound > Touch sounds</b> on the device.</li>
+ <li>Ensure that audio has been off for at least three seconds. Five seconds or more is better, because
+ the hardware itself might have its own power logic beyond the three seconds that AudioFlinger has.</li>
+ <li>Press Home, and you should hear a click sound.</li>
+ <li>Run the following command to receive the measured warmup:
+ <br /><code>adb shell dumpsys media.audio_flinger | grep measuredWarmup</code>
+
+<p>
+You should see output like this:
+</p>
+
+<pre>
+sampleRate=44100 frameCount=256 measuredWarmup=X ms, warmupCycles=X
+</pre>
+
+<p>
+ Here <code>measuredWarmup=X</code> means the first set of HAL <code>write()</code>s
+ took X milliseconds to complete.
+</p>
+
+<p>
+ <code>warmupCycles=X</code> reports how many HAL <code>write()</code> requests were
+ needed before the execution time of <code>write()</code> matched what is expected.
+</p>
+</li>
+<li>
+ Take five measurements and record them all, as well as the mean.
+ If they are not all approximately the same,
+ then it's likely that a measurement is incorrect.
+ For example, if you don't wait long enough after the audio has been off,
+ you will see a lower warmup time than the mean value.
+</li>
+</ol>
+
+
+<h2 id="measuringInput">Measuring Input Warmup</h2>
+
+<p>
+ There are currently no tools provided for measuring audio input warmup.
+ However, input warmup time can be estimated by observing
+ the time required for <a href="http://developer.android.com/reference/android/media/AudioRecord.html#startRecording()">startRecording()</a>
+ to return.
+</p>
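+
+<p>
+ For example, an application might time the call as in the following sketch
+ (the parameters are illustrative, and the <code>RECORD_AUDIO</code> permission is required):
+</p>
+
+<pre>
+import android.media.AudioFormat;
+import android.media.AudioRecord;
+import android.media.MediaRecorder;
+import android.os.SystemClock;
+import android.util.Log;
+
+public class InputWarmupEstimate {
+    public static void estimate() {
+        int sampleRate = 48000;
+        int minBuffer = AudioRecord.getMinBufferSize(sampleRate,
+                AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
+        AudioRecord record = new AudioRecord(MediaRecorder.AudioSource.MIC,
+                sampleRate, AudioFormat.CHANNEL_IN_MONO,
+                AudioFormat.ENCODING_PCM_16BIT, minBuffer);
+
+        long start = SystemClock.elapsedRealtime();
+        record.startRecording();
+        long elapsed = SystemClock.elapsedRealtime() - start;
+        Log.i("InputWarmup", "startRecording() took " + elapsed + " ms");
+
+        record.stop();
+        record.release();
+    }
+}
+</pre>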
+
+
+<h2 id="reducing">Reducing Warmup Time</h2>
+
+<p>
+ Warmup time can usually be reduced by a combination of:
+</p>
+ <ul>
+ <li>Good circuit design</li>
+ <li>Accurate time delays in the kernel device driver</li>
+ <li>Performing independent warmup operations concurrently rather than sequentially</li>
+ <li>Leaving circuits powered on or not reconfiguring clocks (at the cost of increased idle power consumption)</li>
+ <li>Caching computed parameters</li>
+ </ul>
+<p>
+ However, beware of excessive optimization. You may find that you
+ need to trade off low warmup time against
+ audible popping at power transitions.
+</p>