Audio latency is the time delay a signal experiences as it passes through a system. On a mobile device, that translates most directly into the gap between tapping the screen and hearing audio feedback. Low audio latency can be the difference between an immersive gaming experience and an unpleasant, disconnected one. Too long a latency and a device can begin to feel strangely laggy, even if every visual animation is snappy and responsive. It is especially important — essential, even — for recording and composing music, since slow audio feedback can easily throw off even the best artists and destroy their creative process. Low latency is fundamental to a modern operating system, and it has long been terrible on Android.

There are several ways of measuring latency, and one that is both useful and easy to understand is round-trip latency: feed an audio signal into a device and measure how long it takes for that signal to come back out of the output. The Nexus One, for example, had a round-trip latency of as much as 350 milliseconds (ms). For comparison, the average human reaction time to an audio stimulus is around 170 ms. In other words — while unquestionably impractical — a Nexus One made with human ears would be more than twice as responsive as the Nexus One we know today.
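
To make the idea concrete, here's a rough sketch (plain Android Java, assuming API 16 or later and the RECORD_AUDIO permission) of what a bare-bones loopback test might look like: play a short click through the speaker, listen for it on the microphone, and time the gap. Real test tools are far more careful about buffering, test signals, and background noise, so treat this as an illustration of the concept rather than something to benchmark with; the class name and detection threshold are made up for the example.

```java
import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioRecord;
import android.media.AudioTrack;
import android.media.MediaRecorder;

public class RoundTripSketch {

    // Plays a short click and times how long it takes to show up at the mic.
    // Very rough: buffering inside read() blurs the result, but the idea is the
    // same one a real loopback test is built on.
    public static double measureRoundTripMs() {
        final int sampleRate = 48000;

        int recBufSize = AudioRecord.getMinBufferSize(sampleRate,
                AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
        AudioRecord recorder = new AudioRecord(MediaRecorder.AudioSource.MIC,
                sampleRate, AudioFormat.CHANNEL_IN_MONO,
                AudioFormat.ENCODING_PCM_16BIT, recBufSize);

        int playBufSize = AudioTrack.getMinBufferSize(sampleRate,
                AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);
        AudioTrack player = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRate,
                AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT,
                playBufSize, AudioTrack.MODE_STREAM);

        // A 5 ms burst of full-scale samples: the "click" we will listen for.
        short[] click = new short[sampleRate / 200];
        for (int i = 0; i < click.length; i++) click[i] = Short.MAX_VALUE;

        short[] recBuf = new short[256];
        recorder.startRecording();
        player.play();

        long t0 = System.nanoTime();           // moment we hand the click to the output
        player.write(click, 0, click.length);

        // Read from the mic until the click shows up (or give up after ~2 s).
        long deadline = t0 + 2_000_000_000L;
        while (System.nanoTime() < deadline) {
            int read = recorder.read(recBuf, 0, recBuf.length);
            for (int i = 0; i < read; i++) {
                if (Math.abs(recBuf[i]) > 16000) {   // crude threshold detector
                    double elapsedMs = (System.nanoTime() - t0) / 1e6;
                    player.release();
                    recorder.release();
                    return elapsedMs;
                }
            }
        }
        player.release();
        recorder.release();
        return -1;   // click never detected
    }
}
```

The Superpowered app mentioned further down does essentially this, just with a proper test signal and far more rigorous timing.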

What's the big deal?

Besides making the whole OS feel laggier and slower to respond, high audio latency also made it prohibitively expensive for many companies to invest the money and effort required to bring any kind of music creation app to Android. Many developers simply gave up on the idea of bringing their music app to the platform, with some claiming that it was simply "not possible to play music on an Android phone." While that may have been somewhat of an exaggeration, doing it right did unquestionably require extra time to implement plenty of low-level functionality that was simply missing from Android's Software Development Kit — and even after all that work, the end result was still far from ideal and considerably worse than the experience on other platforms.

On the other hand, iOS had all the requirements for a thriving music creation ecosystem. While the success of the iPad was arguably a large contributing factor (since composing music on a tablet is an incomparably better experience than on a small phone screen), it simply would not have been possible for iOS to reach the popularity it has today within the music industry if it hadn't solved the issue of audio latency first.

Superpowered, a company that makes a cross-platform audio SDK, compiled data from both Google Play and the App Store and concluded that, even though music apps accounted for only 3% of app downloads on iOS in Q1 2015, the Music category was the third-highest revenue-generating category in the App Store. On Google Play, the Music category is in neither the top five categories by downloads nor the top five by revenue. Because of this, Superpowered estimate that many millions of dollars that could have been generated by Android's 1 billion users are instead being left for Apple and iOS developers to grab.

Unfortunately, iOS has always been way ahead of Android in terms of audio latency. Even iOS devices as old as the iPhone 4S (and much older) have average latency levels of about 7 ms, one fiftieth of the average latency for Android devices in 2011. That much of a difference easily translates into a comparatively horrible experience on Android, to the point of becoming comically bad. In the video below, you can see a comparison of two drum kit apps running on an iPad and an Android tablet on Ice Cream Sandwich. Listen out for the lag between the tap on the screen and the sound of the snare and cymbals.

It's easy to see how such a long lag in audio feedback can cripple any hopes someone might have in using Android to create music. Whereas it is almost impossible to detect a latency of much less than 10 ms, a delay of several tenths of a second is almost impossible not to hear. The embedded SoundCloud playlist below contains four different tracks with varying delays between a metronome tick and a keyboard note, ranging from no latency at all to latencies of 5.8 ms, 108.8 ms (which was reportedly the best for an Android device in 2011), and 371.5 ms. As hard as I try, I cannot discern any difference between the track with the real-time audio and the one with the 5.8 ms delay.

https://soundcloud.com/musique-tactile/sets/latency-comparison-between-ios

But why exactly does iOS have such an unfair advantage over Android? The reason has to do with something Apple calls its "Core Audio" infrastructure. Basically, Core Audio is a set of frameworks that allows much of the overhead involved in audio processing to be reduced or bypassed entirely. Core Audio has actually been around for much longer than the iPhone: it was initially developed for OS X Panther, way back in 2003. Because of this, Apple was able to take what it already had on its desktop operating system and port it over to the first version of iPhone OS, as it was called back then.

To their credit, Google haven't been slacking off either. Things have improved steadily over the years, and the jump to Lollipop alone slashed latency by more than two thirds on some devices: a Nexus 4 running Android 4.2.2 had an estimated audio latency of about 195 ms, and the upgrade to version 5.1 brought that down to just 58 ms.
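
Getting the full benefit of those improvements also requires apps to cooperate with the platform: since Android 4.2, the framework exposes the device's native sample rate and optimal buffer size, and matching them is one of the things that gets an app onto Android's low-latency "fast path". Here's a minimal sketch of reading those values and converting the buffer size into milliseconds (the property and feature constants are real Android APIs; the surrounding class is just for illustration):

```java
import android.content.Context;
import android.content.pm.PackageManager;
import android.media.AudioManager;

public class LatencyInfo {

    // Reports the device's preferred audio configuration. Matching these values
    // when creating audio streams is what keeps an app on the low-latency path.
    public static void printAudioProperties(Context context) {
        AudioManager am = (AudioManager) context.getSystemService(Context.AUDIO_SERVICE);

        // Native sample rate (e.g. "48000") and optimal buffer size in frames (e.g. "240").
        String sampleRate = am.getProperty(AudioManager.PROPERTY_OUTPUT_SAMPLE_RATE);
        String framesPerBuffer = am.getProperty(AudioManager.PROPERTY_OUTPUT_FRAMES_PER_BUFFER);

        // Whether the device claims to have a low-latency audio path at all.
        boolean lowLatency = context.getPackageManager()
                .hasSystemFeature(PackageManager.FEATURE_AUDIO_LOW_LATENCY);

        if (sampleRate != null && framesPerBuffer != null) {
            double bufferMs = 1000.0 * Integer.parseInt(framesPerBuffer)
                    / Integer.parseInt(sampleRate);
            System.out.println("Native rate: " + sampleRate + " Hz, buffer: "
                    + framesPerBuffer + " frames (" + bufferMs + " ms), low-latency: "
                    + lowLatency);
        }
    }
}
```

A 240-frame buffer at 48 kHz works out to 5 ms per buffer, and a full round trip stacks several such buffers on top of hardware and driver overhead, which is why whole-device figures end up in the tens of milliseconds.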

While this was still not enough to match iOS, the reduced audio latency on Lollipop was enough to convince major developers like edjing to release their popular music creation apps on Android. Even so, latency values in the 50 ms range are still easily detected by the human ear, and there is still work to be done.

The times they are a-changin'

But there's light at the end of the tunnel, and music at the end of your tangled headphone cables. After having achieved promising results with the release of Lollipop, Google has managed to significantly cut down on latency again on Marshmallow. The Nexus 9 saw the largest gains, dropping from 32 ms in Android 5.1.1 down to just 15 ms in 6.0. The Nexus 5X and the Nexus 6P both have very acceptable levels of 18 ms, and if you have a look on Android.com, you'll find a collection of other values for round-trip latency on multiple Nexus devices for several versions of Android.

It's interesting to test out the round-trip latency on your own device, and it's actually very easy to do so using a small, open-source app made by Superpowered. You can use the direct link to download and install the APK file (or compile it from source if you're suspicious) to do your own measurements and compare them to other results. (Note: in our tests, the numbers given by the app were consistently larger than the ones provided by Google. We're not sure why this is, but you should bear that in mind when trying out the app.) Superpowered also go into plenty more detail on their website and break down each step that an audio signal has to go through in a round-trip latency test.

When is 'good' good enough?

The audio latency in Android has already fallen to well within acceptable levels, and if recent history is anything to go by, we'll likely be hitting the target value of 10 ms for professional audio applications by the time Android N (Nutella? Nougat?) rolls out. But what happens after that? Is it still worth investing in continuing to reduce latency down to, say, 5 ms, 1 ms, or even less?

The short answer — unlike almost every other spec companies like to fight over — is no. At least for just about every application you can come up with, the goal is not to reach 0 ms latency: that simply does not exist.

The fact of the matter is that sound (along with everything else we know of in the Universe) travels at a finite speed. Under normal conditions, a sound wave propagates through the air at around 340 meters (roughly 1,100 feet) per second. This means that even for plain old physical instruments, there is an audio delay between the moment the instrument is played and the instant the sound reaches the musician's ear. This isn't due to some flaw in the instrument's design; it's simply a consequence of the laws of physics for a mechanical wave.

For a violin — which is held just a few inches away from the player's ear — this delay translates to around 0.5 ms, and for larger instruments which are played farther away (like a piano, guitar, or drum set), this can easily reach 3 ms or more. This means that for someone like a pianist, playing a digital piano directly into their headphones at zero audio latency would effectively feel like listening to themselves play the piano in the future.
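
If you want to sanity-check those figures, the arithmetic is a one-liner: divide the distance by the speed of sound. A quick sketch (the distances are ballpark estimates, not measurements):

```java
public class AcousticDelay {

    private static final double SPEED_OF_SOUND_M_PER_S = 340.0;

    // Time for sound to travel a given distance through the air, in milliseconds.
    static double delayMs(double distanceMeters) {
        return distanceMeters / SPEED_OF_SOUND_M_PER_S * 1000.0;
    }

    public static void main(String[] args) {
        System.out.println("Violin (~0.17 m from the ear): " + delayMs(0.17) + " ms"); // ~0.5 ms
        System.out.println("Piano  (~1 m from the ear):    " + delayMs(1.0) + " ms");  // ~2.9 ms
    }
}
```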

On top of that, there's a threshold below which humans simply cannot detect latency, and it sits much higher than those physical delays. As a rule, we begin to perceive a sound as separate (or, in a sense, to hear an echo) at latencies of over 20 ms, and we start to "feel" some sort of lag or artifact at around 12 ms. Even for professional musicians, the boundary below which latency becomes completely irrelevant is not much lower.

In a study published by the Audio Engineering Society, researchers attempted to determine the lowest latency detectable by different kinds of musicians. Basically, their goal was to perform a sort of round-trip latency test on humans: a musician would sing or play one of several electric instruments and the sound would be played back into their headphones with different levels of audio delay.

What they found was a set of values below which absolutely no kind of delay or artifact was detected at all. With an 80% confidence level, this value was at least 28 ms for keyboards, whereas for drums, guitars, and bass it was 9 ms, 5 ms, and 5 ms, respectively. Predictably, the lowest value found was for vocals, where singers only began to notice slight artifacts at around 2 ms. (The study also found the threshold for saxophones to be about 1 ms, but since the sample size was small, and since sound takes at least 2 ms to travel from the end of a 25-inch saxophone to the player's ear, the researchers concluded that more data would be needed to obtain an accurate result.)

Using this data, we drew up another chart to compare these values with several Nexus devices running different versions of Android, as well as the iPhone 6, the iPad Air 2, and human reaction times to various kinds of stimuli. The red and green dashed lines represent the typical thresholds for detecting audio lags and for perceiving audio artifacts, respectively. Go ahead and click on the image below to enlarge it.

[Chart (latency2-01): round-trip latency for Nexus devices on various Android versions, the iPhone 6, the iPad Air 2, and human perception thresholds]

While it's clear that OS updates play a large role (perhaps even the most important one), not everything can be attributed to software alone. Devices with older hardware like the 2013 Nexus 7 still have a latency of 55 ms, compared to the 15 ms on the Nexus 9 — and yet both are running Android 6.0. On the other hand, the Note 5 is roughly on par with the Nexus 5, even though the former runs Lollipop and the latter runs Marshmallow.

Conclusion

So what can we conclude from this lengthy analysis? In short, two things. First, if we allow ourselves to extrapolate from recent improvements, Android is likely just about to hit the ideal goal of 10 ms latency, which is the standard for professional audio equipment. Second, unless we start developing headphones for bats, there's no point in trying to push latency below 5 ms, or 2 ms at the very extreme.

Basically, if all goes well, the issue of audio latency will simply not exist on Android by this time next year.

This article was updated to clarify that, for the delay-perception thresholds of the various instruments, the 80% confidence level applies not to the specific millisecond value itself but to the interval from that value upwards.

Source: Android.com, Superpowered

Image credit: Superpowered