AR applications on Android have historically struggled with depth sensing, that is, telling foreground from background in the physical world. Whenever you added an AR object, it would simply sit on top of the entire scene in your viewfinder, regardless of whether something should realistically block the view. After an extensive preview phase that started last year, Google is now launching its new Depth API for ARCore to all developers on Android and Unity.

The new API helps ARCore distinguish between real-world foreground and background, so digital objects are properly occluded, and it also improves their pathfinding and physics.

Developers can integrate the technology into their projects starting today, and you should see the change in some of Google's own products as well. The API uses a depth-from-motion algorithm, similar to the one behind Google Camera's bokeh Portrait Mode, to create a depth map. It does this by capturing multiple images from different angles while you move your phone, which lets the system estimate the distance to every pixel you see on the screen. At the moment, the API relies on just a single camera for this.
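
To give an idea of what integration looks like on the Android side, here is a minimal Kotlin sketch of turning on depth processing for an ARCore session. The isDepthModeSupported() check and Config.DepthMode.AUTOMATIC are part of the ARCore 1.18 SDK; the wrapper function itself is just illustrative.

```kotlin
import com.google.ar.core.Config
import com.google.ar.core.Session

// Enable depth processing on an existing ARCore session, but only on
// devices that actually support the Depth API.
fun enableDepthIfSupported(session: Session) {
    val config = session.config
    if (session.isDepthModeSupported(Config.DepthMode.AUTOMATIC)) {
        config.depthMode = Config.DepthMode.AUTOMATIC
    }
    // Reconfiguring is harmless on devices without depth support.
    session.configure(config)
}
```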

This is how the mapping process works in the background.

Thanks to this depth information, digital objects can be fully or partially hidden behind real-world surfaces. The first Google product to be equipped with the API is Scene Viewer, part of Google Search, which lets you view all kinds of animals and more right in front of your camera. Just search for "cat" on your ARCore-enabled device, for example.
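
For developers, the depth map arrives as a standard Android DEPTH16 image, with one 16-bit distance sample per pixel. As a rough sketch, assuming an app already has the current ARCore Frame, reading the estimated distance at a point could look like the following. The acquireDepthImage() call and NotYetAvailableException are real ARCore 1.18 API; the helper function itself is hypothetical.

```kotlin
import android.media.Image
import com.google.ar.core.Frame
import com.google.ar.core.exceptions.NotYetAvailableException
import java.nio.ByteOrder

// Returns the estimated distance in millimeters at depth-image
// coordinates (x, y), or null if depth data isn't available yet.
fun depthMillimetersAt(frame: Frame, x: Int, y: Int): Int? {
    val depthImage: Image = try {
        frame.acquireDepthImage() // DEPTH16: one 16-bit sample per pixel
    } catch (e: NotYetAvailableException) {
        return null // depth needs a few frames of camera motion first
    }
    return depthImage.use { image ->
        val plane = image.planes[0]
        val buffer = plane.buffer.order(ByteOrder.nativeOrder())
        val byteIndex = x * plane.pixelStride + y * plane.rowStride
        buffer.getShort(byteIndex).toInt() and 0xFFFF
    }
}
```

Note that the depth image is lower resolution than the camera feed, so screen coordinates have to be scaled to the depth image's dimensions before sampling.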

The Depth API will help make AR objects feel more immersive.

The depth information can also be used for improved pathfinding (so digital characters stop running through your furniture), proper surface interactions (so you can paint on more complex objects than just walls), and better physics (when you throw a digital ball, it will be obstructed by real-world objects). With more and more cameras sprouting on the backs of phones, Google also teases that the API will rely on additional depth and time-of-flight (ToF) sensors in the future to improve and speed up the mapping process: "We’ve only begun to scratch the surface of what’s possible."
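
In a real renderer, the occlusion comparison happens per fragment on the GPU, but the underlying idea is a simple per-pixel test: if a real surface is closer to the camera than the virtual object, the object should be hidden (or, for physics, collide). Here is a conceptual CPU-side sketch, reusing the hypothetical depthMillimetersAt() helper from above:

```kotlin
import com.google.ar.core.Frame

// Conceptual occlusion/physics test: should a virtual object at the
// given depth-image coordinates be hidden (or bounce) because a
// real-world surface sits closer to the camera?
fun isOccluded(frame: Frame, x: Int, y: Int, virtualDistanceMeters: Float): Boolean {
    // Without depth data, fall back to the old draw-on-top behavior.
    val realDepthMm = depthMillimetersAt(frame, x, y) ?: return false
    return realDepthMm / 1000f < virtualDistanceMeters
}
```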

Google demos of possible applications of the API.

Apart from Google Search, the ARCore Depth Lab (APK Mirror), and a domino app specifically meant to highlight the new API, the first product to receive an update that takes advantage of occlusion is Houzz, an application that lets you outfit your home with furniture in AR. There's also the TeamViewer Pilot app, which helps you draw in AR to remotely assist people who aren't computer-savvy. Five Nights at Freddy's is the first game to take advantage of the API, letting some characters hide behind real-world objects for extra jump scares. Additionally, Snapchat has updated its Dancing Hotdog and Undersea World lenses to use occlusion.

Left: Five Nights at Freddy's. Middle: TeamViewer Pilot. Right: Dancing Hotdog Snapchat filter.

Samsung will also release a new version of its Quick Measure app to take advantage of the new depth capabilities, making it faster and more accurate.

Starting today, the API is available through ARCore 1.18 on a selection of supported devices, which includes most recent flagships and even some mid-range phones. To take advantage of the new features, grab the corresponding update to Play Services for AR from the Play Store or APK Mirror. Interested developers can head to the ARCore website for more information, where they'll also find updated SDKs.

UPDATE: 2020/06/25 6:02am PDT BY MANUEL VONAU

Updated with the official launch of Depth API. Previously, this article covered the feature's preview.

Updated to clarify that only a selection of devices supports the Depth API — not all that work with ARCore are on board.

Source: Google (1), (2)