June 25, 2020 – Google has announced today that its Depth API is now available in ARCore 1.18 for Android and Unity, including AR Foundation, across hundreds of millions of compatible Android devices.
Since the launch of ARCore, Google’s developer platform for building augmented reality (AR) experiences, the company has been focused on providing APIs that help developers seamlessly blend the digital and physical worlds.
At the end of last year, Google announced a preview of the ARCore Depth API, which uses the company’s depth-from-motion algorithms to generate a depth map with a single RGB camera. Since then, Google has been working with select collaborators to explore how depth can be used across a range of use cases to enhance AR realism. With today’s announcement, users of compatible devices can enjoy AR experiences enhanced by the API’s ability to generate a depth map without specialized hardware (e.g. a LiDAR scanner), unlocking realism-enhancing capabilities such as occlusion.
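Google has not published the internals of its depth-from-motion pipeline, but the underlying principle is motion parallax: as the phone moves, two camera poses separated by a small baseline see the same point at slightly different image coordinates, and that pixel shift (disparity) encodes depth, just as in a stereo pair. A minimal illustrative sketch (not ARCore code; the function name and numbers are made up for the example):

```python
# Illustrative only: ARCore's depth-from-motion pipeline is far more
# sophisticated, but at its core motion parallax works like stereo.
# Two camera positions separated by a baseline b see the same 3D point
# at different image coordinates; the pixel shift (disparity) encodes depth.

def depth_from_disparity(focal_length_px, baseline_m, disparity_px):
    """Classic triangulation: depth = f * b / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px

# Example: a 500 px focal length, 2 cm of camera motion, and 10 px of
# parallax place the point 1 metre from the camera.
depth = depth_from_disparity(500.0, 0.02, 10.0)
print(depth)  # 1.0
```

Points close to the camera produce large disparities and are measured most reliably; distant points produce tiny disparities, which is why single-camera depth maps are most accurate at short range.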
As was highlighted last year, a key capability of the Depth API is occlusion: the ability for digital objects to accurately appear behind real world objects. This makes objects feel as if they’re actually in a space, creating a more realistic AR experience.
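The occlusion test itself is conceptually simple: for each pixel, the virtual fragment is drawn only if it is closer to the camera than the real-world depth reported for that pixel; otherwise the camera image shows through. A minimal per-pixel sketch (names are illustrative, not the ARCore API):

```python
# Minimal sketch of depth-based occlusion: a virtual pixel is kept only
# where it is nearer to the camera than the real-world depth map value.

def composite(virtual_depth, real_depth, virtual_color, camera_color):
    """Per-pixel depth test; None means no virtual content at that pixel."""
    out = []
    for vd, rd, vc, cc in zip(virtual_depth, real_depth, virtual_color, camera_color):
        if vd is not None and vd < rd:
            out.append(vc)   # virtual object in front of the real surface
        else:
            out.append(cc)   # real world occludes, or no virtual content
    return out

# A virtual cube at 2 m, partially hidden behind a real wall at 1.5 m:
virtual_depth = [2.0, 2.0, None]
real_depth    = [1.5, 3.0, 2.5]
result = composite(virtual_depth, real_depth, ["cube"] * 3, ["camera"] * 3)
print(result)  # ['camera', 'cube', 'camera']
```

In a real renderer this comparison happens in the fragment shader against the Depth API's depth texture, but the decision per pixel is the same.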
As a result, the release of the ARCore Depth API will help unlock more ways to increase realism and will enable new interaction types, according to Google. The ARCore Depth Lab spurred further ideas on how depth can be used, including realistic physics, surface interactions, environmental traversal, and more. Developers can now explore these kinds of depth-based AR experiences and build on them through the open-sourced GitHub project.
The designers and engineers at Snap Inc. have integrated several of these ideas into a set of Snapchat Lenses including a Dancing Hotdog and a new Android exclusive Undersea World Lens. Snapchat Lens Creators can now download an ARCore Depth API template to create depth-based experiences for compatible Android devices.
Sam Hare, Research Engineering Manager at Snap Inc., commented: “We’re beginning to understand what kinds of depth capabilities are exciting for developers to build with. This single integration point streamlines and simplifies the development process and enables Lens Studio developers to easily take advantage of advanced depth capabilities.”
Another example that combines occlusion with other depth capabilities is the Lines of Play app, an Android experiment from the Google Creative Lab. The app lets users create domino art in AR, and uses depth information to showcase both occlusion and collisions.
In addition to gaming and self-expression, depth can also be used to unlock new utility use cases – one example being the TeamViewer Pilot app, a remote assistance solution that enables AR annotations on video calls. The app uses depth to better understand the environment so that experts around the world can more precisely apply real time 3D AR annotations for remote support and maintenance.
Google has stated that users will be able to try more depth-enabled AR experiences later this year, such as SKATRIX by Reality Crisis and SPLASHAAR by ForwARdgames, which use surface interactions and environmental traversal to make rich use of the user’s surroundings.
While depth sensors, such as time-of-flight (ToF) sensors, are not required for the Depth API to work, having them will further improve the quality of experiences. Commenting on the future that the Depth API and ToF unlocks, Dr. Soo Wan Kim, Camera Technical Product Manager at Samsung, said: “Depth will enrich user’s AR experience in many perspectives. It will reduce scanning time, and can detect planes fast, even low textured planes. These will bring seamless experiences to users who will be able to use AR apps more easily and frequently.” In the coming months, Samsung will update its Quick Measure app to use the ARCore Depth API on the Galaxy Note10+ and Galaxy S20 Ultra, according to Google.
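The reason a ToF sensor improves quality is that it measures depth directly rather than inferring it from parallax: it emits light and times the round trip to each surface, so depth is half the path length travelled. A sketch of that principle (illustrative values only):

```python
# Illustrative sketch of the time-of-flight principle: a ToF sensor emits
# light and times its round trip, giving depth directly per pixel, unlike
# depth-from-motion, which infers depth from parallax as the camera moves.

SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def tof_depth(round_trip_seconds):
    """Light travels to the surface and back, so depth is half the path."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A round trip of about 6.67 nanoseconds corresponds to roughly 1 metre.
print(round(tof_depth(6.671e-9), 3))  # 1.0
```

Because the measurement does not depend on camera motion or scene texture, ToF works on the low-textured planes Dr. Kim mentions, where parallax-based methods struggle.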
The ARCore Depth API will be rolling out today, with the SDK and updates to Google’s developer site coming later.
Image / video credit: Google