Google announces updates to ARCore’s Augmented Faces and Cloud Anchors APIs

In Augmented Reality News

September 12, 2019 – Google has today announced new updates to ARCore’s Augmented Faces and Cloud Anchors APIs, to help enable more shared cross-platform augmented reality experiences. The company launched its ARCore developer platform for creating AR experiences two years ago, and since then, developers have created thousands of AR apps across Android and iOS.

Augmented Faces on iOS
Earlier this year, Google announced its Augmented Faces API, which provides a high-quality, 468-point 3D face mesh that lets apps attach fun effects to a user's face, all without requiring a depth sensor on the smartphone. With the addition of iOS support rolling out today, developers can now create face effects for more than a billion users. Google has also made the creation process easier for both iOS and Android developers with a new face effects template.
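To make the mesh idea concrete, here is a minimal sketch of how an app might pin an effect to one of the mesh's vertices. This is plain Python with invented names, not the ARCore SDK; the mock simply models a 468-vertex mesh as the article describes it.

```python
# Illustrative mock only, not the real Augmented Faces API: it just shows
# the idea of a 468-point face mesh that effects can be attached to.

MESH_VERTEX_COUNT = 468  # the API provides a 468-point 3D mesh


class MockFaceMesh:
    def __init__(self):
        # Placeholder vertices; the real mesh tracks the face in 3D.
        self.vertices = [(0.0, 0.0, 0.0)] * MESH_VERTEX_COUNT
        self.effects = {}  # vertex index -> effect name

    def attach_effect(self, vertex_index, effect):
        """Pin a named effect to one vertex of the mesh."""
        if not 0 <= vertex_index < MESH_VERTEX_COUNT:
            raise IndexError("vertex index outside the 468-point mesh")
        self.effects[vertex_index] = effect


face = MockFaceMesh()
face.attach_effect(1, "sparkle")  # vertex index 1 is arbitrary here
print(len(face.vertices), face.effects)
```

In the real SDK the mesh vertices update every frame as the face moves; the point here is only that effects are addressed to fixed mesh positions rather than raw camera pixels.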

Improvements to Cloud Anchors
Last year, Google introduced the Cloud Anchors API, which enables developers to create shared AR experiences across Android and iOS. Cloud Anchors let devices create a 3D feature map from visual data onto which anchors can be placed. The anchors are hosted in the cloud so multiple people can use them to enable shared real-world experiences. Cloud Anchors power a wide variety of cross-platform apps.

In this latest ARCore update, Google has made improvements to the Cloud Anchors API that make hosting and resolving anchors more efficient and robust. This is due to improved anchor creation and visual processing in the cloud. Now, when creating an anchor, more angles across larger areas in the scene can be captured for an improved 3D feature map. Once the map is created, the visual data used to create the map is deleted and only anchor IDs are shared with other devices to be resolved. Moreover, multiple anchors in the scene can now be resolved simultaneously, reducing the time needed to start a shared AR experience.
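The flow described above can be sketched with a small in-memory mock: host an anchor from captured visual data, discard the raw data once the feature map is built, share only an opaque anchor ID, and let peers resolve several anchors in one call. This is plain Python with invented class and method names, not the ARCore API.

```python
from dataclasses import dataclass, field


@dataclass
class MockCloudAnchorService:
    """Illustrative stand-in for a Cloud Anchors backend (not the real API).

    Models the data flow the article describes: host -> anchor ID -> resolve.
    """
    _maps: dict = field(default_factory=dict)  # anchor_id -> feature map
    _next_id: int = 0

    def host_anchor(self, visual_frames):
        """Build a feature map from visual data, then discard the raw frames.

        Only an opaque anchor ID is returned for sharing with other devices.
        """
        feature_map = {"features": len(visual_frames)}  # stand-in for a 3D map
        anchor_id = f"anchor-{self._next_id}"
        self._next_id += 1
        self._maps[anchor_id] = feature_map
        visual_frames.clear()  # visual data is deleted once the map exists
        return anchor_id

    def resolve_anchors(self, anchor_ids):
        """Resolve multiple anchors at once, as the update now allows."""
        return {aid: self._maps[aid] for aid in anchor_ids if aid in self._maps}


# A host device captures several angles of the scene, then hosts two anchors.
service = MockCloudAnchorService()
frames_a = ["frame-1", "frame-2", "frame-3"]
id_a = service.host_anchor(frames_a)
id_b = service.host_anchor(["frame-1", "frame-2"])

print(frames_a)  # [] -- raw visual data cleared after hosting
resolved = service.resolve_anchors([id_a, id_b])
print(sorted(resolved))  # ['anchor-0', 'anchor-1']
```

The design point mirrored here is the privacy/efficiency trade the article notes: the cloud keeps only the derived feature map, and joining devices need nothing but anchor IDs.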

These updates to Cloud Anchors are available for developers today.

Persistent Cloud Anchors
Looking to the future, Google has stated that it is taking steps to expand the scale and timeline of shared AR experiences with persistent Cloud Anchors, effectively enabling a ‘save button’ for AR. This means that digital information overlaid on top of the real world can be experienced at any time. For example, users could redesign a home throughout the year, leave AR notes for friends around an amusement park, or hide AR objects at specific places around the world for others to discover.

Persistent Cloud Anchors are powering Mark AR, a social app being developed by Sybo and iDreamSky that lets people create, discover, and share their AR art with friends and followers in real-world locations. With persistent Cloud Anchors, users can continually return to their pieces as they create and collaborate over time.

In a company blog post, Google stated that “Reliably anchoring AR content for every use case—regardless of surface, distance, and time—pushes the limits of computation and computer vision because the real world is diverse and always changing. By enabling a ‘save button’ for AR, we’re taking an important step toward bridging the digital and physical worlds to expand the ways AR can be useful in our day-to-day lives.”

Google is looking for more developers to help explore and test persistent Cloud Anchors in real world apps at scale, before it makes the feature broadly available. Interested developers can apply here.

Video credit: Google/YouTube

About the author

Sam Sprigg

Sam is the Founder and Managing Editor of Auganix. With a background in research and report writing, he covers news articles on both the AR and VR industries. He also has an interest in human augmentation technology as a whole, and does not limit his interest solely to the visual experience side of the field.