Apple unveils new APIs in RealityKit 2, allowing developers to create more complex Augmented Reality experiences


In Augmented Reality News 

June 7, 2021 – Apple has today unveiled new tools and technologies designed to help developers create more engaging app experiences and make apps easier to build: its new Xcode Cloud service; improvements to the App Store; updates to its Swift programming language; new APIs and tools in iOS, iPadOS, and macOS for game developers; and finally, new APIs in RealityKit 2 – Apple’s rendering, animation, audio, and physics engine built from the ground up for augmented reality (AR).

Apple’s ARKit augmented reality framework powers over 1 billion AR-enabled devices and allows developers to create AR experiences in conjunction with RealityKit. With today’s announcement, the company stated that RealityKit 2 introduces Object Capture, an API on macOS Monterey that enables developers to create high-quality, photo-realistic 3D models of real-world objects in minutes, transforming photos shot on iPhone, iPad, or DSLR into 3D models optimized for AR.

These models can be viewed in AR Quick Look or added to AR scenes in Reality Composer or Xcode, making it easier to build AR apps, according to the company. Apple stated that developers are using Object Capture to unlock entirely new ways of creating 3D content within some of the leading 3D content creation apps, such as Cinema 4D and Unity MARS.
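Under the hood, Object Capture is exposed through RealityKit’s PhotogrammetrySession API on macOS Monterey, which consumes a folder of overlapping photos and reconstructs a textured model from them. A minimal sketch of that workflow might look like the following (the folder and output paths are hypothetical, and error handling is kept to a minimum):

```swift
import RealityKit

// Hypothetical paths for illustration only.
let inputFolder = URL(fileURLWithPath: "/path/to/photos", isDirectory: true)
let outputURL = URL(fileURLWithPath: "/path/to/model.usdz")

// A PhotogrammetrySession reads every image in the input folder and
// reconstructs geometry and textures from the overlapping views.
var configuration = PhotogrammetrySession.Configuration()
configuration.featureSensitivity = .normal

let session = try PhotogrammetrySession(input: inputFolder,
                                        configuration: configuration)

// Request a single USDZ model at reduced detail, suitable for AR Quick Look.
try session.process(requests: [
    .modelFile(url: outputURL, detail: .reduced)
])

// Outputs arrive asynchronously as reconstruction progresses.
Task {
    for try await output in session.outputs {
        switch output {
        case .processingComplete:
            print("Model written to \(outputURL.path)")
        case .requestError(_, let error):
            print("Reconstruction failed: \(error)")
        default:
            break  // progress updates, input validation, etc.
        }
    }
}
```

The same session can produce several detail levels (from `.preview` up to `.full`) in one pass, which is how the workflow scales from quick AR previews to production assets.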

Apple added that with new APIs in RealityKit 2, developers can also create more realistic and complex AR experiences with greater visual, audio, and animation control, including custom render passes and dynamic shaders.

Susan Prescott, Apple’s vice president of Worldwide Developer Relations, commented: “This is a massive step forward for 3D content creation. What used to be the most difficult and expensive part of building AR experiences and 3D scenes is now available to all developers in macOS Monterey.”

In addition to Apple’s newly announced APIs for augmented reality creation, the company’s Xcode Cloud brings together the multiple tasks and tools required to build, test, and deliver apps using cloud services. With In-App Events and Custom Product Pages, the App Store now provides new ways for developers to promote their apps and connect with users. Finally, the company has added concurrency support to its Swift programming language. 
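Swift’s new concurrency support centers on async/await and structured concurrency, which let asynchronous work read like sequential code. A small, self-contained sketch (the `fetchScore` function and its scoring rule are hypothetical, standing in for any slow operation such as a network call):

```swift
import Foundation

// A hypothetical asynchronous function: suspends briefly to simulate
// latency, then returns a made-up score for the player.
func fetchScore(for player: String) async -> Int {
    try? await Task.sleep(nanoseconds: 100_000_000)
    return player.count * 10
}

// `async let` starts both fetches concurrently; `await` gathers the
// results once each child task finishes.
func totalScore() async -> Int {
    async let first = fetchScore(for: "Ada")
    async let second = fetchScore(for: "Grace")
    return await first + second
}
```

Because the two `async let` bindings run concurrently, `totalScore()` takes roughly as long as one fetch rather than two, without any callback nesting.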

“We’re thrilled to provide our developer community with powerful new tools and technologies to help create even more compelling and higher-quality apps, while engaging with their users in all new ways through the App Store,” added Prescott. “With the robust set of tools included in Xcode Cloud, continuing innovation in the Swift programming language, a wide range of new APIs, and even more ways to reach users — Apple’s platforms have never been stronger.”

For a full breakdown of today’s announcements, visit the Apple newsroom release.

Video credit: Apple

About the author

Sam Sprigg

Sam is the Founder and Managing Editor of Auganix. With a background in research and report writing, he covers news from both the AR and VR industries. He is also interested in human augmentation technology as a whole, and does not limit his coverage to the visual side of the field.