Google releases Instant Motion Tracking, allowing users to ‘pin’ AR objects to real-world moving objects in real time

In Augmented Reality

September 1, 2020 – Google has announced the release of its ‘Instant Motion Tracking’ solution in MediaPipe, which is built upon the MediaPipe Box Tracking solution it released previously. With Instant Motion Tracking, users can now place virtual 2D and 3D content on static or moving surfaces, allowing that content to interact seamlessly with the real world. Google is releasing an open source Android application to showcase the solution’s capabilities, along with a library of content.

For those familiar with Instagram and its ‘Pin’ functionality, Google has essentially released its own pinning feature, but for AR content – so users can augment the real world around them in real time, rather than waiting for a pre-recorded video to be analyzed and objects pinned after the video has been shot. Users can now pin an AR object or GIF animation to a moving real-life object or a fixed position, and as they move around their chosen anchor point, the selected augmented reality animation will respond to their movements accordingly.

In Google’s own words, “the Instant Motion Tracking solution provides the capability to seamlessly place virtual content on static or motion surfaces in the real world.” To achieve that, the solution provides six degrees of freedom (6DoF) tracking with relative scale in the form of rotation and translation matrices. This tracking information is then used in the rendering system to overlay virtual content on camera streams to create immersive AR experiences.
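To make the rendering step concrete, here is a minimal sketch of how a renderer might use such rotation and translation matrices to overlay a virtual point onto the camera stream. This is an illustrative pinhole-camera projection, not MediaPipe’s actual code; the function name and the intrinsics values (`focal_px`, `center_px`) are hypothetical.

```python
import numpy as np

def project_point(point_3d, rotation, translation, focal_px, center_px):
    """Project a 3D point of a virtual object into the camera image,
    using the tracked rotation/translation and a pinhole camera model."""
    p_cam = rotation @ point_3d + translation  # anchor frame -> camera frame
    u = focal_px * p_cam[0] / p_cam[2] + center_px[0]
    v = focal_px * p_cam[1] / p_cam[2] + center_px[1]
    return np.array([u, v])

# Example: an anchor point 2 m straight ahead lands at the image center.
pixel = project_point(np.array([0.0, 0.0, 0.0]),
                      np.eye(3),                     # identity rotation
                      np.array([0.0, 0.0, 2.0]),     # 2 m in front of camera
                      focal_px=500.0, center_px=(320.0, 240.0))
```

As the tracked pose updates each frame, re-running this projection keeps the virtual content glued to its anchor in the camera view.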

According to Google, the core concept behind Instant Motion Tracking is to decouple the camera’s translation and rotation estimation, treating them instead as independent optimization problems. This approach enables AR tracking across devices and platforms without initialization or calibration. The solution first finds the 3D camera translation using only visual signals from the camera, which involves estimating the target region’s apparent 2D translation and relative scale across frames.
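The 2D translation and relative scale of a tracked region can be estimated in several ways; as a rough sketch (assuming matched keypoints between two frames, and not MediaPipe’s actual tracker), one simple similarity-style estimate compares centroids and spreads:

```python
import numpy as np

def estimate_translation_and_scale(pts_prev, pts_curr):
    """Estimate the apparent 2D translation and relative scale of a
    tracked region between two frames, from matched keypoints.
    pts_prev/pts_curr: (N, 2) arrays of corresponding image points."""
    pts_prev = np.asarray(pts_prev, dtype=float)
    pts_curr = np.asarray(pts_curr, dtype=float)
    c_prev = pts_prev.mean(axis=0)
    c_curr = pts_curr.mean(axis=0)
    translation = c_curr - c_prev  # centroid shift in pixels
    # Relative scale: ratio of mean point spread around the centroid.
    spread_prev = np.linalg.norm(pts_prev - c_prev, axis=1).mean()
    spread_curr = np.linalg.norm(pts_curr - c_curr, axis=1).mean()
    scale = spread_curr / spread_prev
    return translation, scale
```

A region that appears to grow between frames (scale > 1) indicates the camera moved closer to the anchor; the centroid shift indicates lateral motion.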

The device’s 3D rotation, obtained from its built-in IMU (Inertial Measurement Unit) sensor, also plays a key role. By combining this translation and rotation data, the Instant Motion Tracking solution can track a target region with six degrees of freedom at relative scale. This information allows for the placement of virtual content on any system with a camera and IMU functionality, and is calibration-free. For a detailed explanation with all the technical information, you can read more on Google’s original blog post, or you can refer to its paper here.
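Combining the two independent estimates is straightforward once each is available: the IMU supplies the rotation, vision supplies the translation, and together they form a single 6DoF pose. A minimal sketch (the function name is hypothetical, and a real pipeline would also synchronize IMU and camera timestamps):

```python
import numpy as np

def fuse_pose(rotation_imu, translation_visual):
    """Combine the 3D rotation from the IMU with the visually estimated
    3D translation into one 4x4 homogeneous pose matrix, the six-degrees-
    of-freedom transform a rendering system consumes."""
    pose = np.eye(4)
    pose[:3, :3] = rotation_imu        # 3x3 rotation from the IMU
    pose[:3, 3] = translation_visual   # 3-vector translation from vision
    return pose

# Example: level device, anchor 0.5 units ahead and slightly to the left.
pose = fuse_pose(np.eye(3), np.array([-0.1, 0.0, 0.5]))
```

Because each source is estimated independently, neither needs the other for initialization, which is what lets the approach run without a calibration step.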

Demonstration of GIF placement in 3D

Google’s Instant Motion Tracking solution will allow developers and users to bring both 3D stickers and GIF animations into augmented reality experiences. GIFs are rendered on flat 3D billboards placed in the world, introducing fun and immersive experiences with animated content blended into the real environment.
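A “billboard” is simply a flat quad rotated so it always faces the camera. As an illustration of that idea (a standard look-at construction, not MediaPipe’s renderer code; the function name and `up` default are assumptions):

```python
import numpy as np

def billboard_rotation(anchor_pos, camera_pos, up=np.array([0.0, 1.0, 0.0])):
    """Build a 3x3 rotation that orients a flat quad (the GIF 'billboard')
    at anchor_pos so its face points toward the camera at camera_pos."""
    forward = camera_pos - anchor_pos          # quad normal points at camera
    forward = forward / np.linalg.norm(forward)
    right = np.cross(up, forward)              # quad's horizontal axis
    right = right / np.linalg.norm(right)
    true_up = np.cross(forward, right)         # re-orthogonalized vertical
    return np.column_stack([right, true_up, forward])
```

Recomputing this rotation every frame as the user moves keeps the animated GIF facing the viewer while it stays pinned to its anchor in the world.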

MediaPipe Instant Motion Tracking is already helping PixelShift.AI, a startup applying cutting-edge vision technologies to facilitate video content creation, to track virtual characters seamlessly in the viewfinder for a realistic experience. Building upon Instant Motion Tracking’s pose estimation, PixelShift.AI enables VTubers to create mixed reality experiences with web technologies. The product is going to be released to the broader VTuber community later this year, according to Google.

Video credit: Google

About the author

Sam Sprigg

Sam is the Founder and Managing Editor of Auganix. With a background in research and report writing, he covers news articles on both the AR and VR industries. He also has an interest in human augmentation technology as a whole, and does not just limit his learning specifically to the visual experience side of things.