What’s the story?
Niantic Spatial’s Peridot gains AI-powered navigation and emotional conversation features debuting at Snap’s Lens Fest.
Why it matters
The update pushes AR and AI companions beyond environmental awareness, enabling real-world navigation and more natural interaction with users.
The bigger picture
Peridot’s evolution reflects Niantic’s broader goal of creating intelligent, spatially aware AI for everyday environments.
In Augmented Reality News
October 16, 2025 – Niantic Spatial, a provider of spatial computing and augmented reality (AR) solutions, has announced an update to its AI companion, Peridot, as part of an experience tailored to Snap’s Lens Fest. The company introduced new navigation capabilities that enable the AI companion to guide users through physical spaces, expanding its functionality beyond environmental awareness.
The latest demo from the company allows attendees at Lens Fest to interact with Peridot, asking for directions to points of interest and featured experiences. The AI companion then navigates the venue alongside users, offering contextual information through natural, emotionally responsive dialogue powered by Hume AI. Niantic Spatial added that similar demos are now also available at its offices in San Francisco, Seattle, London, and Tokyo.
The development builds on Niantic Spatial’s earlier work with Project Jade, which equipped Peridot with visual perception via Snap’s Spectacles, spatial understanding of the world through the company’s Visual Positioning System (VPS), and conversational AI for location-based interactions. The latest update enables Peridot companions to move with users, addressing the challenge of digital navigation in real-world environments.
To facilitate intelligent movement, Niantic Spatial developed an internal tool that converts 2D floor plans into pathfinding maps. The system allows the AI companion to identify viable routes, avoid obstacles, and respect physical boundaries. Real-time computer vision algorithms further enhance the companion’s ability to navigate dynamic spaces safely, according to the company.
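Niantic Spatial has not published the internals of this tool, but the general approach it describes, rasterizing a floor plan into a grid of walkable and blocked cells and then searching that grid for a viable route, can be sketched as follows. The grid encoding, A* search, and example floor plan below are illustrative assumptions, not the company’s implementation.

```python
# Minimal sketch (not Niantic Spatial's actual tool): treat a rasterized 2D
# floor plan as an occupancy grid and run A* over walkable cells.
import heapq

def astar(grid, start, goal):
    """Find a path on a 2D occupancy grid.

    grid: list of strings, '.' = walkable, '#' = wall/obstacle
    start, goal: (row, col) tuples
    Returns a list of (row, col) cells from start to goal, or None.
    """
    rows, cols = len(grid), len(grid[0])

    def neighbors(cell):
        r, c = cell
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == '.':
                yield (nr, nc)

    def heuristic(cell):
        # Manhattan distance: admissible on a 4-connected grid
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    open_set = [(heuristic(start), 0, start)]
    came_from = {start: None}
    g_cost = {start: 0}

    while open_set:
        _, g, current = heapq.heappop(open_set)
        if current == goal:
            # Reconstruct the route by walking back through predecessors
            path = []
            while current is not None:
                path.append(current)
                current = came_from[current]
            return path[::-1]
        for nxt in neighbors(current):
            new_g = g + 1
            if new_g < g_cost.get(nxt, float('inf')):
                g_cost[nxt] = new_g
                came_from[nxt] = current
                heapq.heappush(open_set, (new_g + heuristic(nxt), new_g, nxt))
    return None  # no viable route

# Example: a tiny floor plan with one interior wall
floor_plan = [
    "..........",
    "..####....",
    "..#..#....",
    "..#..#....",
    "..........",
]
print(astar(floor_plan, (0, 0), (4, 9)))
```

In a production system, the static route returned by a search like this would then be adjusted on the fly by the real-time computer vision layer the company mentions, which handles moving obstacles the floor plan cannot capture.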
The update also introduces enhanced conversational abilities through integration with Hume AI’s Empathic Voice Interface (EVI). This enables Peridot to deliver emotionally adaptive, real-time dialogue, making interactions more natural and engaging for users.
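The core idea of emotionally adaptive dialogue is that the companion’s reply is conditioned on emotion estimates derived from the user’s speech, not just on the words themselves. The sketch below is purely illustrative and does not use Hume AI’s actual SDK; the Turn dataclass, the respond function, and the emotion scores are hypothetical names invented for this example.

```python
# Illustrative only: a hypothetical shape for emotion-adaptive dialogue.
# Hume AI's real EVI API differs; the names below are invented for the sketch.
from dataclasses import dataclass

@dataclass
class Turn:
    text: str        # what the user said (transcribed)
    emotions: dict   # e.g. {"joy": 0.7, "frustration": 0.1}

def respond(turn: Turn) -> str:
    """Pick a reply style based on the strongest detected emotion."""
    dominant = max(turn.emotions, key=turn.emotions.get)
    if dominant == "frustration":
        return "Let's take the shortest route. It's just two minutes ahead."
    if dominant == "joy":
        return "Great! While we walk, want to hear about this installation?"
    return "Sure, follow me. I'll point things out along the way."

print(respond(Turn("Where's the Lens Fest stage?", {"joy": 0.7, "curiosity": 0.5})))
```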
In a blog post announcing the update, Asim Ahmed, Head of Product Marketing at Niantic Spatial, stated: “Imagine a future where your companion in a new city isn’t a blue arrow on a screen, but a character who walks alongside you, sharing stories about architecture and pointing out hidden gems you’d otherwise miss. Imagine walking through a busy airport terminal, where your companion draws your attention to a massive hanging art installation, explaining the artist’s vision and how it was constructed.”
“Project Jade and its underlying technology is an exciting example of what’s possible, enabling a new wave of experiences, from advanced AR companions to intelligent robotics, that help us find more connection, more understanding, and more joy in the world around us,” added Ahmed.
While this latest Peridot experience is currently intended for Lens Fest event attendees and visitors to Niantic Spatial’s offices, the company noted that these advancements are part of its broader initiative to develop a Large Geospatial Model, aimed at scaling immersive experiences for a range of applications.
For more information on Niantic Spatial and its spatial computing and AR solutions, please visit the company’s website.
Image credit: Niantic Spatial
About the author
Sam is the Founder and Managing Editor of Auganix. With a background in research and report writing, he has been covering XR industry news for the past seven years.