A New Sense of Direction: How Niantic Spatial's Location-Aware AI Companion is Evolving
Today at Snap's Lens Fest, our AI companion, Peridot, is evolving. In an experience tailored for the event, it’s moving beyond just understanding its surroundings to actively navigating them alongside you. Attendees can ask Dot about interesting sights and featured experiences, and it will physically guide them there, sharing context along the way through natural, emotionally aware conversational AI powered by Hume AI. With similar demos now live at our major Niantic Spatial offices in San Francisco, Seattle, London, and Tokyo, you can try it yourself.
This demo marks the next waypoint on a journey we began at AWE with Project Jade. Our first step was to give our companion a suite of senses: sight via Spectacles, Snap’s AR glasses; a deep understanding of the world via our Visual Positioning System (VPS); and an AI brain for memory and knowledge. This allowed Dot to hold an intelligent conversation about a single location.
With today’s update, we add the final piece. By giving it the ability to navigate, we’re evolving our spatially-aware companion into one that can embark on a journey with you.
This leap required us to solve one of the most fundamental challenges of AR navigation: teaching a digital companion the invisible rules of a physical space. How do you teach it to navigate a crowd, or to tell a walkable path from a physical object? Our solution is a proprietary internal tool that ingests 2D floorplans and algorithmically generates a comprehensive pathfinding map, which can be further modified by hand. This map teaches our companion every route it can take, ensuring it navigates spaces intelligently without walking through walls or into restricted areas, while our computer vision algorithms provide the real-time semantic understanding needed to avoid obstacles and hazards.
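The core idea of turning a 2D floorplan into a pathfinding map can be sketched in miniature: treat the floorplan as an occupancy grid of walls and walkable floor, then search it with a standard algorithm like A*. The sketch below is purely illustrative (the grid, names, and search are our own simplification, not Niantic's actual tool):

```python
import heapq

# Hypothetical floorplan as an occupancy grid:
# '#' marks walls or restricted areas, '.' marks walkable floor.
FLOORPLAN = [
    "##########",
    "#........#",
    "#.####...#",
    "#....#.#.#",
    "#.##.#.#.#",
    "#....#...#",
    "##########",
]

def walkable(cell):
    """True if the cell lies on walkable floor, never inside a wall."""
    r, c = cell
    return (0 <= r < len(FLOORPLAN)
            and 0 <= c < len(FLOORPLAN[0])
            and FLOORPLAN[r][c] == ".")

def neighbors(cell):
    """Yield the 4-connected walkable cells around a grid cell."""
    r, c = cell
    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        nxt = (r + dr, c + dc)
        if walkable(nxt):
            yield nxt

def astar(start, goal):
    """A* search over the grid with a Manhattan-distance heuristic.

    Returns the shortest list of cells from start to goal, or None
    if no route exists (e.g. the goal is walled off).
    """
    def h(cell):
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    frontier = [(h(start), 0, start, [start])]
    best_cost = {start: 0}
    while frontier:
        _, cost, cell, path = heapq.heappop(frontier)
        if cell == goal:
            return path
        for nxt in neighbors(cell):
            new_cost = cost + 1
            if new_cost < best_cost.get(nxt, float("inf")):
                best_cost[nxt] = new_cost
                heapq.heappush(
                    frontier,
                    (new_cost + h(nxt), new_cost, nxt, path + [nxt]),
                )
    return None

# Route a companion from one corner of the floor to another.
route = astar((1, 1), (5, 8))
```

Because the search only ever expands walkable neighbors, any returned route is guaranteed never to cross a wall cell; a real system would layer dynamic obstacle avoidance (the computer-vision side) on top of this static map.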
But a great companion needs more than just a sense of direction; it needs personality. To make Dot even more delightful and lifelike than in its first iteration, we’ve collaborated with our partners at Hume AI to integrate EVI, their advanced conversational AI, which delivers sub-second, emotionally adaptive dialogue in real time, enabling Dot to respond naturally, like a true companion. This gives Dot a more nuanced and empathetic voice, transforming the experience from simply receiving information into something that feels more like exploring with a curious and trusted friend.
Imagine a future where your companion in a new city isn’t a blue arrow on a screen, but a character who walks alongside you, sharing stories about architecture and pointing out hidden gems you’d otherwise miss. Imagine walking through a busy airport terminal, where your companion draws your attention to a massive hanging art installation, explaining the artist's vision and how it was constructed.
Niantic Spatial is working on a Large Geospatial Model that will allow us to deliver these experiences at scale: more coverage, higher fidelity, lower cost, and greater repeatability. Project Jade and its underlying technology are an exciting example of what’s possible, enabling a new wave of experiences, from advanced AR companions to intelligent robotics, that help us find more connection, more understanding, and more joy in the world around us.
–Asim Ahmed, Head of Product Marketing at Niantic Spatial