When a robot becomes a teammate: a Niantic Spatial Super Bowl story
The 2026 Super Bowl, hosted in San Francisco, gave us a two-week window to run an experiment. We’ve spent the past six years rendering the whimsical world of Peridot over a live view of the real world. Could we take those lessons and apply them to the rapidly emerging space of physical AI? The Bay Area Host Committee gave us the opportunity to find out.
To celebrate the region’s “unmatched spirit of innovation,” the Bay Area Host Committee hosted the BAHC Super Bowl LX Innovation Summit presented by YouTube, and they reached out to Niantic Spatial to bring an interactive experience to the event. The team that brought us Peridot Mobile, Hello Dot, Peridot Beyond, and Project Jade rallied around the concept but initially scoffed at the two-week turnaround.
Could we build a football-themed interactive experience in two weeks?
The team put their heads together and looked at our prototyping buddy, Tron. What if we tracked Tron like we do Peridot on Project Jade? Tron has a camera, so we could use the Niantic Spatial Visual Positioning System to determine the robot’s location. What if we used the multiplayer technology we built for Project Jade, but instead of people wearing Snap Spectacles, we networked people’s phones…hmm.
So, we went to the Innovation Lab, used Scaniverse to capture the space, and prototyped localizing moving players relative to a moving robot—and it worked!
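For readers curious about the core idea: once the robot and each player's phone localize against the same VPS map, their poses live in one shared coordinate frame, and expressing a player's pose relative to the moving robot is just a composition of transforms. Here is a minimal 2D sketch of that math; the `Pose` type and function are illustrative, not the Niantic Spatial SDK:

```python
import math
from dataclasses import dataclass

@dataclass
class Pose:
    """A 2D pose in the shared VPS map frame: position (meters) plus heading (radians)."""
    x: float
    y: float
    theta: float

def relative_pose(robot: Pose, player: Pose) -> Pose:
    """Express the player's pose in the robot's local frame.

    Both poses come from localizing against the same VPS map, so the
    relative transform is inv(T_map_robot) composed with T_map_player.
    """
    dx = player.x - robot.x
    dy = player.y - robot.y
    cos_t = math.cos(-robot.theta)
    sin_t = math.sin(-robot.theta)
    return Pose(
        x=cos_t * dx - sin_t * dy,
        y=sin_t * dx + cos_t * dy,
        theta=player.theta - robot.theta,
    )

# Robot facing +y in the map frame; player standing 2 m directly ahead of it.
robot = Pose(0.0, 0.0, math.pi / 2)
player = Pose(0.0, 2.0, math.pi / 2)
rel = relative_pose(robot, player)
print(round(rel.x, 3), round(rel.y, 3))  # prints "2.0 0.0": 2 m along the robot's forward axis
```

Because both sides relocalize continuously, this stays correct even while the robot and the players are all moving.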
Our team’s years of augmented reality experience translated directly to the physical world. It was mind-blowing to see something that had only ever existed virtually playing out in physical space.
One week into the experiment, we greenlit the project, and then the fun began. We attached virtual football goalposts to Tron’s head, viewable only through a mobile phone. We localized Tron with Niantic Spatial VPS, so the uprights stayed accurately placed wherever Tron moved. We gave people the ability to “kick” footballs by tapping a phone screen and to throw footballs to other players. It was a simple interaction, but the reaction from people at the Innovation Summit was awe.
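The trick to keeping the uprights over Tron's head is anchoring the virtual content at a fixed offset from the robot's live VPS pose, so every localized phone renders it in the same spot. A hypothetical sketch of that anchoring step (the type, function, and offset values here are illustrative, not our production code):

```python
import math
from dataclasses import dataclass

@dataclass
class RobotPose:
    """Robot pose in the shared VPS map frame (meters, yaw in radians)."""
    x: float
    y: float
    z: float
    yaw: float

def goalpost_anchor(robot: RobotPose, forward_m: float = 0.0, up_m: float = 1.2):
    """Map-frame position for the virtual uprights, pinned to the robot.

    The horizontal offset is rotated by the robot's yaw so the uprights
    turn with it; any phone localized against the same map renders the
    goalposts at this point.
    """
    return (
        robot.x + forward_m * math.cos(robot.yaw),
        robot.y + forward_m * math.sin(robot.yaw),
        robot.z + up_m,  # uprights float a fixed height above the robot's head
    )

# Re-anchor every frame as fresh robot poses stream in over the network.
robot = RobotPose(x=3.0, y=4.0, z=0.5, yaw=0.0)
anchor = goalpost_anchor(robot)
```

Recomputing the anchor each frame from the latest pose is what makes the goalposts follow the robot instead of drifting behind it.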
Seeing a robot tooling around is enough to turn heads. When we handed people a phone and pointed it at Tron, they immediately started tapping, then laughing, then trying to hit each other with footballs. Attendees were delighted, and then came the questions: How does this work? How long did it take you to build? Is this faked?
We faked nothing. It was all off-the-shelf Niantic Spatial technology, easily integrated across a robot and phones by a team who had never built a physical AI experience…all in two weeks.
We learned that physical AI is here, and Niantic Spatial is ahead of the curve because we’ve been innovating in the digital/physical space for nearly a decade. We knew more than we thought we did, and we accomplished more than we imagined. We’re excited to continue our adventure with robotics use cases this year. Stay tuned for what’s next.
–Alicia Berry, Executive Producer at Niantic Spatial