Hello, Dot: Niantic releases the new mixed-reality "Hello, Dot" for Peridot, free and worldwide simultaneously
News: As if it were right there beside you
Hello, Dot: Experience the magic of mixed reality with Peridot!
Partnerships: Peridot is coming to SunnyTune, an app for Apple Vision Pro!
Peridot Mobile: Forage for food among the leaves during Exploration Week!
Peridot Mobile: February Nest Finds
Peridot Mobile: Peridot Feature Weeks, January 2024
Peridot Mobile: More fun in Peridot with generative AI
Peridot Mobile: Peridot Dot-Type Weeks, November 2023
Peridot Mobile: Rescue stray Dots from the Shadows!
Peridot Mobile: Hatching eggs is even more fun!
Peridot Mobile: Japan's first official Peridot fan meetup is happening!
Peridot Mobile: Egg hatching has been simplified for a more enjoyable experience!
Peridot Mobile: How to obtain nests in-game
Peridot Mobile: Peridot Community Day, July 2023
Peridot Mobile: Celebrate Pride Month with Peridot
Peridot Mobile: June 2023 Peridot Community Day Weekend

When a robot becomes a teammate: a Niantic Spatial Super Bowl story

The 2026 Super Bowl, hosted in San Francisco, gave us a two-week window to run an experiment. We've spent the past six years rendering the whimsical world of Peridot over a live view of the real world. Could we take that education and apply it to the rapidly emerging space of physical AI? The Bay Area Host Committee gave us the opportunity to find out.

The Bay Area Host Committee hosted the BAHC Super Bowl LX Innovation Summit presented by YouTube to celebrate the region's "unmatched spirit of innovation," and they reached out to Niantic Spatial to bring an innovative interactive experience. The team that brought us Peridot Mobile, Hello Dot, Peridot Beyond, and Project Jade rallied around the concept but initially balked at the two-week turnaround.

Could we build a football-themed interactive experience in two weeks?

The team put their heads together and looked at our prototyping buddy, Tron. What if we tracked Tron like we do Peridot on Project Jade? Tron has a camera, so we could use the Niantic Spatial Visual Positioning System to determine the robot’s location. What if we used the multiplayer technology we built for Project Jade, but instead of people wearing Snap Spectacles, we networked people’s phones…hmm.

So, we went to the Innovation Lab, used Scaniverse to capture the space, and prototyped localizing moving players relative to a moving robot—and it worked!
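The core of that prototype is a frame problem: once the phone and the robot are each localized against the same VPS map, they share one world frame, and a player's pose relative to the moving robot is just a transform composition. Here is a toy 2D sketch of that idea; this is purely illustrative and is not Niantic's actual API (poses, function names, and the 2D simplification are all assumptions for the example):

```python
import math

# Poses are (x, y, theta): position plus heading in radians, all in one
# shared VPS world frame. "Forward" is the local x-axis.

def invert(pose):
    """Invert a 2D rigid transform (x, y, theta)."""
    x, y, t = pose
    c, s = math.cos(t), math.sin(t)
    # Rotate the negated translation by the inverse rotation.
    return (-(c * x + s * y), -(-s * x + c * y), -t)

def compose(a, b):
    """Express pose b in the frame that transform a maps from (a then b)."""
    ax, ay, at = a
    bx, by, bt = b
    c, s = math.cos(at), math.sin(at)
    return (ax + c * bx - s * by, ay + s * bx + c * by, at + bt)

def pose_relative_to_robot(world_robot, world_player):
    """Player pose expressed in the (moving) robot's frame:
    T_robot_player = inverse(T_world_robot) * T_world_player."""
    return compose(invert(world_robot), world_player)

# Robot at (2, 0) facing +y; player at (2, 3) facing back toward the robot.
rel = pose_relative_to_robot((2.0, 0.0, math.pi / 2), (2.0, 3.0, -math.pi / 2))
print(rel)  # the player ends up 3 m straight ahead of the robot
```

Because both devices keep relocalizing against the map as they move, the same composition holds frame after frame, which is what makes the "moving players relative to a moving robot" part work.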

Our team's years of learnings in augmented reality applied directly to the physical world. It was mind-blowing to see something that had only ever existed virtually playing out in physical space.

One week into the experiment, we greenlit the project, and then the fun began. We attached football goalposts to Tron's head, viewable only through a mobile phone. We localized Tron's position using Niantic Spatial VPS and could accurately place the uprights wherever Tron moved. We gave people the ability to "kick" footballs by tapping a phone screen and to throw footballs to other players. It was a simple interaction, but the reaction from people at the Innovation Summit was awe.
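With the uprights anchored to the robot's pose, scoring a "kick" reduces to checking whether the ball's arc crosses the goal plane between the uprights and above the crossbar. The sketch below is a hypothetical stand-in, not the event's actual code; the dimensions, names, and the straight-on simplification are all made up for illustration:

```python
import math

GRAVITY = 9.81            # m/s^2
CROSSBAR_HEIGHT = 1.0     # metres above the launch height (assumed)
UPRIGHT_HALF_WIDTH = 0.5  # half the gap between the uprights (assumed)

def kick_clears_goal(ball_speed, launch_angle, lateral_offset, goal_distance):
    """Return True if a straight-on kick passes through the uprights.

    ball_speed     launch speed in m/s
    launch_angle   elevation above horizontal, in radians
    lateral_offset sideways aim error at the goal plane, in metres
    goal_distance  horizontal distance to the goal plane, in metres
    """
    vx = ball_speed * math.cos(launch_angle)  # horizontal speed
    vz = ball_speed * math.sin(launch_angle)  # vertical speed
    if vx <= 0:
        return False
    t = goal_distance / vx                    # time to reach the goal plane
    height = vz * t - 0.5 * GRAVITY * t * t   # ballistic height at that time
    return height >= CROSSBAR_HEIGHT and abs(lateral_offset) <= UPRIGHT_HALF_WIDTH
```

Because the goal plane is defined in the robot's frame, the same check keeps working as Tron wanders around; only the relative distance and offset change.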

Seeing a robot tooling around is enough to turn heads. When we handed people a phone and pointed it at Tron, they immediately started tapping, then laughing, then trying to hit each other with footballs. Attendees were delighted, and then came the questions: How does this work? How long did it take to build? Is this faked?

We faked nothing. It was all off-the-shelf Niantic Spatial technology, easily integrated across a robot and phones by a team who had never built a physical AI experience…all in two weeks.

We learned that physical AI is here, and Niantic Spatial is ahead of the curve because we’ve been innovating in the digital/physical space for nearly a decade. We knew more than we thought we did, and we accomplished more than we imagined. We’re excited to continue our adventure with robotics use cases this year. Stay tuned for what’s next.

–Alicia Berry, Executive Producer at Niantic Spatial