Imagine you're walking your dog. It interacts with the world around you, sniffing some things and relieving itself on others. You walk down the Embarcadero in San Francisco on a bright sunny day, and you see the Ferry Building in the distance as you look out into the bay. Your dog turns to you, looks you in the eye, and says, "Did you know this waterfront was blocked by piers and a freeway for 100 years?"
OK, now imagine your dog looks like an alien and only you can see it. That's the vision for a new capability created for the Niantic Labs AR experience Peridot.
Niantic, also the developer of the worldwide AR behemoth Pokémon Go, hopes to build out its vision of extending the metaverse into the real world by giving people the means to augment the space around them with digital artifacts. Peridot is a mobile game that lets users customize and interact with their own little Dots: dog-sized digital companions that appear on your phone's screen and can look like they're interacting with real-world objects in your camera's view. They're very cute, and yes, they look a lot like Pokémon. Now, they can talk.
Peridot started as a mobile game in 2022, then got infused with generative AI features. The game has since moved into the hands of Niantic Spatial, a startup created in April that aims to turn geospatial data into an accessible playground for its AR ambitions. Now called Peridot Beyond, it has been enabled in Snap's Spectacles.
Hume AI, a startup running a large language model that aims to make chatbots seem more empathetic, is now partnering with Niantic Spatial to bring a voice to the Dots on Snap's Spectacles. The move was initially announced in September, but now it's ready for the public and will be demonstrated at Snap's Lens Fest developer event this week.
