Thursday, March 13, 2025

Pokemon Go Has a New Owner, While Niantic Is Evolving Its Maps to Fold in AI and AR


Your next game of Pokemon Go could be changing. The game is getting a new corporate boss, while previous parent company Niantic has other plans: a future that's less about gaming and more about maps built with AI from the data those games have generated. I've already seen pieces of that future, and AR glasses may be just one part of where it all plays out.

However, it may still involve some games, too. Niantic retains control of Ingress and Peridot, two of its most AR-focused, location- and map-connected games. Those games, along with Niantic's rebranding as Niantic Spatial, could be a sign of what's happening next: tech companies exploring how AI can understand the world around us, rather than repeating the flawed attempts of AI glasses like Meta's Ray-Bans.

Niantic didn't have any additional comment on the news. Still, I recently demoed the Quest-native Scaniverse app, which Niantic acquired in 2021. Like other 3D-scanning apps such as Polycam, Scaniverse focuses on creating, viewing and discovering 3D scans of real-world locations. But the future of Niantic's plans may hinge even more on training AI on this data, which Pokemon Go players have been contributing over time. If the future of AR and always-on wearable AI is really going to work, we'll need a better sense of, and control over, how and when we share data.

From Niantic's standpoint, what happens next centers on that scanned map of the world as a data set for AI to feed on.

“We’re in the midst of seismic changes in technology, with AI evolving rapidly,” wrote Niantic CEO John Hanke in a LinkedIn post. “Existing maps were built for people to read and navigate, but now there is a need for a new kind of map that makes the world intelligible for machines, for everything from smart glasses to humanoid robots, so they can understand and navigate the physical world. Today’s LLMs represent the first step towards a future where a variety of expert models collaborate to reason and understand complex problems, and many of those problems will require deep and accurate knowledge of the physical world. Niantic is building the models that will help AI move beyond the screen and into the real world.”

Niantic is already focused on scanning the real world for AR and VR experiences on phones and in headsets like the Meta Quest, mainly to show off how striking these uncanny 3D scans can be when you step back into them. Companies like Polycam, whose impressive 3D scans of real-world environments I also experienced on Vision Pro, have been exploring ways to use these scans for more business-focused purposes. Niantic is making similar claims for its scans. The tech behind them is known as Gaussian splatting, which reconstructs scenes from multiple photos and depth-sensing data.
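To picture how Gaussian splats work, here's a minimal toy sketch, not Niantic's (or anyone's) actual pipeline: each splat is a soft blob with a center, spread, shade and opacity, and an image forms by blending the splats at every pixel. All the numbers and parameter names below are invented for illustration.

```python
import math

def gaussian_weight(px, py, cx, cy, sigma):
    """How strongly a splat centered at (cx, cy) influences pixel (px, py)."""
    d2 = (px - cx) ** 2 + (py - cy) ** 2
    return math.exp(-d2 / (2.0 * sigma ** 2))

def render(splats, width, height):
    """Alpha-composite splats (assumed pre-sorted near-to-far) into a grayscale image."""
    image = [[0.0] * width for _ in range(height)]
    for y in range(height):
        for x in range(width):
            color, transmittance = 0.0, 1.0  # light not yet absorbed
            for cx, cy, sigma, shade, opacity in splats:
                alpha = opacity * gaussian_weight(x, y, cx, cy, sigma)
                color += transmittance * alpha * shade
                transmittance *= 1.0 - alpha
            image[y][x] = color
    return image

# Two hypothetical splats: (center_x, center_y, sigma, shade, opacity)
splats = [(2.0, 2.0, 1.0, 1.0, 0.8), (5.0, 5.0, 1.5, 0.5, 0.6)]
img = render(splats, 8, 8)  # brightest pixels cluster around the splat centers
```

A real system fits millions of these blobs (with full 3D covariances and view-dependent color) to photos and depth data; the blending loop above is the same idea at toy scale.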

The idea of AI learning to understand and navigate the real world by studying these advanced scans is a whole new level, though not a surprising one. AR and VR have already been training grounds for AI and robotics, and the data sets we collect by wearing smart glasses and other world-mapping wearables will inevitably be the next thing generative AI studies. It could very well be a massive part of what companies like Meta, Google and, eventually, Apple use as the underpinning to make real-world wearable AR glasses actually work and recognize things around you with AI.

Companies like Meta and Google are already using sensor-studded smart glasses prototypes to explore building real-time, world-aware AI assistance. Meta's Project Aria and Google's Project Astra are building blocks for more continuously aware smart glasses to come, but they still lack a deeper, more helpful awareness of the world. Training on 3D-scanned map data could be a huge part of the next leap for what will eventually live in products like Meta's prototype Orion AR glasses and Google's Android XR devices.

Niantic isn't the only company scanning the real world into future maps: Google, Apple, Snap and many others, including Polycam, are already doing it. Niantic's current pitch seems to pivot more toward AI, but AI and AR seem destined to blend. I still don't understand how the map data each company collects is kept separate from our personal data. And as Niantic moves away from games to focus on AR's future, that's something that sparks both curiosity and concern.




