Google's new AI assistant wants to video chat through your phone — and your glasses

The prototype from Google's "Project Astra" was announced during its I/O developer conference

Google DeepMind CEO Demis Hassabis at Google I/O 2024.
Photo: Google

AI assistants are picking up more senses. On Monday, OpenAI showed off a new ChatGPT model that promises to see, hear, and speak through smartphones, among other new abilities. Now Google is announcing a rival assistant with similar capabilities.

At the company’s I/O developer conference Tuesday, DeepMind CEO Demis Hassabis debuted a prototype of Google’s new expert AI assistant, which can see through a user’s phone and through other devices such as smart glasses. The assistant “build[s] on” Gemini, Google’s existing chatbot, the company says, and some of its capabilities are coming to the Gemini app and web experience later this year.


The development is part of Google DeepMind’s Project Astra, which aims to create “a universal AI agent” for users’ everyday lives. “It’s easy to envisage a future where you can have an expert assistant by your side, through your phone, or new exciting form factors like glasses,” Hassabis told a crowd of a few thousand developers in Mountain View, California.


A demo video shows a person speaking with an AI agent through their phone while walking through an office. Through the phone’s camera, they show the assistant a container of crayons, as if on a FaceTime call, and ask it to come up with a “creative alliteration.”


“Creative crayons color cheerfully,” it said. “They certainly craft colorful creations.” The person continues interacting with the AI bot on their walk, then realizes they forgot their glasses and asks for help finding them. “They’re on the desk near a red apple,” the bot responds.

When the user puts on those glasses, the AI assistant can look through them, too — and identifies an illustration representing Schrödinger’s cat on a whiteboard.

Project Astra: Our vision for the future of AI assistants

It’s unclear if those glasses are a new product Google plans to launch. The augmented reality glasses on display in the demo didn’t look like Google Glass, the company’s existing smart glasses, nor did they resemble typical, bulkier headsets.


“An agent like this has to understand and respond to our complex and dynamic world just like we do,” said Hassabis at the conference. “It would need to take in and remember what it sees so it can understand context and take action. And it would have to be proactive, teachable, and personal. So you can talk to it naturally without lag or delay.”

That’s what Project Astra is aiming to do, he said, and it’s making “great strides.”


While Google’s prototype AI assistant is available for Google I/O attendees to demo, it will probably be a while before the tech makes its way into the hands of everyday consumers.