
Project Astra by Google DeepMind

Introduction to Project Astra 

Google DeepMind’s Project Astra is an experimental AI assistant that can see, hear, and talk, right on your phone or smart glasses. It was first shown at Google I/O in May 2024 and is still in testing with a group of trusted users.

Seeing and Understanding the World

Astra is multimodal, meaning it processes voice, text, video, and images at once. For example, you can point your camera at an object, show it a code snippet on your screen, or ask it to find your lost glasses, and it responds in real time.
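To make the idea of combining modalities concrete, here is a minimal sketch of how a single assistant turn might bundle text, audio, video, and images together. All names here are illustrative assumptions; Astra's actual API is not public.

```python
from dataclasses import dataclass, field

@dataclass
class MultimodalTurn:
    """One assistant turn carrying several input modalities at once.

    Purely illustrative -- field names are assumptions, not Astra's API.
    """
    text: str = ""
    audio_chunks: list = field(default_factory=list)   # raw audio buffers
    video_frames: list = field(default_factory=list)   # camera frames
    images: list = field(default_factory=list)         # e.g. screenshots

    def modalities(self) -> list:
        """Report which modalities this turn actually carries."""
        present = []
        if self.text:
            present.append("text")
        if self.audio_chunks:
            present.append("audio")
        if self.video_frames:
            present.append("video")
        if self.images:
            present.append("image")
        return present

# Pointing the camera at an object while asking a spoken/typed question:
turn = MultimodalTurn(text="What is this?", video_frames=[b"frame0", b"frame1"])
```

A real system would stream frames and audio continuously rather than batching them per turn, but the key point is the same: one request can carry several modalities that the model reasons over jointly.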

Real-Time Memory and Context

It keeps up to 10 minutes of in-session memory, capturing recent visual and audio context, and it can also remember past conversations to improve personalization. This helps it follow ongoing tasks with better understanding.
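A rolling time-limited memory like this can be sketched as a queue that evicts anything older than the window. This is an assumption-laden illustration of the concept, not Astra's actual implementation.

```python
import time
from collections import deque

class SessionMemory:
    """Rolling in-session memory: keeps only events from the last
    `window_s` seconds (10 minutes here, matching public descriptions
    of Astra). Illustrative sketch only.
    """

    def __init__(self, window_s: float = 600.0):
        self.window_s = window_s
        self._events = deque()  # (timestamp, event) pairs, oldest first

    def add(self, event, now=None):
        now = time.time() if now is None else now
        self._events.append((now, event))
        self._evict(now)

    def recall(self, now=None):
        now = time.time() if now is None else now
        self._evict(now)
        return [event for _, event in self._events]

    def _evict(self, now):
        # Drop anything older than the window.
        while self._events and now - self._events[0][0] > self.window_s:
            self._events.popleft()

mem = SessionMemory()
mem.add("saw glasses on the desk", now=0.0)
mem.add("user asked about homework", now=500.0)
# At t=700s, the first event (age 700s) has aged out of the 600s window:
print(mem.recall(now=700.0))  # ['user asked about homework']
```

Longer-term personalization, by contrast, would need a separate persistent store that survives across sessions rather than a sliding window like this one.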

Acting Without Being Asked

A standout new feature is proactivity: Astra can watch what you’re doing, understand the context, and step in without being asked. During tests, it corrected homework mistakes, reminded users about intermittent fasting, and adjusted phone settings on its own.

Tapping Into Google’s Ecosystem

Astra connects to Google tools such as Search, Lens, and Maps. It can check your calendar, open apps, or adjust settings, all based on what it observes. It combines multimodal reasoning, low-latency audio, and language switching across more than 20 languages.

Testing on Phones and Smart Glasses

So far, Astra is available on Android phones and prototype smart glasses. Testing on glasses aims to create a hands-free, immersive experience, letting Astra “see” the world in real time.

Challenges Ahead

Despite the exciting progress, Astra is still a prototype. It faces technical challenges such as latency, privacy, and choosing the right moment to assist without interrupting. Developers are carefully refining its “reading the room” abilities so it stays helpful without overstepping.

Why It Matters

Project Astra points toward a future of universal AI assistants that understand their surroundings, remember context, and act with intent. Google plans to gradually fold these capabilities into real products, starting with the Gemini app and other Google features.

What’s Next?

In the coming months and years, expect Astra-inspired features in products like Gemini Live and perhaps in future smart glasses. The key challenge is delivering truly helpful and trustworthy AI that blends smoothly into everyday life.