
When Ray-Ban Meta smart glasses launched in 2023, they were a stylish, surprisingly powerful mix of camera glasses and a wearable assistant. Fast-forward to 2025, and these glasses are inching toward something much bigger: true augmented reality. With Meta now preparing to add tiny displays inside the lenses themselves, we’re seeing the start of a fundamental shift—from passive smart glasses to immersive, everyday computing devices.
Let’s break down how we got here, what’s new in 2025, and what Meta’s long-term vision looks like.
2023: Smart Glasses Become a Real Assistant
In September 2023, Meta and Ray-Ban launched the second generation of their smart glasses—the Ray-Ban Meta series. This wasn’t just an upgrade in style or battery life. It marked the true beginning of Meta’s vision for smart eyewear:
- A Better Camera: The new 12MP camera allowed users to take higher-resolution photos and record videos up to 60 seconds, compared to the 30-second limit on the first model.
- Stronger Audio System: With five microphones, the glasses offered clearer calls and more accurate voice pickup, even in noisy environments.
- Meta AI: This was the game changer. The built-in assistant could respond to voice commands, answer basic questions, send messages, and control music or calls—all triggered by a simple “Hey Meta.”
- Customization: Over 150 frame and lens combinations gave users more options to personalize their look while staying connected.
By the end of 2023, creators, travelers, and everyday users began realizing that these weren’t just camera glasses—they were wearable assistants.
2024: The Year Everything Got Smarter
In 2024, the glasses took their biggest leap forward yet: they started to understand what you were seeing.
April 2024: Meta AI with Vision
Meta rolled out an update that gave the glasses visual awareness. With “Meta AI with Vision,” you could:
- Ask, “What am I looking at?” and get information about buildings, objects, animals, and more.
- Get real-time translations by looking at a sign or menu in another language.
- Get memory support, like asking “Where did I park?” and having the glasses recall what they saw.
- Use contextual assistance—the AI could understand what you were doing or looking at and offer help in the moment.
This wasn’t just an assistant anymore. It was starting to behave like a basic visual AI companion.
September 2024: Deeper AI and App Integration
Later that year, Meta delivered even more powerful updates:
- Conversational Awareness: You could talk to the glasses naturally, without needing to say “Hey Meta” repeatedly. It remembered the context of your questions.
- Video-based AI Help: You could record or live-stream what you were seeing and ask Meta AI questions about it—like a walking tour with a genius guide.
- Voice Messaging & Hands-Free Communication: You could send messages through WhatsApp and Messenger using just your voice.
- Media Control: Integrations with Spotify, Amazon Music, Audible, and others made it easy to play music or audiobooks, identify songs, and even get music suggestions—all through voice.
- Third-Party Expansion: Meta began collaborating with more developers to build new apps and tools directly into the glasses experience.
2024 showed that Meta was no longer just improving hardware—it was turning the glasses into a complete AI ecosystem.
2025: In-Lens Displays and the First Step into AR
This year, Meta is preparing to release the biggest hardware upgrade yet: micro-displays built right into the lenses.
Here’s what we know:
- What They’ll Do: These tiny displays will show basic visuals like incoming messages, navigation prompts, or reminders, all directly in your line of sight—without needing to check your phone.
- How They’ll Work: Meta is working with EssilorLuxottica to embed ultra-thin display tech into the lenses without making the glasses bulky or uncomfortable. The goal is to retain the Ray-Ban style while adding next-gen utility.
- Who Gets Them: These features are not just a software update. They’ll only be available in a new, high-end model of the Ray-Ban Meta smart glasses, expected to release in late 2025. The price is rumored to be around $1,000, a major jump from the $300 base model.
- Not Quite Full AR (Yet): These in-lens displays are not the same as full AR (which would require 3D mapping, spatial anchors, and interaction). But they are the foundation—a way to build up everyday utility before Meta introduces something like the long-rumored “Orion” AR glasses.
- Meta’s Strategy: By gradually integrating AI, visual context, and now micro-displays, Meta is onboarding users step by step into a future where glasses don’t just record what you see—they enhance it in real time.
Ray-Ban Meta smart glasses have gone from stylish camera gadgets to context-aware voice assistants to early-stage AR devices—all within two years. With in-lens displays set to arrive by the end of 2025, Meta is taking its boldest step yet toward glasses that don’t just let you capture the moment, but live more intelligently inside it.
And this is just the beginning.
If you’d like to check out the current Ray-Ban Meta glasses, you can purchase them here (affiliate)!