Originally published on 22 September 2025
I’ve been using the Even Realities G1 as my daily driver. That experience has shown me that AI glasses aren’t some futuristic idea anymore.
They’re already practical. They feel like a smartwatch for your eyes: glance, get the info you need, and move on.
For me, 2025 feels like the tipping point.
The glasses hitting the market aren’t yet full AR headsets with room-scale mapping, but they’re lightweight and already useful in daily life.
Some of the biggest names are still about cameras and audio, not HUDs.
Ray-Ban Meta (Gen 2) and Oakley Meta Vanguard: These glasses are built around video, photos, and calls, but project nothing in front of your eyes.
HTC Vive Eagle: Taiwan-only for now, the Eagle is light and stylish, with a 12MP camera and built-in translation, but it’s still audio-first.
Mentra Live: Camera glasses with no HUD, but paired with an open-source operating system (MentraOS) that could become the Android of smart glasses.
This is the category that feels different. Having text or graphics in your vision changes everything.
Even Realities G1: These are my daily driver. Binocular green HUD and no cameras. They are simple, private, and surprisingly powerful for captions, translation, and teleprompters.
Meta Ray-Ban Display: The first major consumer push with a display. It boasts 5,000 nits of outdoor brightness and ships with an EMG wristband that detects micro gestures. The limitations are a monocular HUD and a narrow 20-degree field of view.
Brilliant Labs Frame: Experimental and open-source, the Frame is more dev kit than finished product, but is important for experimentation.
RayNeo X3 Pro: China-only for now, they have a binocular full-color HUD with strong AI integrations.
XREAL Air 2 Ultra: With a full-color display and 6DoF tracking, they edge into AR headset territory. They rely on a cable to a phone and are heavier than other glasses, but they’re a preview of what’s possible.
Snap Spectacles: The closest thing to a full package in AR today, with immersive visuals and creator-first tools.
The split is clear: one camp is betting on cameras and audio, the other on displays in your field of view. The way I see it, the display camp is where the real shift starts.
Using the Even G1 every day has shown me what AI glasses are good for right now.
Subtitles at a restaurant, live translation when traveling, or a teleprompter in a meeting. These are small, simple interactions that don’t need a full AR headset.
That’s why 2026 feels like the breakout year. AI glasses will stop being prototypes and start being worn daily, the way smartwatches quietly became essential.
Whether Apple or Meta ship full AR in 2028 or later doesn’t matter as much.
The real shift is already starting: AI glasses are becoming part of everyday life, and they’re the bridge to full AR.