AI Glasses Become Everyday Wear

Pedro Cruz, Co-Founder · Published on Sep 22, 2025

I’ve been using the Even Realities G1 as my daily driver. That experience has shown me that AI glasses aren’t some futuristic idea anymore.

They’re already practical. They feel like a smartwatch for your eyes: glance, get the info you need, and move on.

For me, 2025 feels like the tipping point.

The glasses hitting the market aren’t yet full AR headsets with room-scale mapping, but they’re lightweight and already useful in daily life.

Glasses without displays

Some of the biggest names are still about cameras and audio, not HUDs.

Ray-Ban Meta (left) and Oakley Meta Vanguard (right)

Ray-Ban Meta (Gen 2) and Oakley Meta Vanguard: These glasses come with features for video, photo, and calls, but have nothing projected in front of your eyes.

HTC Vive Eagle

HTC Vive Eagle: Taiwan-only for now, the Eagle is light and stylish, with a 12MP camera and built-in translation, but it's still audio-first.

Mentra Live

Mentra Live: Camera glasses with no HUD, but paired with an open-source operating system (MentraOS) that could become the Android of smart glasses.

Glasses with displays

This is the category that feels different. Having text or graphics in your vision changes everything.

Even Realities G1

Even Realities G1: These are my daily driver. Binocular green HUD and no cameras. They are simple, private, and surprisingly powerful for captions, translation, and teleprompters.

Meta Ray-Ban Display

Meta Ray-Ban Display: The first major consumer push with a display. It boasts 5,000 nits of outdoor brightness and ships with an EMG wristband that detects micro-gestures. The limitations are a monocular HUD and a narrow 20-degree field of view.

Brilliant Labs Frame

Brilliant Labs Frame: Experimental and open-source, the Frame is more dev kit than finished product, but is important for experimentation.

RayNeo X3 Pro

RayNeo X3 Pro: China-only for now, they have a binocular full-color HUD with strong AI integrations.

XREAL Air 2 Ultra

XREAL Air 2 Ultra: With full-color and 6DoF, they edge into AR headset territory. They rely on a cable to connect to a phone and are heavier than other glasses, but are a preview of what’s possible.

Snap Spectacles

Snap Spectacles: The closest thing to a full package in AR today, with immersive visuals and creator-first tools.

3DoF vs 6DoF

The split is clear:

  • 3DoF: Even Realities G1, Meta Ray-Ban Display, Brilliant Labs Frame. Perfect for glanceables and lightweight AI tasks.
  • 6DoF: XREAL Air 2 Ultra, RayNeo X3 Pro, Snap Spectacles. Closer to AR headsets, heavier, but they point toward the future.

Why 2026 matters

The way I see it:

  • 2025 proved the concept. AI glasses showed they can deliver daily value.
  • 2026 is the breakout. Multiple promising devices will be available, and adoption will start to scale.
  • 2028 is the year everyone points to for Apple's glasses release, but it's still just a rumor. Meta's own Project Orion is also expected within the next 2-3 years. Both timelines suggest full AR is coming, but no one can guarantee when.

My takeaway

Using the Even G1 every day has shown me what AI glasses are good for right now.

Subtitles at a restaurant, live translation when traveling, or a teleprompter in a meeting. These are small, simple interactions that don’t need a full AR headset.

That’s why 2026 feels like the breakout year. AI glasses will stop being prototypes and start being worn daily, the way smartwatches quietly became essential.

Whether Apple or Meta hit in 2028 or later doesn’t matter as much.

The real shift is already starting: AI glasses are becoming part of everyday life, and they’re the bridge to full AR.
