Let’s just start with this: Meta is onto something. After the first 24 hours with my very own pair of Meta Ray Ban Displays, I must say that Apple needs to get it together. I was already a big fan of the Meta Ray Ban glasses, frequently wearing them to listen to podcasts or take photos on the go. But like every nerd, I’ve always dreamed of having that perfect heads-up display for contextual information. Thirteen years after Google Glass, Meta and Ray Ban have finally delivered on that original vision (pun totally intended).

I was able to try out my friend’s pair of Meta Ray Ban Displays a couple of days ago and walked away surprised by how easy and natural they felt to use. A few days later, I still feel that way. The neural band is clearly a triumph of engineering: it’s incredibly responsive, and the gestures it enables are dead simple. It’s also comfortable on the wrist, feeling no different than something like a Whoop or a Fitbit. If anything, I wish it could do more, like also track basic health metrics. Despite the limited use cases enabled by the band, it’s still worth wearing simply because of how magical the experience of using the Ray Ban Displays is. The gestures are so well designed and work seamlessly. Dare I say, they may be a better long-term solution than the Vision Pro’s often exhausting eye tracking.

At first, I was worried about the monocular screen, but I’ve quickly gotten (mostly) used to it. It’s incredibly easy to see, more than bright enough both indoors and outside, and provides enough color to still look vibrant on any background. There’s no flicker or stutter; it’s a clear and crisp transparent display. It’s fun to look at, but it’s also practical. I am amazed by how much more natural it feels to walk down the street and use the glasses with my eyes up and hands down instead of staring at my phone. I was still present, looking up and out at the world, and no one could see that I was texting. At one point I was walking down the street, spotted a coffee shop across the way, and simply popped up the maps app on the glasses to see details. It was exactly the kind of interaction I’ve always dreamed of.

As for what you can actually do with the display in these glasses, the list is still fairly short. That’s not bad; there’s still enough to do with them to justify wearing them all day. But I can already spot dozens of use cases that I want as soon as possible. At the moment, you can text, call, take photos and videos, search maps, use live captioning, play a built-in game, control your music, and talk to Meta AI. It’s really a lot like a smartwatch on your face. I love being able to see rich answer cards when asking AI questions rather than having to rely on the audio output of the previous generation of glasses. It’s also a helpful workaround while we wait for more apps: you can ask it about the news, look up information, have it generate content, and so on. There’s no web browser, no Instagram feed, no notes or to-do lists, no calculator, no YouTube, and so on. All of this is to say, there’s a ton of room both for Meta to build on the experience and for third-party developers to go all out. I am itching to be able to install other apps on these things. Frankly, I won’t be surprised if they go down the vibe-coding route at some point to generate applets on the fly too.

One thing that I need to be super clear about is that these are not really AR glasses. They are glasses with a HUD. There aren’t any true augmented reality experiences that are able to take advantage of the space around you. I can understand why an AR or VR fan would be disappointed in these, but for average user adoption I think they made the right call. We’ll get there eventually, but I’d much rather have this sort of practical experience on my face than some gimmicky demos. AR just isn’t useful enough yet, as much as some hate to hear it.

If you’ve used the classic Meta Ray Bans, you know that the speakers are okay but not great. They’re serviceable but nowhere near the quality of earbuds. Thankfully, they seem to be much improved on these new glasses; the sound is richer and fuller. Meta’s new charging case is also a massive upgrade over the previous one. It folds flat when not in use and easily slides into a bag, and it unrolls completely for easier access. One thing I find particularly fascinating, though, is that the new case has the Meta logo on the front while the previous one had the Ray Ban logo.

While wearing these out and about, I’ve certainly gotten some looks. There’s no question that some people have clocked them as not-your-average-glasses. They’re noticeably chunkier than the regular Meta Ray Bans, but not so much that they look dorky. If anything, they sort of look retro. Fortunately, like the regular Meta Ray Bans, the cameras are excellent and produce output very similar to what I get from my phone. The cameras remain the clearest indication that you’re wearing smart glasses.

I fully expected to have to wait many weeks to get my hands on a pair of these glasses, but fortunately I was in the right place at the right time. Completely by chance, I was able to get exactly the pair I needed. I’m glad that I was able to get them; I’m an early adopter and always have been. Unlike the Vision Pro, though, these feel less like a simulator for the future and more like the future now. $799 is a steep asking price, but for what you get it’s well worth it. I say this to my nerd pals: if you’d spend $799 on an Apple Watch Ultra, you shouldn’t balk at $799 for these glasses. It’s surprisingly reasonable, but Meta knows it has an early advantage and I’m sure has priced these aggressively. There’s no way they are making much, if anything, on these. I think that highlights just how far ahead Meta is, that they’re willing to give it their all to win. Now that we’ve seen that Apple is pivoting from the Vision Pro to glasses, it’s clear Meta is on the right path.