In mid-September, just three months ago, tech giant Meta announced an all-new version of its smart glasses. The Meta Ray-Ban Display doesn’t just feature a camera and AI assistant; it also features an actual display projected onto the lens of your glasses that only the wearer can see and interact with.

One month after they were announced, I was in San Francisco for another event and figured that, given they were only being sold in the USA, I’d snag a pair there. You can’t just walk in and buy them, so I tried to book a demo at both a Ray-Ban shop and a third-party retailer listed on the Meta site. These venues were in San Francisco, an hour from Meta’s headquarters, and it was the same story at locations around Silicon Valley: no available demo slots until December.

So with CES 2026 scheduled for early January, I took the chance to book a demo at a Best Buy store in Los Angeles when I touched down on the way to the annual tech-fest in Las Vegas.

Frankly, I didn’t need a demo; I just wanted the glasses. I had my AMEX primed and ready, whatever the outcome of the demo.

And, given it’s a third-party store, I did wonder how good the demo might be.

To Meta’s credit, it was outstanding. The staff member helping me at Best Buy, Israel, was fantastic: patient with me, willing to let me discover things for myself, and never pushy about time.

He offered me two pairs to try out, the standard and large sizes; for me, the standard fit was perfect. Then he measured my wrist, and of the three band sizes available I was a size 3. This band, as it turns out, accounts for what I think is more than half of the brilliance of the Meta Ray-Ban Display as a whole.

My son was able to watch on, seeing what I was seeing via a smartphone “casting” of the screen, and capture some of that on camera.

In simple terms, this was remarkable to me.

The projected screen was clear (clearer when I shut my left eye), but over time you adjust. As someone who wears glasses to read, I was blown away that my eyes just worked with a screen so close!

Unique to these glasses is the Meta Neural Band. Worn on your dominant wrist, it lets you use simple gestures to control the on-screen menus.

And it just works.

Tap your index finger and thumb together to click; tap your middle finger and thumb to go “back”.

What was more remarkable was navigating up, down, left and right. You’re asked to make a relaxed fist and use your thumb over the cupped index finger to “swipe” up, down, left and right.

Sceptical, I gave it a go. It worked.

The shock, and frankly the impressed reaction, was just like what I felt navigating Apple’s Vision Pro headset for the first time. It’s so good.

Take a photo and you see it on screen, and you can then look back at the photos in your library. But then tap, and twist your wrist left and right to zoom in and out.

Even better, when zoomed in, I could move my arm around to pan across the whole image frame.

By this point, I was sold.

“Sign me up,” I said.

“For sure,” I was told, “we can add you to the wait list.” Um, no: take my money. The $800 price tag seemed a bargain for what these can do.

No can do; March or April was the likely wait time. Sad face.

Look, they are remarkable, very cool, and a great demonstration of interactive smart tech.

But I wouldn’t wear them. They are thick and ugly, and I just can’t imagine wearing them as regularly as I do my standard Ray-Ban Meta glasses. You see, those glasses just look like normal sunnies. These do not; they’re far more of a fashion statement than any pair of normal glasses.

I do, however, want to own them, if only to show people the future with their own eyes.

No word at all on when these will be released globally, or be in full supply in the USA, but I’ll keep trying :)