After many months of speculation, Google finally showed off its still-early-days Android XR smart glasses prototype. It was an impressive live demo, with a live translation portion that largely went off well, though not without hitches. Still, it got the crowd at Google I/O going, and right after the opening keynote wrapped, I strolled around the Shoreline Amphitheater to find a pair to try.
Much like my time with Project Moohan, the prototype Android XR headset that Google and Samsung are working on, I only spent about five minutes with these prototype glasses. And no, it wasn’t a sleek frame made by Warby Parker or a wild one from Gentle Monster – instead, it was the pair Google demoed on stage: the prototype Android XR glasses made by Samsung.
As you can see above, much like the Meta Ray-Bans and unlike the first-gen Snapchat Spectacles, these prototypes look like standard black frames. They’re a bit thicker on the left and right stems, but they’re also loaded with tech – though not in a way that screams it from the outside.
It was a short, pretty rushed demo, but certainly a compelling one.
The tech here is mostly hidden – there’s a screen baked into the lens, which, when the glasses are worn, appears as a little box in your view whenever it’s showing something larger. Otherwise, when I first turned the glasses on, I just saw the time and the weather hovering at the top of my field of vision.
When I pressed the button on the right stem to capture a photo, the shot briefly flashed up, semi-transparent, in my field of vision. Neat, and a bit more present a way of capturing than on the screen-less Meta Ray-Bans.
These are both cool, and during the keynote, Google also shared that the screens could be used for messaging, calls, and translation, but I didn’t get to try those. While I couldn’t ask for directions myself, a Google rep in my demo was able to toss up what navigation would look like, and that feature has me more excited about smart glasses with a built-in screen.
Why? Well, the experience of navigating doesn’t get in the way of my field of view – I can simply keep looking straight ahead and see at the top that in 500 feet, or 50 feet, I need to make a right onto a specific avenue. I don’t need to look down at my phone or glance at my wrist; it’s all housed in just one device.
If I need more details or want to see my route, I could glance down to see a mini version of the map, which moved as I moved my head. If I wore these in NYC, I could walk normally and glance at the top to see directions, but when safely stopped and not in the way of others, I could look down to see my full route. That’s pretty neat to me.
The projected screen itself had good-enough quality, though I’m not sure how it performs in direct sunlight, as I tested these in a little room that Google had constructed. It’s important to remember that this is still a prototype – Google has several brands on board to produce these, but there isn’t an exact timeframe yet. Developers will be able to start building and testing by the end of the year, though.
Project Moohan, the headset that also runs Android XR, is set to arrive this year in a final version Samsung has yet to reveal – a launch that could build support from third parties and give Google more feedback on the platform.
Gemini, Google’s very wise AI assistant, blew me away on Project Moohan and was equally compelling on the Android XR glasses. I asked it for the weather and got an audio report for the next few days, had it analyze a replica of a painting, and even had it look at a book, tell me the reviews, and point me to where I could purchase it.
The power of having Gemini in my frames has me really excited for the future of the category – it’s the audio responses, the connection to the Google ecosystem, and the way it plays with the onboard screen. It remains to be seen how Samsung’s final design will look, but it will likely sit alongside several other Android XR-powered smart glasses from the likes of Warby Parker, Xreal, and Gentle Monster, among others.
I’ve long worn Meta Ray-Bans and enjoy them for snapping unique shots or recording POVs, like walking my dog Rosie or riding an attraction at a Disney park. Similarly, I really enjoyed the original Snapchat Spectacles, but the appeal wore off. Both handled only a short – or, in the Spectacles’ case, very short – list of functions, whereas Android XR as a platform feels a heck of a lot more powerful, even from a short five-minute window.
While the design of Samsung’s prototype didn’t sell me, I have high hopes for the Warby Parker ones. Seeing how Gemini’s smarts can fit into such a small frame, and how a screen can be genuinely useful without being overly distracting, really has me excited. I have a feeling not every pair of Android XR glasses will appeal to everyone, but with enough entries, I’m sure one of them will strike the right balance of form and function.
Gemini in glasses feels less like the future and more like the present, and considering this new entry, my eyes are set on what Meta does next and what Apple’s much-rumored entry into the world of smart glasses will look like.