Google Glass Is Back: Here’s What to Expect

Last December, I wore Google's in-development smart glasses in several forms. Soon you'll be able to get your hands on the final versions. When, exactly, and for how much? We may find out more in just a few days.
Meta isn't the only big tech company aiming to claim a place on your face in the form of glasses, far from it. Google is about to enter the race with a whole range of smart glasses, the company's first return to everyday face-worn tech since Google Glass in 2013.
This time, the focus is entirely on AI. Gemini will be the main feature and function of Google's Android XR glasses, but they'll come in a variety of designs: Warby Parker, Gentle Monster, Kering Eyewear and Samsung are all expected to have their own models. Xreal, a maker of display glasses, will have an additional mixed reality tethered device called Project Aura, too.
This year's Google I/O developer conference, starting May 19, is just around the corner, and we should hear a lot more about Google's smart glasses strategy there. But we already know a lot, since Google talked about and demoed these glasses last year. Now that we're in 2026, all these glasses should finally arrive, and if you've ever thought about getting smart glasses, you'll want to see what they're all about.
All about Gemini
Google, Samsung and Qualcomm have been working together on Android XR, a new OS for a whole range of mixed reality headsets, AI glasses, display glasses and, eventually, augmented reality glasses. The first product of this collaboration, the Samsung Galaxy XR, arrived last fall.
The Galaxy XR is a powerful VR headset that also handles mixed reality computing, much like the Apple Vision Pro and Meta Quest 3. It runs Android apps on its Android XR OS, and it has Gemini AI that can respond to your voice and work live, seeing anything on the device's screen and in the real world through its external cameras.
That Gemini on-tap assistant is exactly what will be the key app for the next wave of smart glasses. Like Meta's Ray-Ban and Oakley glasses, which use Meta AI, Google's glasses will use Gemini and related Gemini apps like Nano Banana and NotebookLM.
Display-enabled glasses will pop up contextual information, such as live map data.
The display-free glasses will use built-in microphones and speakers to respond to AI prompts, handle live language translation, or play music and take calls. The camera can take photos and videos, or activate Gemini Live mode for continuous recording and AI awareness of the world around you.
A second line of display-enabled glasses, with a color display on one lens, will show snapshots taken with the glasses, display phone notifications, play videos, or provide live captions and translations. Other apps will work on the glasses as extensions of what you're doing on your phone: Google Maps can show directions and a map that appears when you tilt your head down, or Uber can show your driver's status.
CNET's Patrick Holland tested a version of the glasses at last year's Google I/O.
Three (or more) design partners
Warby Parker, Korean fashion eyewear brand Gentle Monster and European eyewear group Kering have become official Android XR eyewear partners, meaning all three will launch Android XR glasses lines. Expect a lot of designs and fashion riffs, similar to how Meta's glasses partner EssilorLuxottica makes many frame designs under its Oakley and Ray-Ban brands.
Expect Gucci smart glasses via Kering, and there could be plenty of surprises beyond that. Samsung may be in the mix as well: although Samsung is already a partner helping to make all these other glasses (likely by supplying camera and display components), it will announce its own Android XR glasses at some point, too.
Add to the mix Xreal, a maker of USB-C-connected display glasses, which is building its own Android XR minicomputer called Project Aura (more on that below).
As with Google's many watch partnerships over the years with Android Wear, many types of glasses could fit in.
Project Aura, made by Xreal and Google, is a pair of glasses that can run Android XR apps like a full mixed reality headset. It's part of what's coming this year.
A different kind of AR glasses: Project Aura
The glasses Xreal is making work differently from other smart glasses, acting more like a small VR headset than an all-day pair of eyewear. Project Aura is a special set of Xreal glasses with a larger display and additional cameras, connected to a phone-sized processing puck. Wearing them (which I did last year), you can use apps and 3D experiences and use hand tracking much like on a VR headset.
Project Aura runs the same operating system as the Galaxy XR and uses the same chipset. It's really a sort of scaled-down mixed reality experience, intended to serve both as a development tool for future Google AR glasses that may connect directly to phones and as a real product. But that doesn't mean it's meant to be worn all day. Instead, like other Xreal glasses, it's a sort of wearable monitor for your eyes that can extend the screens around you on the go.
The big difference: How well they’ll work with Google and Android
Google's biggest advantage with Android XR should be how well these devices work with AI apps you may already be using, and with apps on your phone. On Android phones, they should feel deeply integrated with the phone's controls and apps, much like a smartwatch. On iOS, they should still work with Gemini services.
There aren't yet any everyday smart glasses that connect this deeply with the phones in our pockets, and Google's should be the first. Apple may follow next year with its own glasses.
Google has already said that phone notifications should appear as interactive widgets on the glasses, but will more apps build deeper hooks? And will AI services beyond Gemini be allowed? For now, Google says Gemini is the main AI service for its glasses. These glasses will also work with Wear OS watches.
Will the glasses know who's wearing them, and how comfortable will their AI privacy policies feel?
Will Google solve the problems of privacy and public acceptance?
Meta has repeatedly run into trouble for its handling of users' personal data, and improper public use of its smart glasses' cameras has drawn backlash on social media. Meta's AI privacy policies are vague, and Meta isn't seen as a trustworthy company on security or privacy, for very good reason.
Will Google do better? It's considered more reputable, but it's also a company that already weaves ads into our personal data and is increasingly pulling more data, like health and fitness, into its connected AI services. Google will have to explain how responsibly it will handle the glasses going forward, and overcome the hurdles of public acceptance. Will the "Glasshole" moniker come back to bite the company?
The price and release date are unknown
We don’t know when these glasses will arrive, other than “sometime in 2026.” But expect more news from Google I/O on May 19. I’ll be there, and we’ll be reporting on all the AI and smart glasses news as it happens. We should know more then.



