Meta Brings Third-Party Apps and Games to its Display Glasses

When I reviewed the Meta Ray-Ban Display glasses, one of my disappointments was that only a handful of applications could take advantage of the frames’ display, and all of them were made by Meta. Now, that is finally changing.
Meta is opening up the glasses to third-party developers, who can now test display-enabled applications and tap into the Neural Band’s controls. The platform will work with apps that have an iOS or Android mobile app, and it will support web apps as well, the company said in an update.
With a 20-degree field of view, the glasses’ display is far more limited than those of standalone AR headsets. And it sounds like Meta is looking for apps that fit a small, glanceable display, like “information overlays.” For example, Meta CTO Andrew Bosworth shared a video of one early app called “Darkroom Buddy,” an interactive guide to developing film that can serve as a hands-free, at-a-glance reference.
The gap between the idea and the prototype has never been smaller. Add displays and inputs like the Neural Band, and it feels like the early days of building in a way we haven’t seen in over a decade.
We are releasing web apps and mobile SDK for Meta Ray-Ban Display. Developer Preview… pic.twitter.com/OlDayAkozd
– Boz (@boztank) May 14, 2026
The company also suggests that developers can create media streaming experiences, “real-time data displays, such as scores or status updates” and other small, focused apps. Meta is apparently open to smaller games, too: the video it shared includes several examples, including chess, Snake and a brick-breaker-style game. (The glasses already ship with one Meta-made puzzle game, though I didn’t find it compelling.)
The addition of third-party apps could unlock more functionality for the $800 glasses the company launched last fall. Since launch, Meta has added new features like a built-in teleprompter and handwriting input, but the glasses still feel limited. For example, I was eager to use the glasses while cooking, but frustrated that my only option for getting a recipe on the display was to ask Meta AI for one. I doubt any major recipe app developers are thinking about this device right now, but I’d like to know that it’s at least possible.
It’s not clear exactly when these third-party apps will be available. Meta announced third-party app support for its non-display smart glasses last year, but many of those apps still aren’t available. One thing I’ll be keeping an eye on is how these new experiences affect battery life, as I’ve found that display-heavy apps can drain the glasses’ battery quickly.
The timing of Meta’s announcement is noteworthy. The company recently announced the dates for its next Connect event, where we’re likely to get more updates on its entire smart glasses lineup. CEO Mark Zuckerberg has also teased new glasses that could be the next iteration of the Meta Ray-Ban Display frames.
For people who already own a pair, Meta is also rolling out a few notable updates. The previously mentioned “neural handwriting” feature, which lets you respond to messages by tracing letters with your finger, is now available to everyone. The company is also adding a recording capability that lets you capture what’s shown on the display in your field of view and share it with others. I’ve personally found it difficult to explain to people exactly what it’s like to use a heads-up display, so I’m looking forward to trying that out. And, finally, Meta is bringing live highlights to calls made via Messenger, WhatsApp and Instagram.