What you need to know
- Meta announced that it is opening the doors for third-party developers to start building mobile and web apps for Ray-Ban Display.
- The Developer Preview will roll out over the next few weeks, but developers can get a head start before it arrives.
- Ray-Ban Display is expected to get Meta’s new AI model, Muse Spark, this summer, and Meta Connect 2026 is set for this September.
Meta announced this week that its display-equipped glasses, Ray-Ban Display, are opening their doors to developers who want to build apps for them.
In a press release, Meta says that while developers are experimenting with designs for AI glasses, Ray-Ban Display now gives them two ways to build future applications. Through the Developer Preview, Meta is expanding access so developers can start building mobile and web apps. The company adds: “Whether you’re extending an existing iOS or Android mobile app or building something entirely new, you can create screen experiences using familiar tools.”
This emphasizes that developers don’t have to worry about building a development kit from scratch, because the environment – the platform – is already there.
Devs can start early
Here’s what’s coming to Ray-Ban Display as availability rolls out over the next few weeks.
The Meta Wearables Device Access Toolkit is where developers will find what they need for mobile apps on Ray-Ban Display. It is a native SDK for Android and iOS that lets developers extend their apps onto the device’s screen, with tools to add UI elements such as “reading text, images, lists, buttons and videos”. The new web-app route, meanwhile, lets developers build using HTML, CSS, and JavaScript.
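To make the “familiar tools” point concrete, here is a minimal sketch of the kind of glanceable, text-first web UI Meta describes (text, lists, buttons), built with plain JavaScript and HTML strings. The toolkit’s actual APIs are not public yet, so nothing here is a Meta SDK call; `renderCard` and its input shape are purely illustrative.

```javascript
// Hypothetical helper: build a small "glanceable card" as an HTML string.
// This is NOT part of any Meta SDK — it only illustrates that ordinary
// web skills (HTML/CSS/JS) are what the web-app route asks for.
function renderCard({ title, items, buttonLabel }) {
  // Turn each item into a list entry.
  const listItems = items.map((item) => `<li>${item}</li>`).join("");
  return [
    `<section class="card">`,
    `  <h1>${title}</h1>`,
    `  <ul>${listItems}</ul>`,
    `  <button type="button">${buttonLabel}</button>`,
    `</section>`,
  ].join("\n");
}

// Example: a tiny agenda card like one you might glance at on a lens display.
const html = renderCard({
  title: "Today",
  items: ["Stand-up 9:30", "Lunch with Sam"],
  buttonLabel: "Dismiss",
});
console.log(html);
```

In a browser you would assign this markup to a container element; the point is simply that no exotic toolchain is required.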
A big part of Ray-Ban Display is the neural band, which reads your hand gestures to control actions on the lens screen. Developers can take advantage of it by adding informative overlays, real-time data (think sports scores), media streaming, and more.
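The “real-time data” idea can be sketched in a few lines: fetch a score from some feed, format it into the short string an overlay would show, and push it to the display. `formatScore`, `refreshOverlay`, and the data shape below are all hypothetical names for illustration; the example stubs out the network call rather than inventing a real score API.

```javascript
// Hypothetical: format live score data into a short, glanceable overlay line.
function formatScore({ home, away, homeScore, awayScore, clock }) {
  return `${home} ${homeScore} - ${away} ${awayScore} (${clock})`;
}

// Hypothetical polling step: fetch the latest score, render the overlay text.
// `fetchScore` and `render` are injected so the sketch stays self-contained.
async function refreshOverlay(fetchScore, render) {
  const score = await fetchScore(); // in practice, e.g. fetch() against a feed
  render(formatScore(score));       // hand the text to whatever draws the overlay
}

// Run with stubbed data (no network needed):
refreshOverlay(
  async () => ({
    home: "LAL", homeScore: 98,
    away: "BOS", awayScore: 101,
    clock: "Q4 2:31",
  }),
  (text) => console.log(text)
);
```

Keeping the formatting separate from the data source means the same overlay code works whether the score arrives from a poll, a WebSocket, or a push notification.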
More is coming
This is the second piece of good news for Ray-Ban Display this week; the first was all about Muse Spark. To be fair, that AI announcement only half applies to the Display. Earlier this week, Meta announced its new LLM, Muse Spark, aimed at its AI glasses. It gives users access to a smarter, more accurate AI model that can handle tasks and queries quickly thanks to its use of multiple AI agents.
This is rolling out for Gen 1 and Gen 2 of Meta’s AI glasses; the Display, however, will have to wait until this summer to get it. That’s not all: Meta Connect 2026 has been confirmed for September 23-24. There weren’t many details, but the main topics of discussion will be artificial intelligence, VR, wearables, and more.
Android Central’s Take
This can only be a good thing for Meta. That said, it feels like the company is trying to make sure it can keep up with the changing tide. More and more companies are pushing ahead with AI glasses and courting the developers building apps for them. If Meta wants to stay relevant and in contention, opening its doors to outside developers is a smart move.