
In short: According to a Bloomberg report published by Mark Gurman on April 12, 2026, Apple is testing at least four frame styles for its upcoming AI-powered smart glasses. Designs include a large rectangular style similar to classic Wayfarer frames, a slimmer rectangular style comparable to the one worn by CEO Tim Cook, a larger oval frame, and a smaller round option. The frames are made of acetate rather than standard plastic. The device includes two cameras, one for taking photos and videos and the other for computer vision, and runs on the N401 chip, a custom processor based on the Apple Watch S-series architecture. The first version does not have a screen. Apple is aiming to start production in December 2026, with a public launch expected in the spring or summer of 2027.
Four styles, one material
The four frame designs cover different aesthetic registers. The large rectangular variant resembles the classic Wayfarer-style frames, a basic wearable shape with wide consumer recognition. A slimmer rectangular option, similar to the frames worn by Tim Cook, appears calibrated for professional wear. The two oval formats range from a larger, more expressive shape to a smaller, minimal design. The breadth of options being tested shows that Apple is not yet committed to a single visual language for the product and is gauging which shapes appeal to the widest range of potential users, in line with how Apple typically explored early watch designs before settling on a single case.
The material is acetate, described in Gurman’s report as more durable and luxurious than standard plastic, a distinction that positions the glasses not against budget-level wearables but against Meta’s existing Ray-Ban line. Colors confirmed in the testing phase include black, ocean blue, and light brown, and more options are expected before the final product is announced. The camera module on the front of the frame uses a vertically oriented oval arrangement with indicator lights surrounding it, a configuration that differs from the horizontal lens placement used on Meta’s Ray-Ban models and is intended to identify the device as an Apple product from a distance. Apple is aiming for a weight under 50 grams and all-day battery life. A price of about $499 has circulated in secondary coverage, though Gurman did not confirm a figure in his April 12 report.
Hardware and intelligence
The N401 chip is a custom low-power processor derived from the Apple Watch S-series architecture, optimized for on-device performance within the thermal and battery limitations that an eyeglass frame can accommodate. The glasses include two cameras: the main camera handles photo and video recording, while the second is dedicated to computer vision, providing real-time environmental awareness for Siri and Apple Intelligence without requiring the phone to be picked up or unlocked. Microphones and location sensors are also integrated. The first version has no display, which means that all information reaches the user through speakers or the iPhone screen, and the glasses offload to the iPhone any computationally intensive processing that cannot be handled on the device.
The main interface is Siri, which will handle notifications, music playback, phone calls, live translation, and visual intelligence queries about the user’s surroundings. The version of Siri these glasses will run is the overhauled assistant that Apple announced in January 2026, powered in part by a custom Gemini model developed in partnership with Google. That underlying system has been in development for some time: Apple Intelligence accidentally launched in China before regulatory approval was granted by the Cyberspace Administration on March 30, 2026, a moment that confirmed the software was ready while highlighting the compliance hurdles Apple must navigate in markets outside the United States.
Apple enters the market
The smart glasses category that Apple is poised to enter has been commercially validated by Meta over the past two years. Meta sold more than seven million Ray-Ban and Oakley AI frames in 2025, more than tripling its 2024 volume in a product category that barely existed three years ago. The latest iteration of Meta’s strategy expanded the product to corrective eyewear: Meta launched Ray-Ban prescription smart glasses to reach billions of eyewear buyers, a direct bid to convert the approximately 69% of the global eyewear market that requires corrective lenses and has been excluded by standard smart glasses. Apple’s latest major hardware release, the MacBook Air updated with the M5 in March 2026, continued the company’s streak of incremental but commercially significant updates across product lines even as the smart glasses project moved from research to active engineering prototypes.
Google is also set to enter the category with Warby Parker and Gentle Monster via its Android XR platform in 2026, aiming to launch both audio-only and display-equipped variants before Apple. Apple’s 2027 entry means it arrives after Meta has established commercial viability and after Google’s first models hit the market, a position Apple has occupied before: it came after BlackBerry in smartphones and after Fitbit in wearables, and in both cases produced a product that changed what users expected from the category. Apple’s current iPhone user base of more than one billion active devices gives its glasses a distribution advantage that neither Meta nor Google can replicate from a standing start.
A broader strategy
The four frame styles and technical specifications are part of a three-pronged AI wearable plan that Gurman first reported in February 2026. In addition to smart glasses, Apple is developing a camera-equipped AI pendant about the size of an AirTag, designed to be clipped to clothing and continuously deliver visual context to Siri. The company is also working on AirPods equipped with cameras. All three devices are designed to work together as environmental input channels for Apple Intelligence, creating a distributed sensor layer that extends Siri’s awareness to different positions on the body rather than concentrating it in a single device. The pendant and camera AirPods are at earlier stages of development than the glasses, with the pendant a likely candidate for a 2027 release and the AirPods cameras likely coming the following year.
Apple’s bet on ambient AI hardware reflects a broader pattern in how the tech industry allocates resources. In March 2026, Meta’s Reality Labs cut hundreds of jobs in recruiting and sales even as the company expanded its investment in Ray-Ban hardware and raised production targets; the economics of the shift to AI hardware push companies to redeploy from traditional functions rather than simply increase headcount. TNW reported in August 2025 that the next generation of AI unicorns may hire no one, analyzing a structural change in which companies are built with significantly smaller teams that achieve more through AI capabilities. Apple’s wearables strategy is a consumer-facing expression of the same dynamic: hardware that expands personal capabilities through ambient intelligence, builds on the iPhone ecosystem that already provides a processing foundation, and is designed to be worn all day rather than carried in a pocket or set on a desk.
