Apple’s AI glasses are coming, but not the way you think



Apple is reportedly accelerating development of its new wearable: AI-powered smart glasses designed to compete directly with Meta’s Ray-Bans. According to Bloomberg, the Cupertino company is targeting a late 2026 launch and has begun preparing to produce prototypes at scale.

This new report clarifies and updates earlier information reported by our publication, which speculated that Apple would debut its smart glasses alongside custom chips based on Apple Watch architecture. At the time, sources suggested that chip production would begin in summer 2026. However, Bloomberg now confirms that Apple aims to start producing significant quantities of the smart glasses themselves — not just their components — by the end of 2025, signaling a more aggressive timeline than previously expected.

While Apple’s futuristic AR glasses remain years away, this new device, codenamed N401, will bring real-world context to Siri in a more stylish form factor than a headset. Think of them as a “Vision Light”: smart glasses with onboard cameras, speakers, and microphones capable of live translation, call handling, and turn-by-turn directions. Unlike the Vision Pro headset, the glasses will not feature augmented reality displays. Instead, they rely on audio and voice feedback, a deliberate move to reduce complexity, cost, and bulk. This confirms earlier reports that Apple is working on both an AR model and a non-AR model, with N401 clearly representing the latter.

Apple’s glasses are part of a broader strategy to stay relevant in the rapidly growing market for AI-enabled devices. Meta and Google are already in the same space, and OpenAI’s newly announced hardware partnership with Jony Ive is set to crowd the field further.

Also worth noting: this latest report makes no mention of the specialized chips named in the prior leak, Glennie for AirPods and Nevis for Apple Watch, though it does reiterate that Apple is facing challenges around integrating camera and sensor data into a lightweight frame. The concept of offloading processing to the iPhone, also mentioned in earlier reporting, still holds relevance as Apple continues to wrestle with balancing performance and power efficiency.

What about Apple’s other wearable experiments?

Not all ideas are making the cut. Apple has reportedly shelved development of a smartwatch with an integrated camera and real-world analysis capabilities. That device was meant to bring environment-sensing features to the wrist but has been scrapped due to technical hurdles and privacy concerns.

Internally, the smart glasses project is said to be facing some of the same AI limitations Apple has struggled with elsewhere. While Meta’s Ray-Bans benefit from Llama, and Android XR glasses lean on Google’s Gemini, Apple has so far relied on third-party AI, such as OpenAI’s models and Google Lens, for visual understanding on the iPhone. Analysts expect Apple to debut its own proprietary models soon, potentially alongside the new smart glasses.

If all goes as planned, Apple’s AI glasses will serve as the company’s first real foray into AI-first wearables and promise to redefine how we interact with everyday life. Whether Apple can beat Meta at its own game remains to be seen, but the company isn’t content to sit on the sidelines.







