Tech & Innovation - March 23, 2025

Apple's Vision for AI-Enhanced Wearables by 2027

Apple's plan to integrate cameras into its wearable technology is a significant step toward a future where AI interfaces seamlessly with everyday life. The cameras, concealed within the display on standard Apple Watch models and mounted on the side of the Apple Watch Ultra, will let the devices perceive the outside world and deliver relevant information to the wearer. The same approach is also expected to appear in the rumored camera-equipped AirPods.


The Role of Visual Intelligence

Visual Intelligence, an AI feature that debuted on the iPhone 16, works with the phone's camera to perform tasks such as adding event details from a flyer to your calendar or looking up information about a restaurant. Apple plans to extend this functionality to its wearables, aiming to power the feature with its own in-house AI models by 2027.
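To make the calendar example concrete, here is a minimal Swift sketch of the kind of on-device pipeline such a feature implies: Apple's public Vision framework reads the text on a photographed flyer, and EventKit saves the result as a calendar event. The function name, the parsing heuristic, and the use of NSDataDetector are illustrative assumptions; Apple has not described how Visual Intelligence is implemented internally.

import Vision
import EventKit
import Foundation

// Illustrative sketch only: recognize text in a photo of a flyer, then save a
// calendar event from it. The parsing step is a crude stand-in for the
// on-device model a Visual Intelligence-style feature would actually use.
func addEvent(from flyer: CGImage) {
    let request = VNRecognizeTextRequest { request, _ in
        guard let observations = request.results as? [VNRecognizedTextObservation] else { return }
        let lines = observations.compactMap { $0.topCandidates(1).first?.string }

        // Assume the first recognized line is the title; let NSDataDetector
        // pull a date out of the full recognized text.
        let title = lines.first ?? "Untitled event"
        let body = lines.joined(separator: "\n")
        let detector = try? NSDataDetector(types: NSTextCheckingResult.CheckingType.date.rawValue)
        let match = detector?.firstMatch(in: body, options: [], range: NSRange(body.startIndex..., in: body))
        guard let start = match?.date else { return }

        let store = EKEventStore()
        store.requestFullAccessToEvents { granted, _ in   // iOS 17+ permission API
            guard granted else { return }
            let event = EKEvent(eventStore: store)
            event.title = title
            event.startDate = start
            event.endDate = start.addingTimeInterval(60 * 60)   // default to a one-hour event
            event.calendar = store.defaultCalendarForNewEvents
            try? store.save(event, span: .thisEvent)
        }
    }
    request.recognitionLevel = .accurate

    let handler = VNImageRequestHandler(cgImage: flyer, options: [:])
    try? handler.perform([request])
}

In practice the parsing would be handled by a model rather than a fixed heuristic, and a camera-equipped Watch or AirPods would presumably lean on a paired iPhone for the heavy lifting, but the overall shape (capture, recognize, act) is what the feature describes.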

The Leadership of Mike Rockwell

The development and integration of AI features into Apple's wearables will depend heavily on Mike Rockwell. Recently tasked with getting the delayed Siri LLM upgrade back on track, Rockwell previously led the Vision Pro effort and is expected to continue working on visionOS, the software likely to power another Apple wearable with a significant AI component.

Looking Towards the Future: AR Glasses

Apple's long-term vision includes AR glasses, a product in the vein of Meta's Orion prototype. Expected to run visionOS and rely heavily on AI, the glasses represent the next step in Apple's effort to blend technology into everyday life.

By pairing cameras with AI, Apple's wearables would be able to perceive their surroundings and surface relevant information, bringing the company closer to the seamless interface between technology and everyday life that it envisions.