Visual Intelligence, an AI feature that debuted on the iPhone 16, works with the phone's camera to perform tasks such as adding event details from a flyer to your calendar or looking up information about a restaurant. Apple plans to extend this functionality to its wearables, aiming to power the feature with its own in-house AI models by 2027.
The development and integration of AI features into Apple's wearables will depend heavily on Mike Rockwell. Recently tasked with getting the delayed Siri LLM upgrade back on track, Rockwell previously led the Vision Pro effort and is expected to keep working on visionOS, the software set to power another Apple wearable with a significant AI component.
Apple's long-term vision includes the release of AR glasses, a concept similar to Meta's Orion. Expected to run visionOS and carry a substantial AI component, the glasses represent the next step in Apple's wearables roadmap.
With cameras and AI on board, Apple's wearables would not only perceive the outside world but also surface relevant information to the user, creating a seamless interface between technology and everyday life.