As explained by Bloomberg's Mark Gurman, the live translation feature can translate a conversation from, for example, Spanish to English by playing a translated version of the speech into the English speaker's AirPods. When the English speaker responds, their iPhone then plays the Spanish translation through its speakers.
In addition to the live translation feature, Apple is also working on an AI upgrade for Siri, although the company recently announced a delay, stating that it will take longer than anticipated to deliver these features. Apple is also planning a design overhaul for iOS, iPadOS, and macOS this year.
Apple is a little behind here: Google brought live translation to its very first pair of Pixel Buds in 2017 and later expanded it to the Pixel Buds Pro in 2022. Google's years of experience in this area could give it an edge, but Apple's reputation for seamless integration of hardware and software could make its translation feature a strong competitor. It remains to be seen how the two implementations will compare.