sceptic123 5 hours ago
Can you explain how you know that trusted headphones aren't necessary, and where Apple says what you're quoting here?
STKFLT 3 hours ago
Those are fair questions. Here is what Apple says in the press release:

> Live Translation with AirPods uses Apple Intelligence to let Apple users communicate across languages. Bringing a sophisticated feature like this to other devices creates challenges that take time to solve. For example, we designed Live Translation so that our users’ conversations stay private — they’re processed on device and are never accessible to Apple — and our teams are doing additional engineering work to make sure they won’t be exposed to other companies or developers either.

We know the AirPods-only restriction isn't necessary because Apple itself believes the feature can be brought to other devices and is working on it. That's a pretty good indication that AirPods and their associated stack are currently treated differently for a feature that fundamentally boils down to streaming audio to and from the headphones. It's not even clear how 'securing' live-translated audio differs from 'securing' a FaceTime call in your native language.

I think a reasonable reading, absent more technical information from Apple, is that they give AirPods more data and control over the device than is necessary, and that they want us to be mad at the DMA for forcing them to fix it.