Apple apologizes for having contractors listen to Siri recordings and announces privacy changes
Following outcry over human "grading" of Siri recordings, Apple has issued an apology and promised that it will no longer retain recordings of interactions with the digital assistant unless given explicit permission.
The company says that "we haven't been fully living up to our high ideals, and for that we apologize". Having suspended the human grading of Siri requests, Apple is now making fundamental changes to its privacy policy: only Apple employees, rather than contractors, will be able to listen to recordings, and users will have to opt in for this to happen.
See also:
- Apple puts the kibosh on vulnerability that let iPhone users jailbreak iOS 12.4
- Privacy: Google stops transcribing Assistant recordings and Apple stops listening to Siri recordings
- Privacy: Apple workers may well hear all of your sordid secrets via Siri
Apple plans to resume the Siri grading program later in the fall, but says that unless users opt into human grading, recordings will be analyzed only by computer. The company also stresses that "when we store Siri data on our servers, we don't use it to build a marketing profile and we never sell it to anyone. We use Siri data only to improve Siri, and we are constantly developing technologies to make Siri even more private".
Keen to get across the notion that privacy has always been at the heart of the digital assistant, Apple says that "Siri uses as little data as possible to deliver an accurate result", and stresses that random identifiers are used to anonymize users.
Apple outlines three key changes it will be making in response to the concerns that have been expressed about privacy:
- First, by default, we will no longer retain audio recordings of Siri interactions. We will continue to use computer-generated transcripts to help Siri improve.
- Second, users will be able to opt in to help Siri improve by learning from the audio samples of their requests. We hope that many people will choose to help Siri get better, knowing that Apple respects their data and has strong privacy controls in place. Those who choose to participate will be able to opt out at any time.
- Third, when customers opt in, only Apple employees will be allowed to listen to audio samples of the Siri interactions. Our team will work to delete any recording which is determined to be an inadvertent trigger of Siri.
The new opt-in option will be added in a software update later in the year. More information is available on the Siri Privacy and Grading support page.
Image credit: Lori Butcher / Shutterstock