Apple apologizes for having contractors listen to Siri recordings and announces privacy changes


Following outcry over human "grading" of Siri recordings, Apple has issued an apology and promised that it will no longer retain recordings of interactions with the digital assistant unless given explicit permission.

The company says that "we haven't been fully living up to our high ideals, and for that we apologize". Having suspended the human grading of Siri requests, Apple is now making fundamental changes to its privacy policy: only Apple employees, rather than contractors, will be able to listen to recordings, and users will have to opt in for this to happen.


Apple plans to resume the Siri grading program later in the fall, but says that unless users opt into human grading, recordings will only be analyzed by computer. The company also stresses that "when we store Siri data on our servers, we don't use it to build a marketing profile and we never sell it to anyone. We use Siri data only to improve Siri, and we are constantly developing technologies to make Siri even more private".

Keen to get across the notion that privacy has always been at the heart of the digital assistant, Apple says that "Siri uses as little data as possible to deliver an accurate result", and stresses that random identifiers are used to anonymize users.

Apple outlines three key changes it will be making in response to the concerns that have been expressed about privacy:

  • First, by default, we will no longer retain audio recordings of Siri interactions. We will continue to use computer-generated transcripts to help Siri improve.
  • Second, users will be able to opt in to help Siri improve by learning from the audio samples of their requests. We hope that many people will choose to help Siri get better, knowing that Apple respects their data and has strong privacy controls in place. Those who choose to participate will be able to opt out at any time.
  • Third, when customers opt in, only Apple employees will be allowed to listen to audio samples of the Siri interactions. Our team will work to delete any recording which is determined to be an inadvertent trigger of Siri.

The new opt-in option will be added in a software update later this year. More information is available on the Siri Privacy and Grading support page.

Image credit: Lori Butcher / Shutterstock

4 Responses to Apple apologizes for having contractors listen to Siri recordings and announces privacy changes

  1. TechFan says:

    This is all good, but let's be honest, the entire tech industry was doing it, and doing it for a reason (not to spy): to improve cutting-edge services. I'd say it's all good, because they probably got 80% of what they needed over the years to make it better. And it will improve (at a slower rate) with a lot of people doing the opt-in.

    I do like the first one, and think all tech companies should do this - there is no reason to retain old audio clips.

  2. realDonaldTrump says:

    This would have been a dream job.

  3. They all do evil, some of it is inadvertent, much is intentional and profit motive based.

  4. Georgia Little says:

    I wonder if this was the reason Cook cashed out a huge amount of Apple stock?
