Apple suspends program that let its employees listen to your Siri recordings

Key Points
  • Apple is suspending a program that let staffers listen to a fraction of the voice recordings uploaded anonymously by Siri to its servers.
  • Apple called it "grading," used for improving the accuracy of Siri's voice recognition.
  • Apple will let users opt out of grading in a future software update.
  • Google and Amazon also analyze recordings, but give users better controls for reviewing and deleting what they've said in the past.
A prototype of Apple's new HomePod is displayed during the 2017 Apple Worldwide Developer Conference (WWDC) at the San Jose Convention Center on June 5, 2017 in San Jose, California.

Apple said it will suspend a program that allowed its staff to listen to conversations people have had with the Siri voice assistant.

Under the program, if you said, "Hey Siri, what's the weather today?" someone at Apple could listen to a recording of that question. Apple says it doesn't listen to all conversations, just a small fraction. The company also says the recordings are anonymous.

Apple calls the practice "grading." Google and Amazon do the same thing with their voice assistants, and all three companies say it's to improve speech recognition. While Apple hasn't hidden the fact that it uploads this data anonymously, The Guardian reported on July 26 that workers who listen have heard about drug deals, medical details and more.

Apple says Siri grading is anonymous, and the data sent back to Apple is encrypted and randomized, which means it would be hard to identify a person unless specific things were mentioned in a recording, like an address or a name. Apple also does most of its processing on device, only sending a fraction of requests to the servers. Amazon and Google do the processing in the cloud. Still, bad things can happen, as The Guardian noted and as other contractors who worked with Google have discovered.

In July, Google admitted that contractors had leaked more than 1,000 voice recordings from Google Assistant. Voices in the clips were identifiable from what was spoken, according to Belgian news site VRT, which obtained them. As a result, Google on Wednesday suspended its analysis of Google Assistant recordings in Europe.

But Apple, which has been putting a bigger focus on privacy, has been slow to add some features offered elsewhere. Amazon, for example, has had a feature that lets you opt out of a similar program it operates. Google and Amazon let you review and delete everything you've asked their voice assistants.

"User voice recordings are saved for a six-month period so that the recognition system can utilize them to better understand the user's voice," according to an Apple white paper. "After six months, another copy is saved, without its identifier, for use by Apple in improving and developing Siri for up to two years."

Apple does not let you review the questions you've asked Siri and has not yet rolled out a feature that lets you opt out of the possibility that your voice recordings will be reviewed by employees. It says the latter is coming. An Apple spokesperson would not say if the company plans to let you review and delete your history of Siri questions.

The Verge has a guide showing how you can currently delete Siri recordings, but it illustrates just how clunky the process is and why Apple needs to make it easier.

"We are committed to delivering a great Siri experience while protecting user privacy. While we conduct a thorough review, we are suspending Siri grading globally," Apple said in a statement. "Additionally, as part of a future software update, users will have the ability to choose to participate in grading." A company spokesperson said it had no additional information when asked if it will match privacy features offered by Google and Amazon that allow you to review and delete previous questions.

What happens when you ask Alexa, Google and Siri if they are spying on you