
Apple apologizes for listening to Siri conversations

Key Points
  • Apple apologizes for a Siri grading program.
  • The company had been allowing contractors to listen to a small percentage of the things people spoke to Siri.
  • It suspended the program but will relaunch it this fall, letting users opt in if they want to help Apple improve Siri.
[Video: Apple apologizes, will no longer store Siri recordings]

Apple on Wednesday apologized for its "Siri grading program," which allowed contractors to review a small percentage of the things people spoke to its Siri voice assistant. The company said it will make several changes that give users more control over how their Siri requests are handled.

The program was halted earlier this month after The Guardian reported on July 26 that some of the workers reviewing Siri requests heard personal medical details, drug deals and more. Unlike Amazon and Google, however, Apple does most of its Siri processing on the device rather than sending it to the cloud.

Still, there's no way to find out if you might have been among the small percentage of people whose questions to Siri were heard by people working for Apple. Apple also doesn't let you review the questions you've asked Siri, a feature that both Amazon Alexa and Google Assistant offer.

In a post to its site, Apple said that, by default, it will "no longer retain audio recordings of Siri interactions" but that it will still use "computer-generated transcripts to help Siri improve." Users will be able to opt in to help Apple improve Siri, and those who do will also be able to opt out whenever they want to.

Apple also said that only its own employees, not outside contractors, "will be allowed to listen to audio samples of the Siri interactions," and that the team will "delete any recording which is determined to be an inadvertent trigger of Siri." According to a report from The Guardian on Wednesday, Apple laid off more than 300 contractors who were working on Siri grading in Europe.

"These transcriptions are associated with a random identifier, not your Apple ID, for up to six months," according to a new Siri Privacy and Grading page that Apple published on Wednesday. "If you do not want transcriptions of your Siri audio recordings to be retained, you can disable Siri and Dictation in Settings."

The page also explained that Apple's grading process reviewed less than 0.2% of Siri requests, and that Apple used grading to "measure how well Siri was responding and to improve its reliability."

"For example, did the user intend to wake Siri? Did Siri hear the request accurately? And did Siri respond appropriately to the request? By using grading across a small sample of Siri requests over time, Apple can make big improvements that help ensure that our customers around the world have the best Siri experience possible," Apple's new Siri privacy page says.

Amazon also grades how well Alexa performs, but lets users opt out of the program, which is enabled by default. Google suspended a similar practice in Europe earlier this month. In July, Google admitted that contractors leaked more than 1,000 voice recordings from Google Assistant, and voices in the clips were identifiable by what was spoken, according to Belgian news site VRT.

[Video: What happens when you ask Alexa, Google and Siri if they are spying on you]

