Last week, The Guardian revealed that Apple uses human contractors to review Siri recordings, and that those contractors often hear everything from drug deals to sexual encounters as a result.
Apple said Friday that it has suspended the use of those contractors while it reviews the process. In a future software update, it will let users opt out of having their Siri recordings reviewed by human contractors.
In an email to The Washington Post, an Apple spokesperson said the company is committed to user privacy. Following Apple’s announcement, Google said it would also temporarily suspend the use of humans to review “OK Google” voice assistant recordings. We’ve reached out to both Google and Apple for more details and will update this story if we hear back.
The human contractors reviewed conversations with Siri in order to improve the virtual assistant. They would listen to recordings in which Siri was triggered but did not provide an answer, to determine whether she should have been able to respond in that instance.
The goal of the human review was to better understand situations where Siri might be failing so that she could be improved going forward. Reviewers would listen to a recording, determine whether the user meant to activate Siri, and grade her response (or lack thereof).
That human element, however, wasn’t made clear to users.
A whistleblower who worked as a Siri reviewer told The Guardian that contractors could often hear things like sensitive medical information, and that recordings were accompanied by data showing the user’s location and some personal details.
Less than 1% of daily Siri activations were sent to human contractors for evaluation.
“User requests are not associated with the user’s Apple ID. Siri responses are analyzed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements,” Apple said in a statement to The Guardian when the issue was originally revealed.
Apple isn’t the first company to run into this issue. Earlier this year, Amazon came under fire for a similar practice with its personal assistant, Alexa. Much like Apple, Amazon uses humans to analyze Alexa recordings. It also retains text data of requests, even after a user deletes a recording.
Amazon now offers an option within Alexa’s settings where customers can opt out of their data being used to improve the service.