Apple contractors will no longer be able to listen to your Siri recordings

Type to Siri

Last week, The Guardian revealed that Apple uses human contractors to review Siri recordings — and that the contractors often end up hearing everything from drug deals to sexual encounters as a result.

Apple said Friday that it has suspended the use of those contractors while it reviews the process. A future software update will also let users opt out of having their Siri conversations reviewed by human contractors.

In an email to The Washington Post, an Apple spokesperson said that Apple is committed to user privacy. Following Apple’s announcement, Google said it would also temporarily suspend the use of humans to review “OK Google” voice assistant recordings. We’ve reached out to both Google and Apple for more details and will update this story if we hear back.

The human contractors reviewed conversations with Siri in order to improve the virtual assistant. When Siri was triggered but failed to provide an answer, contractors would listen to the recording to determine whether she should have been able to respond in that instance.

The goal behind the human listeners was to better understand situations where Siri might be failing so that she could be improved going forward. They would listen to a recording, determine if the user meant to activate Siri, and grade her response (or lack thereof).

That human element, however, wasn’t made clear to users.

A whistleblower who worked as a Siri reviewer told The Guardian that the contractors would often be able to hear things like sensitive medical information, and that recordings were accompanied by user data showing users’ location and some personal details.

Less than 1% of daily Siri activations were sent to human contractors for evaluation.

“User requests are not associated with the user’s Apple ID. Siri responses are analyzed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements,” Apple said in a statement to The Guardian when the issue was originally revealed.

Apple isn’t the first company to run into this issue. Earlier this year, Amazon came under fire for similar issues with its personal assistant, Alexa. Much like Apple, Amazon uses humans to analyze recordings of Alexa’s answers. It also retains text data of requests, even when a user deletes a recording.

Amazon now offers an option within Alexa’s settings where customers can opt out of their data being used to improve the service.

Emily Price
Former Digital Trends Contributor
Emily is a freelance writer based in San Francisco. Her book "Productivity Hacks: 500+ Easy Ways to Accomplish More at…