
Apple contractors listening to Siri requests hear sex, drug deals, and more


Apple contractors routinely hear sensitive audio, including confidential medical information, couples having sex, and drug deals, as part of quality-control work on the company’s virtual assistant, Siri, The Guardian reports.


The recordings are passed on to contractors, who are asked to determine whether Siri’s activation was intentional or accidental and to grade Siri’s responses.

Fewer than 1% of daily Siri activations are sent to a human for grading, but Apple does not expressly tell customers that their recordings might be used this way. The practice was brought to light by an anonymous whistleblower who spoke to The Guardian. That individual said the recordings often capture sexual encounters as well as business dealings, and that they feel Apple should expressly tell users that Siri content might be reviewed by a human.

“A small portion of Siri requests are analyzed to improve Siri and dictation,” Apple told The Guardian in a statement. “User requests are not associated with the user’s Apple ID. Siri responses are analyzed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.”

We reached out to Apple for additional details but have yet to receive a response. We’ll update this story if we hear back.

Siri can sometimes turn on and start listening if it mistakenly detects its wake word (typically “Hey Siri” or something similar), even if you didn’t mean to trigger it.

The humans who listen to these conversations (or worse) work to determine what the person who was recorded was asking for and whether Siri provided it. If not, they determine whether Siri should realistically have been able to answer the question.

If the complaints about Apple sound familiar, it’s likely because Amazon faced a similar controversy earlier this year. While Amazon also sends recordings to humans for later analysis and retains text data of requests even when recordings are deleted, the company offers an option within Alexa’s settings that lets customers opt out of having their data used for that purpose.

Apple does not currently offer an opt-out option for Siri.
