From unprompted F-bombs to lights and music mysteriously turning on, Amazon Alexa and Google Assistant have recently terrified some unsuspecting smart speaker owners.
According to a recent Wall Street Journal article, incidents involving drug alarms, the unprompted consoling of a crying woman, and a recording of a man yelling F-bombs through Amazon and Google smart speakers highlight the growing pains associated with the technology. They also show how the devices can sometimes be downright spooky.
“It’s going to be OK.”
According to the WSJ, a young woman in Albuquerque, New Mexico, who was crying after quitting her job heard the words, “It’s going to be OK,” from Alexa via the Echo Dot speaker on her nightstand. She had not asked the speaker for consolation. She immediately unplugged the device and stashed it in a drawer. Oddly, her Alexa history showed no record of a voice command.
Cocaine and reefer alarms
Another example: A woman was watching TV one day when, out of the blue, her Google Home Mini announced that it had set an alarm “for cocaine and reefer.” It turns out the family’s Google Home activity showed that a pastor on TV had said, “They lose their love for cocaine and reefer,” while speaking about addiction and the benefits of spirituality. Somehow, Google interpreted the phrase as a directive to set a drug alarm, which freaked out the residents.
“In very rare instances, the Google Home may experience what we call a ‘false accept.’ This means there was some noise or words in the background that our software interpreted to be the hotword (‘OK Google’ or ‘Hey Google’),” a Google spokesperson told the Journal. “We work very hard to help prevent against this, and have a number of protections in place.”
Smart speaker glitches aren’t exactly rare. Google recently acknowledged a rash of Assistant errors in which the digital assistant stopped responding to its wake word, defaulted to English as its primary language, and even switched the gender of its voice. Those instances are weird enough, but the Wall Street Journal’s examples are downright creepy.
F-bombs
Another one: A St. Louis couple was visiting family last winter when a man’s voice suddenly erupted from an Amazon Echo speaker, spewing expletives. In this case, Alexa’s history in the app showed that the Echo interpreted instructions to “play another person” as a command to play a track called “Another Person,” which reportedly features a string of F-bombs. (Side note: a search of Spotify and iTunes for “Another Person” yielded just two results, neither of which contains profanity.)
“The device detects the wake word by identifying acoustic patterns that match the wake word, and will only respond after it is detected,” an Amazon spokesperson told the WSJ. “In rare cases, Echo devices will wake up due to a word in background conversation sounding like ‘Alexa’ or the chosen wake word.”
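To picture how a “false accept” like the ones Amazon and Google describe can happen, here’s a toy sketch in Python. Real devices score acoustic features on-device against a wake-word model; the string-similarity stand-in, the 0.80 threshold, and every name below are hypothetical, used purely for illustration.

```python
# Toy illustration of threshold-based wake-word spotting and how a
# "false accept" can occur. The similarity function, threshold, and
# all names here are hypothetical -- this is not Amazon's or Google's
# actual detector, which compares audio features, not spelled-out text.

from difflib import SequenceMatcher

WAKE_WORD = "alexa"
THRESHOLD = 0.80  # hypothetical confidence cutoff


def confidence(heard: str) -> float:
    """Stand-in for an acoustic model: score how closely a heard
    word matches the wake word."""
    return SequenceMatcher(None, heard.lower(), WAKE_WORD).ratio()


def detect(utterance: str) -> bool:
    """Wake only if some word in the utterance scores above threshold."""
    return any(confidence(word) >= THRESHOLD for word in utterance.split())


print(detect("hey alexa play music"))    # True  -- intended wake
print(detect("I told alexia about it"))  # True  -- similar-sounding word: a false accept
print(detect("set the table please"))    # False -- nothing close enough
```

The second call shows the failure mode both companies described: a background word that merely resembles the wake word can clear the detection threshold and wake the device.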
This isn’t the first we’ve heard about smart speakers going wild. Earlier this year we reported on Alexa’s terrifying laughter, as well as the time Alexa allegedly declared, “Every time I close my eyes, all I see is people dying.”
Often, when smart speakers go rogue, it turns out the owner of the device was pranked. A woman who heard her Google Home Mini blasting “Chandelier” by Sia, for example, fortunately discovered she was being pranked by her roommate rather than haunted by a poltergeist. That wasn’t the case for a woman in the United Kingdom whose Google Home turned on the lights and TV in the middle of the night. She thought twice about owning a smart speaker after that.
“I thought, well, if it’s going to have a mind of its own and do what it wants when it wants, I’m going to get rid of it,” she said.