
Amazon’s Alexa Can Accidentally Record and Share Your Conversations

Amazon insists it is “extremely rare.” But it almost happened to me.

This weekend, while at home watching a particularly tense scene in the Showtime series Billions, my Amazon Echo suddenly piped up. “To whom?” Alexa asked, pausing for a few seconds before asking again, “To whom?” There is no character on Billions named Alexa, or even Alex (the closest, perhaps, is Lara Axelrod, and that seems like a stretch), and thus there was seemingly no way for Alexa to have been triggered. And yet, she was. As any logical human would, I unplugged the device immediately.

By now, Amazon’s Echo devices are somewhat famous for their eerie glitches—some, like unsolicited laughter, are benign, while others, like responding to undetectable commands embedded in podcasts or songs, are decidedly more sinister. And on Thursday, Washington state news outlet KIRO 7 reported yet another instance of an Alexa-powered Echo device acting of its own accord. According to KIRO 7, one family in Portland, Oregon, received a bizarre phone call two weeks ago. “Unplug your Alexa devices right now,” the caller told them. “You’re being hacked.” What had happened, apparently, was that the family’s Echo devices had quietly sent recordings of a mundane, private conversation to someone in the family’s contact list—in this case, one of the husband’s employees.

“My husband and I would joke and say, ‘I’d bet these devices are listening to what we’re saying,’” the woman, Danielle, told the outlet. “We unplugged all of them, and he proceeded to tell us that he had received audio files of recordings from inside our house. At first, my husband was like, ‘No, you didn’t!’ And the [caller] said, ‘You sat there talking about hardwood floors.’ And we said, ‘Oh, gosh, you really did hear us.’”

Danielle’s family had wired their whole home with Amazon devices, which they used to control their security system, heat, and lights. But after the recording incident, Danielle unplugged the devices and called Amazon. An Alexa engineer investigated. “They said, ‘Our engineers went through your logs, and they saw exactly what you told us, they saw exactly what you said happened, and we’re sorry . . . This is something we need to fix,’” she said. Still, it remains unclear why the device didn’t tip Danielle off that it was sending the audio recordings, as it is programmed to do. In a statement to KIRO 7, Amazon said it “takes privacy very seriously. We investigated what happened and determined this was an extremely rare occurrence. We are taking steps to avoid this from happening in the future.”

If Danielle’s story is unnerving, Amazon’s statement is even more so—“extremely rare” implies the same phenomenon has occurred on at least one separate occasion, if not more. And while this particular misfire was benign, there’s always the chance that a conversation will veer away from flooring, or be sent to a contact who’s less forgiving, even malicious. Echo devices are only supposed to record audio when a user gives a voice command, known as a “wake word.” But last month, researchers discovered a flaw in Alexa that allowed the device to continue listening indefinitely. “As far as we could tell, there was no limit,” Amit Ashbel, the director of product marketing for the research firm, told CNET. “As long as you didn’t tell it to stop, it wouldn’t.” Amazon engineers professed to have fixed the problem shortly after it was discovered, saying in a statement, “Customer trust is important to us and we take security and privacy seriously.” In an interview last month, Al Lindsay, the vice president of Alexa Engine software at Amazon, echoed the company line: “The whole product and the whole Alexa experience is designed around being thoughtful about privacy and customers’ concerns about that,” he said. Yet when asked to describe a “valid” privacy concern about voice A.I. software, he demurred. “I don’t really have one for a response for that,” he said. “I feel there isn’t really anything that falls into that category, that I’m aware of, or focused on.”

It’s possible that I could have met the same fate as the Portland family. “To whom?” is what the Amazon Echo asks before sending a recorded voice message. Amazon gave this statement to Recode as further explanation for how snippets of the Portland family’s private conversation got sent to a contact in their address book:

Echo woke up due to a word in background conversation sounding like “Alexa.” Then, the subsequent conversation was heard as a “send message” request. At which point, Alexa said out loud “To whom?” At which point, the background conversation was interpreted as a name in the customer’s contact list. Alexa then asked out loud, “[contact name], right?” Alexa then interpreted background conversation as “right.” As unlikely as this string of events is, we are evaluating options to make this case even less likely.
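In other words, every stage of the assistant's confirmation flow was satisfied by stray background speech. The chain Amazon describes can be sketched as a simple dialogue state machine; this is purely an illustration of the failure mode, not Amazon's actual code, and the contact names and matching thresholds are invented for the example.

```python
# Illustrative sketch (NOT Amazon's implementation): how a chain of
# speech misrecognitions could walk a voice assistant through its
# entire send-message confirmation flow. Contact names, thresholds,
# and keyword matching are hypothetical.
import difflib

CONTACTS = ["John Smith", "Jane Doe"]  # hypothetical contact list


def step(state, heard, contacts=CONTACTS):
    """Advance the dialogue state given one overheard utterance.

    Returns (new_state, spoken_prompt_or_None).
    """
    text = heard.lower()
    if state == "idle":
        # Wake: any word that merely sounds like "Alexa" triggers listening.
        return ("listening", None) if "alexa" in text else ("idle", None)
    if state == "listening":
        # Background chatter misheard as a "send message" request.
        if "send" in text and "message" in text:
            return ("await_recipient", "To whom?")
        return ("idle", None)
    if state == "await_recipient":
        # Fuzzy-match whatever is heard next against the contact list.
        match = difflib.get_close_matches(heard, contacts, n=1, cutoff=0.3)
        if match:
            return ("confirm", f"{match[0]}, right?")
        return ("await_recipient", "To whom?")
    if state == "confirm":
        # Any utterance containing "right" is taken as confirmation.
        if "right" in text:
            return ("sent", "Message sent.")
        return ("idle", None)
    return (state, None)
```

Run through with plausible background chatter and each stage falls: a word resembling "Alexa" wakes the device, a sentence containing "send" and "message" opens the flow, a name-like fragment fuzzy-matches a contact, and a stray "right" confirms the send.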

My Amazon Echo will remain unplugged until further notice.