Two weeks ago, a woman in Portland learned that her Amazon Echo device had recorded and sent a private conversation between her and her husband to one of his employees in Seattle without their knowledge. “Unplug your Alexa devices right now,” she says the employee told them, “You’re being hacked.”
But the couple in Portland wasn’t being hacked. According to Amazon, Alexa misunderstood their discussion about hardwood floors as a series of commands to send their conversation to the man in Seattle. In a statement to Gizmodo, the online shopping giant confirmed that the Echo device “woke up due to a word in background conversation sounding like ‘Alexa.’” The statement continued:
Then, the subsequent conversation was heard as a “send message” request. At which point, Alexa said out loud “To whom?” At which point, the background conversation was interpreted as a name in the customer’s contact list. Alexa then asked out loud, “[contact name], right?” Alexa then interpreted background conversation as “right.” As unlikely as this string of events is, we are evaluating options to make this case even less likely.
Amazon told Ars Technica that the above sequence was sourced from device logs, but speaking to KIRO-7, the woman in Portland claimed the Echo never audibly confirmed it was going to send the recording. Either way, it’s disturbing to imagine Alexa mishearing a conversation about flooring as not just one but four consecutive commands. In a statement to the news station, Amazon characterized the incident as “an extremely rare occurrence.”
Ultimately, the most unsettling part of the incident might be how plausible Amazon’s explanation is. If you’re not paying attention, Alexa can turn an innocuous chat in your home into a major invasion of privacy. And while this particular screw-up is reportedly rare, it’s easy to picture far more dangerous situations in which someone isn’t just talking about their floors. In the end, we all have to ask ourselves whether a future of mild convenience outweighs the cost of troubling surveillance mishaps like this.