Bizarre case sees Alexa record and share a family’s private conversation, Amazon admits

A family in Portland, Oregon used to tease that Amazon’s Alexa was listening in on their conversations. That joke stopped being so funny when a seemingly random person called out of the blue, warning them to unplug all of their smart home devices.

“You’ve been hacked,” the person warned, according to a report from KIRO7. Danielle, who didn’t want to give her surname to the station, said the voice on the other end of the line belonged to an employee of her husband, calling from Seattle with alarming news.

“He proceeded to tell us that he had received audio files of recordings from inside our house,” said Danielle. “At first, my husband was, like, ‘no you didn’t!’ And the [recipient of the message] said ‘You sat there talking about hardwood floors.’ And we said, ‘oh gosh, you really did hear us.'”

The acquaintance played back the recordings, which were just as he’d described, and Danielle realised that their conversation had been recorded and sent to someone 176 miles away without her knowledge.

“I felt invaded,” she told KIRO7. “A total privacy invasion. Immediately I said, ‘I’m never plugging that device in again, because I can’t trust it.'”

Danielle had installed Amazon Echo devices in every room of her home, using them to control everything from lights to a security system. After the unexpected phone call, she contacted Amazon, which sent engineers to investigate; they verified that Alexa had indeed recorded the family’s conversation.

On Thursday, Amazon put out a statement explaining what it believes happened in Danielle’s household:

“Echo woke up due to a word in background conversation sounding like ‘Alexa’. Then, the subsequent conversation was heard as a ‘send message’ request. At which point, Alexa said out loud ‘To whom?’ At which point, the background conversation was interpreted as a name in the customer’s contact list. Alexa then asked out loud, ‘[contact name], right?’ Alexa then interpreted background conversation as ‘right’.

“As unlikely as this string of events is, we are evaluating options to make this case even less likely.”

Unlikely indeed, although it wouldn’t be the first time Alexa has mistaken background noise for instructions. Earlier this year, a number of Amazon Echo owners reported that their AI assistants had developed a habit of randomly chuckling to themselves. Amazon similarly pinned the unnerving phenomenon on “false positives”, with the Echo mishearing requests for Alexa to laugh on command.  

