Amazon sends 1700 archived Alexa audio files to the wrong customer

FILE PHOTO: Prompts on how to use Amazon’s Alexa personal assistant are seen in an Amazon ‘experience centre’ in Vallejo, California, U.S., May 8, 2018. REUTERS/Elijah Nouvelage/File Photo

21 December 2018 | Emma Sims | Alphr

An Amazon customer in Germany was surprised to receive 1,700 audio files from a stranger’s Alexa. The man had requested a copy of his archived data from the internet giant, only to be dumbfounded when he received over a thousand recordings of someone else’s private conversations.

The customer had requested to review his data under the European Union’s new General Data Protection Regulation (GDPR). Amazon complied, or so it thought, sending the man a download link to what it believed to be his data. Instead, the Alexa user opened some 1,700 recordings from a stranger’s household.

Speaking to German trade magazine c’t, the customer revealed his shock. “I was very surprised about that because I don’t use Amazon Alexa, let alone have an Alexa-enabled device,” he said. “So I randomly listened to some of these audio files and could not recognise any of the voices.”

The contents of the recordings were, understandably, intimate: the innocuous sounds of domesticity, showering, weather enquiries and music requests, which together revealed a great deal about the user’s identity.


As for Amazon, it was quick to downplay the incident. Speaking to Reuters on Thursday, a company spokesperson said, “This unfortunate case was the result of a human error and an isolated single case.”

As for damage control, it was swift and comprehensive. “We resolved the issue with the two customers involved and took measures to further optimise our processes,” said the spokesperson, adding: “As a precautionary measure we contacted the relevant authorities.”

This isn’t the first time that commands to an Alexa have gone awry, ending up where they shouldn’t have. Last year saw a six-year-old order a $170 dollhouse without her parents’ consent.

When San Diego’s XETV-TDT covered the story, news anchor Jim Patton repeated the phrase “Alexa ordered me a dollhouse” on air. The command, heard by Alexa devices across the city, was duly registered, with Amazon Echos citywide attempting to order replicas.

Original Link: Amazon sent customer 1,700 audio files from a stranger’s Alexa

