Amazon reveals private Alexa voice data files
Alexa’s fall from grace began when an Amazon customer exercised his right of access to personal data granted by the new EU General Data Protection Regulation (GDPR). His request not only gave him access to his own Amazon search data, but also to around 1,700 Alexa voice files recorded in a stranger’s living room, bedroom, and shower. The vigilant customer informed Amazon of the error, but Amazon ignored his warning and simply deleted the files from its servers.
Luckily, the source had saved the files locally and sent them (confidentially, of course) to c't's in-house experts for analysis. Based on details such as the people’s names and local weather forecasts recorded in the files, they were quickly able to identify the unfortunate Echo user whose data Amazon had illegally revealed. The victim was shocked when c't told him what had happened, especially since Amazon hadn’t bothered to inform him, even though it knew the leak had occurred.
This data privacy disaster occurred because amazon.de saves Alexa voice recordings indefinitely and because the processes it uses to handle them have serious security flaws. This is the worst-case scenario that data security and consumer rights experts have been warning about. It is impossible to tell whether this really is an isolated incident, as Amazon claims.
Today, Amazon sent us an updated statement on the case. The company stressed that it was an "isolated incident" and that it had contacted the relevant authorities: “This was an unfortunate case of human error and an isolated incident. We have resolved the issue with the two customers involved and have taken steps to further improve our processes. We were also in touch on a precautionary basis with the relevant regulatory authorities.”
See also the German version of this news article.