Amazon has been forced to publicly explain how one of its Alexa-capable products ended up recording a private conversation between husband and wife and then sending that conversation to one of the husband's employees.
If the explanation is to be believed, it's a case of a series of unfortunate events rather than anything actively malicious or invasive.
According to an interview with Seattle-based TV station KIRO 7, Alexa on an Amazon Echo device recorded a conversation between a woman known as Danielle and her husband. Then, as if that recording wasn't bad enough, the hardware forwarded the full private conversation directly to one of the husband's employees, who was then able to listen to the entire exchange.
Amazon has confirmed that the incident did take place as described in the TV interview, but clarified that it occurred because Alexa mistakenly thought it heard a series of commands:
"Echo woke up due to a word in background conversation sounding like 'Alexa.' Then, the subsequent conversation was heard as a 'send message' request. At which point, Alexa said out loud 'To whom?' At which point, the background conversation was interpreted as a name in the customer's contact list. Alexa then asked out loud, '[contact name], right?' Alexa then interpreted background conversation as 'right.' As unlikely as this string of events is, we are evaluating options to make this case even less likely."
So, that's the explanation Amazon issued to The Verge. It's reassuring to see that Amazon is well aware of how implausible it sounds that so many misinterpretations could line up to trigger this chain of events. At best, the whole incident shows that Amazon, along with other makers of digital assistants, still has a lot of work to do to ensure that this type of event doesn't happen again. At worst, it highlights just how invasive digital assistants could be in our homes if they were ever programmed to behave maliciously.
Still, if you own any Amazon hardware running Alexa, you'll likely know that it's prone to spontaneously coming to life based on what it perceives to be its wake word.