Can Alexa and other always-on, internet-connected assistants trample all over your privacy and embarrass you - or worse? Maybe - if you make it easy for them! You've probably heard stories about misadventures with Amazon Alexa devices. By understanding how voice recordings are captured, stored, and shared, you can avoid your own Alexa misadventures. Here's how:

Misadventures in Alexa privacy - have you heard about these?

You may have heard the story about the time Alexa sent a recording of a private conversation between a married couple in Portland to one of the husband's employees. Or the story about the same thing happening to a family in Germany, with the recording being sent to one of the husband's business contacts. Or when it happened to a man in North Carolina, whose conversation was sent to his insurance agent. Fortunately, there wasn't anything negative or embarrassing in those conversations, but the Alexa owners were certainly shocked.

Another time, a 6-year-old ordered a dollhouse and some fancy cookies from Amazon - her parents had no idea until the packages were delivered. And still another time, coders wanted to test the security of the Alexa Skills (apps) system by turning a Skill into an eavesdropping tool - and it worked (Amazon has since fixed the glitch).

Other times Alexa does exactly what it's supposed to - respond to a command that follows the wake word - but it turns out to be a misadventure anyway. For example, NPR did a news story on Alexa. When people had the story playing within earshot of an Alexa device, Alexa dutifully woke up and followed the commands featured in the NPR report. Thermostats far and wide were adjusted to 70 degrees.

While these examples didn't have serious consequences, the potential is there. And so are other privacy concerns. For example, Amazon can send someone the wrong Alexa voice recording history. The same kinds of human errors that are made in everyday life are made by the Amazon employees who manage your data.

And, at least a couple of times, Alexa voice recordings have been requested as evidence in murder cases. The total number of legal system requests, however, is unknown. Amazon releases a bi-annual transparency report that lists the number of legal requests it receives and how it responds to them, but it doesn't break out the Alexa data. In the first half of 2018 - the most recent report available as of this writing - Amazon received a total of 1,736 subpoenas, 344 search warrants, and 162 other court orders. It responded either fully or partially to 70%-77% of those requests. Some of them were surely requests for Alexa voice recordings.

All the more reason to understand what Alexa is recording and how you can manage it.

What Alexa records and where it stores your recordings

Alexa and other "always-on" devices listen constantly for someone to say the wake word. For Alexa devices, the default wake word is "Alexa"; for Google devices it's "O.K. Google"; and for Apple devices it's "Hey Siri". Alexa devices have seven microphones with noise-canceling technology that constantly record and replace one-second snippets of ambient sound while waiting for the wake word. Once the device hears the wake word and wakes up, it lights up to visually confirm that it's recording and, on some devices, it plays a chime, too.

Alexa then sends your recorded voice command over the internet to Amazon's computers, which in turn allow Alexa to respond to your commands. Amazon stores each voice command recording in the cloud; none of the recordings are stored on Alexa devices. By law, the European Union's General Data Protection Regulation (GDPR), which went into effect last year, requires (among other things) that consumers be able to access their stored data at any time. Amazon doesn't delete stored recordings either - only users can do that.
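The listening behavior described above - a device constantly recording and discarding short snippets of ambient sound, then capturing audio only once the wake word is heard - can be sketched as a toy simulation in Python. This is purely illustrative: the snippet stream, the string-matching "wake word" check, and the "<silence>" end-of-command marker are assumptions for the sketch, not anything from Amazon's actual implementation.

```python
from collections import deque

WAKE_WORD = "alexa"
BUFFER_SECONDS = 1  # the device keeps only about a second of ambient sound


def listen(snippets):
    """Simulate an always-on device: hold a tiny rolling buffer of sound
    that is continuously overwritten, and capture audio only after the
    wake word is heard."""
    buffer = deque(maxlen=BUFFER_SECONDS)  # old snippets fall off automatically
    captured = []  # what would be sent to the cloud
    awake = False
    for snippet in snippets:
        if awake:
            captured.append(snippet)      # recording the spoken command
            if snippet == "<silence>":    # command finished; go back to sleep
                awake = False
        else:
            buffer.append(snippet)        # ambient sound: overwritten, never kept
            if WAKE_WORD in snippet:
                awake = True              # light/chime would fire here


    return captured


stream = ["chatter", "tv noise", "alexa", "set thermostat to 70",
          "<silence>", "more chatter"]
print(listen(stream))  # only the command following the wake word is captured
```

The point of the sketch is the asymmetry: ambient sound before the wake word only ever lives in the tiny rolling buffer, while everything after the wake word (up to the end of the command) is captured and, on a real device, would be sent to the cloud.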