Privacy Please is an ongoing series exploring the ways privacy is violated in the modern world, and what can be done about it.

Amazon's Alexa can feel like a form of magic. By merely speaking it into the universe, users can conjure up-to-the-minute weather reports from far-off lands, summon physical goods to be same-day rushed to their doors, and even get medical advice.

But as with most magic tricks, when it comes to Alexa, it's worth paying attention to just who, exactly, is behind the curtain. Because, despite what many people may assume, with Alexa-enabled devices like the Echo, there is very much someone behind the curtain.

As with most forms of modern "smart AI," Alexa depends on real humans listening in on a share of conversations and transcribing those requests. Amazon calls this "supervised machine learning," and rather blandly describes strangers being paid to creep on its customers as "an industry-standard practice where humans review an extremely small sample of requests to help Alexa understand the correct interpretation of a request and provide the appropriate response in the future."

What truth do you let out when you believe no one is watching? Put another way, your personal questions, doubts, and fears, spoken aloud as if no one was listening, may have found themselves in the hands of a group of people paid to do exactly that.

Thankfully, there's something you can do about it that doesn't involve taking a hammer to your smart assistant (though, if you do go that route, please recycle the smashed Echo afterward).

What Amazon does with your voice recordings

Always listening. Unless you take the time to dig through your settings and actively opt out, your Alexa-enabled device records and stores your questions and conversations whenever it hears a so-called wake word like "Alexa." In some instances, real humans listen to and transcribe those recordings with the goal of improving Amazon's voice-recognition software.

Or at least that's how it's supposed to work. Alexa has been known to record people and rooms even when there's no wake word spoken intentionally, or spoken at all. It happens so often, in fact, that Amazon has its own term for the privacy-invading habit: "false wakes."

"In some cases, your Alexa-enabled device might interpret another word or sound as the wake word (for instance, the name 'Alex' or someone saying 'Alexa' on the radio or television)," explains the company.

In these disturbing situations, complete strangers can end up with audio recordings of your Alexa chats. Those chats might be innocuous things like asking for the weather forecast, yes, but also potentially private information like asking for directions to the nearest Alcoholics Anonymous. That's because Amazon pays people to listen to and transcribe a subset of Alexa requests with the stated goal of improving the service.

In 2019, Bloomberg reported on a group of contractors who had this very job. One of those reviewers told the publication that, in addition to their other work, those contractors each transcribed around 100 recordings a day that appeared to be the result of false wakes. Those false-wake recordings included what they thought to be a recording of sexual assault, as well as banking details.

To make matters worse, Bloomberg later reported that some Amazon employees listening to and transcribing Alexa recordings could see where those customers lived. Once you have someone's location data, it's pretty easy to figure out their real name.

This is all in addition to the fact that your recordings are kept on Amazon's servers for later reference. You can ask Amazon to delete those records, but even if you do, the company keeps a copy of the written transcript for 30 days.

In other words, Amazon Echo devices pose a potential privacy threat. Thankfully, there's something you can do about it.

[Image credit: Chloe Collyer / Getty]

Amazon's Echo and other Alexa-enabled devices hoover up your personal information by default. That means that unless you dig around in those devices' settings and make an affirmative choice to say "no, thank you," in the eyes of Amazon you've effectively said "yes, please."

As Apple's recent update to iOS demonstrated, when presented with the choice, very few people will opt in to surveillance. While that's often not a choice that's clearly presented to people, it doesn't mean it isn't one you have.

To delete past Alexa recordings stored on the Amazon cloud:

1. Select the "Privacy Settings" tab in the top center of the page.
2. Under "View, hear, and delete your voice recordings," select "Review voice recordings."
3. Where it says "Today," hit the drop-down menu and select "All History."

To tell Amazon to stop saving the recordings of your voice interactions with Alexa: