How to Keep Your Siri, Alexa, and Google Assistant Voice Recordings Private
By Lily Hay Newman | Tue, 29 Oct 2019 16:59:17 +0000
Alexa, Siri, and Google Assistant now all give you ways to opt out of human transcription of your voice snippets. Do it.
After months of revelations and apologies, all the major smart assistant makers have revamped how they handle human review of audio snippets. Amazon Alexa, Google Assistant, Apple Siri, and Microsoft Cortana were all using third-party contractors to transcribe and vet recorded snippets, adding some human brain power to underlying machine-learning algorithms. But the backlash over the lack of transparency spurred new customer controls. And with the release of Apple's iOS 13.2 on Monday, you now have one more way to rein that data collection in.
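To picture what that human review looked like in practice, here's a rough conceptual sketch, in Python, of how a vendor might sample a small fraction of voice snippets and queue them for reviewers to grade against the machine transcript. Every name and number here is hypothetical; none of these companies have published their actual pipelines.

```python
import random
from dataclasses import dataclass

@dataclass
class Snippet:
    audio_id: str
    machine_transcript: str  # what the speech model thought it heard

REVIEW_RATE = 0.002  # hypothetical: a fraction of a percent of traffic

def select_for_review(snippets):
    """Randomly sample snippets for human reviewers to grade, the kind
    of quality-control step the vendors described."""
    return [s for s in snippets if random.random() < REVIEW_RATE]

if __name__ == "__main__":
    traffic = [Snippet(f"a{i}", "play some jazz") for i in range(10_000)]
    queued = select_for_review(traffic)
    print(f"{len(queued)} of {len(traffic)} snippets queued for human review")
```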
Even if Siri isn't your smart assistant of choice, it's still a good time to take stock of how you have things set up on whatever platform you use. Each service has its own mix of options and controls. Here's how to take the human element out of Siri, Alexa, Google Assistant, and Cortana. Once that's done, tell a friend to do the same.
Apple paused human review of Siri user audio snippets at the beginning of August and published an apology later that month for the lack of transparency. Now, almost three months later, human review has resumed, but it's opt-in only and conducted by Apple employees rather than contractors.
With iOS 13.2, your Siri snippets no longer get passed along for human review by default; instead, Apple asks whether you'd like to opt in during the iOS 13.2 setup flow. If you opt in by mistake, or change your mind down the road, you can go to Settings > Privacy > Analytics & Improvements > Improve Siri & Dictation and toggle the switch off.
Crucially, even if you enable this data sharing, you can now delete all the audio Apple has collected from you at any time. To do so, go to Settings > Siri & Search > Siri & Dictation History and tap Delete Siri & Dictation History to wipe everything out.
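Conceptually, the new Siri behavior reduces to two rules: snippets are retained for review only if your opt-in flag is set, and a delete request wipes whatever has been stored. Here's a minimal toy model of that logic; the class and field names are invented for illustration, not Apple's code.

```python
class OptInAudioStore:
    """Toy model of opt-in gated retention with a full-wipe delete."""

    def __init__(self):
        self.improve_siri_opt_in = False  # off by default, as in iOS 13.2
        self._history = []

    def record(self, audio: bytes):
        # Without the opt-in, audio is never kept for human review.
        if self.improve_siri_opt_in:
            self._history.append(audio)

    def delete_history(self):
        # Mirrors "Delete Siri & Dictation History": everything goes.
        self._history.clear()

store = OptInAudioStore()
store.record(b"set a timer")          # dropped: user hasn't opted in
store.improve_siri_opt_in = True
store.record(b"what's the weather")   # retained: user opted in
store.delete_history()                # wiped on request
assert store._history == []
```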
Following significant blowback, Amazon was the first smart assistant company to centralize and expand its user controls for voice recording retention. You can review the voice recordings Amazon has stored for your account by going to Settings > Alexa Privacy in the Alexa app or through Amazon's website. There you can delete entries one by one, by date range, by device, or en masse. You can also delete recordings by device on the Manage Your Content and Devices page. And you can enable deletion by voice, which lets you create a clean slate by saying "Alexa, delete what I just said," or "Alexa, delete everything I said today." To turn that on, go to Settings > Alexa Privacy > Review Voice History in the Alexa app or on Amazon's website.
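Those deletion modes (one by one, by date range, by device, or en masse) all boil down to filtering a list of recordings. A hedged sketch of that filtering, using made-up metadata rather than anything from Amazon, might look like this:

```python
from datetime import datetime

# Invented recording metadata: (timestamp, device, recording_id)
recordings = [
    (datetime(2019, 10, 1, 9, 0),    "kitchen-echo", "r1"),
    (datetime(2019, 10, 20, 18, 30), "bedroom-dot",  "r2"),
    (datetime(2019, 10, 29, 8, 15),  "kitchen-echo", "r3"),
]

def delete_recordings(recs, start=None, end=None, device=None):
    """Return what's left after deleting everything matching the filters;
    with no filters at all, this deletes en masse."""
    def matches(ts, dev, _rid):
        return ((device is None or dev == device)
                and (start is None or ts >= start)
                and (end is None or ts <= end))
    return [r for r in recs if not matches(*r)]

# e.g., "delete everything I said today" on one device:
recordings = delete_recordings(
    recordings, start=datetime(2019, 10, 29), device="kitchen-echo")
print(recordings)  # r1 and r2 survive
```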
To opt out of sending your Alexa recordings for human review, go to Alexa Account in the Alexa app, then Alexa Privacy > Manage how your data improves Alexa, and turn off Help Develop New Features and Use Messages to Improve Transcriptions.
Keep in mind that these settings control only what Amazon retains, and don't necessarily apply to third-party developers that may have collected your voice data through Alexa Skills.
Google offers a number of ways to stop audio snippet retention or delete recordings. Google's support page lays out the different flows for deleting or opting out in a desktop browser, on Android, or on iOS. To delete recordings on desktop, open your Google account and choose Data & personalization in the left navigation panel. There, under Activity controls, choose Web & App Activity and then Manage Activity. Here you can scroll through the list of entries; those with an audio icon next to them include a recording, and you can delete individual items one at a time. Or, on the same page, click the More menu in the upper right, choose Delete activity by, and under Delete by date select All time. Then, at the bottom, choose Delete.
To opt out of letting Google collect recordings in the first place—which also means no human transcription—open your Google account and choose Data & personalization in the left navigation panel. There, under Activity controls, choose Web & App Activity and then make sure the box next to Include voice and audio recordings is unchecked.
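In effect, unchecking that box cuts the pipeline off upstream: if the audio is never retained, there's nothing for a human to transcribe later. A toy sketch of that gate follows; the function and variable names are invented, not Google's.

```python
activity_log = []  # stands in for your Web & App Activity history

def transcribe(audio: bytes) -> str:
    return "<transcript>"  # stand-in for the real speech-to-text step

def handle_request(audio: bytes, include_voice_recordings: bool) -> str:
    """Transcribe the request, but retain audio only if the user's
    'Include voice and audio recordings' setting is on."""
    transcript = transcribe(audio)
    if include_voice_recordings:
        activity_log.append(audio)  # stored, and so reviewable later
    # Otherwise the audio is discarded once the request is served,
    # leaving nothing for human reviewers to listen to.
    return transcript

handle_request(b"ok google, set an alarm", include_voice_recordings=False)
assert activity_log == []
```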
Unlike the other big smart assistant developers, Microsoft simply updated its Cortana privacy policy in August to further clarify that audio snippets may be transcribed and evaluated by human reviewers, both Microsoft employees and contractors. The company never paused human review or barred third parties from accessing Cortana user data. As with its peers, Microsoft says that the data it does collect is anonymized. To manage or delete your audio recordings of interactions with Cortana, make sure you're logged in to your Microsoft account and then go to Microsoft's Privacy Dashboard.
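"Anonymized" in this context generally means reviewers see snippets stripped of account identifiers. As a hedged illustration only (Microsoft hasn't published its method), de-identification could be as simple as replacing the account ID with a salted hash:

```python
import hashlib
import secrets

SALT = secrets.token_bytes(16)  # a rotating server-side secret, hypothetically

def deidentify(account_id: str, recording_id: str) -> dict:
    """Hand reviewers an opaque reference instead of the real account ID."""
    opaque = hashlib.sha256(SALT + account_id.encode()).hexdigest()[:12]
    return {"reviewer_ref": opaque, "recording_id": recording_id}

print(deidentify("someone@example.com", "r42"))
```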
Regardless of which platform you use, keep in mind that these expanded controls, while positive and necessary, don't change the fundamental concept of smart assistants: these are microphone-equipped devices that can wake up to "hear" what you're saying and process it on a faraway server. As with any internet-enabled technology, and particularly one that involves a potentially live mic, there are always going to be privacy considerations no matter how much control you have. Even Rick Osterloh, Google's senior vice president of devices and services, warns houseguests that he has a Google Home when they come in.
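The privacy model of all these devices hinges on that wake-up step: audio sits in a short local buffer and, in principle, only leaves the house once the wake word is spotted on-device. A toy sketch of the idea, in which the frames, buffer size, and upload function are all invented (real detectors run keyword-spotting models on raw audio, not text):

```python
from collections import deque

WAKE_WORDS = ("alexa", "hey siri", "ok google")  # illustrative

def send_to_cloud(frames):
    print(f"streaming {len(frames)} frames to the server")  # stand-in upload

def listen(frames):
    buffer = deque(maxlen=50)  # short rolling buffer, kept on-device
    for frame in frames:
        buffer.append(frame)
        # Only after a local wake-word match does anything get uploaded.
        if any(word in frame.lower() for word in WAKE_WORDS):
            send_to_cloud(list(buffer))
            buffer.clear()

listen(["background chatter", "hey siri", "what's the weather"])
```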
If these devices are a helpful and delightful force in your life, that's fine! Just take steps to protect your privacy and be like Rick: Always remember that a gadget might be listening.