Apple’s shock Siri surveillance demands a swift response

Credit to Author: Jonny Evans | Date: Mon, 29 Jul 2019 07:51:00 -0700

News that Siri records snippets of our conversations with the voice assistant isn’t new, but the claim that those short recordings are listened to by human agents is – particularly in light of the company’s big push on privacy.

I’m a passionate believer in the importance of privacy.

It isn’t only important in terms of preserving hard-won liberties and protecting public discourse, it’s also of growing importance across every part of human existence, for every school, medical facility or enterprise. History shows that the absence of privacy has a corrosive effect on society, turning family members against each other and dampening innovation.

Apple’s warnings around privacy – including CEO Tim Cook’s caution that many of the tech firms we work with daily are marching us into a surveillance state – are important (if not always convenient to those who argue that privacy is just something we should sacrifice for the good of the robots we make).

However, the revelation that Apple has not made it clear that recordings of our private conversations are being shared with “contractors” for “quality control” and “training” purposes is a really bad look for the company.

Apple states that:

“A small portion of Siri requests are analysed to improve Siri and dictation.” It also promises that “User requests are not associated with the user’s Apple ID. Siri responses are analysed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.”

That’s reassuring to some extent, but given that in some instances Apple’s reviewers are reported to have heard people sharing personal information, including their addresses, the move to divorce recorded sound from the relevant Apple ID may not be enough.

Apple will need to do a little more.

Hidden in Apple’s terms and conditions you’ll find warnings that recordings made using Siri are sometimes used for quality control purposes.

You’ll also find a promise that those recordings are not linked in any way with your Apple ID or personal identity – unlike with some competitors.

These promises aren’t very clear.

That’s why I suggest Apple should introduce much clearer, easier-to-understand privacy warnings around the use of Siri on its devices. It needs to move these warnings front and center and make it crystal clear not only that it sometimes uses these recordings, but how it uses them, how it protects identity when doing so, and exactly how long those recordings are retained.

Apple should (I think) make the use of Siri recordings an explicit opt-out.

That is to say, when setting up Siri on a new device, you as a user should be given the chance to explicitly reject use of your voice for any purpose other than the original request.

You should be able to say that you don’t want your data used in training or quality control.
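
To illustrate how simple that gate could be, here’s a minimal sketch in Swift – assuming a hypothetical allowsAudioReview preference captured during setup; Apple exposes no such API today:

```swift
import Foundation

/// Hypothetical preference captured explicitly during Siri setup.
/// Purely illustrative – Apple exposes no such API.
struct SiriPrivacyPreferences {
    /// True only if the user agreed to let recordings be used
    /// for training and quality control.
    var allowsAudioReview = false
}

/// Decide what happens to a clip once its original request is answered.
func finishRequest(audio: Data, prefs: SiriPrivacyPreferences) {
    guard prefs.allowsAudioReview else {
        return // Opted out: the clip serves no purpose beyond the request.
    }
    queueForQualityControl(audio) // Only opted-in audio is ever reviewed.
}

func queueForQualityControl(_ audio: Data) {
    // Placeholder for whatever anonymised review pipeline might exist.
    print("Queued \(audio.count) bytes for anonymised review")
}
```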

“But then Apple won’t be able to train the assistant as swiftly,” some might complain. Perhaps not, but it will be maintaining leadership in privacy protection.

Not only this, but it is debatable how useful these recordings actually are.

Who are the “contractors” Apple, Google, Amazon, and all the others use to listen to and verify these short recordings of what we say?

How are they hired?

What are their job descriptions? Who else do they work for?

On what basis can they be trusted, and how can individuals obtain redress in the event this trust is abused?

And why don’t Siri users already know the answers to all these questions?

That these contractors are outsourced makes no sense.

Apple should bring this work in-house, become completely accountable for what its voice workers and management do with these recordings, and ensure customers have some way in which to punish any infraction of their data privacy.

Apple says it records and uses only small snippets of what it hears, but does it even need to keep or verify as many samples as it chooses to take?

Think about it like this:

If Siri on your Watch or HomePod hears what it thinks is the ‘Hey Siri’ command, but no request subsequently follows, surely it should be smart enough to recognize that a false alarm took place?

In the event of such a false alarm, Siri on the device should surely be smart enough to learn what caused the accidental invocation and become less sensitive to that sound in future.

(A Guardian report claimed the sound of a zip sometimes wakes Siri up, for example – surely Siri can learn to ignore that sound in response?)

The bottom line is that when Siri is invoked but no specific request is made, the system should be smart enough to ignore the interaction and delete any recording made as a result, beyond (possibly) the first three seconds in which it thought it heard the trigger command.
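
To make this concrete, here’s a minimal Swift sketch of that false-alarm cleanup – the eight-second timeout, the audio format, and the three-second trigger window are all my own assumptions, not anything Apple has documented:

```swift
import Foundation

/// Sketch of false-alarm handling after a "Hey Siri" trigger.
/// The 8-second timeout and 3-second window are assumptions.
final class WakeWordSession {
    private var audioBuffer = Data()
    private var requestReceived = false

    /// ~3 seconds of 16 kHz, 16-bit mono audio (assumed format).
    private let triggerWindowBytes = 3 * 16_000 * 2

    /// Called when the device thinks it heard the trigger phrase.
    func wakeDetected() {
        // Wait briefly for an actual request to follow the wake.
        DispatchQueue.main.asyncAfter(deadline: .now() + 8.0) { [weak self] in
            guard let self = self, !self.requestReceived else { return }
            self.handleFalseAlarm()
        }
    }

    func append(_ chunk: Data) { audioBuffer.append(chunk) }
    func requestStarted() { requestReceived = true }

    private func handleFalseAlarm() {
        // Keep only the short window that caused the wake, so the
        // on-device model can learn to ignore that sound...
        learnToIgnore(audioBuffer.prefix(triggerWindowBytes))
        // ...then delete everything else immediately.
        audioBuffer.removeAll()
    }

    private func learnToIgnore(_ audio: Data) {
        // Placeholder: feed the false trigger back to the local
        // wake-word model as a negative example.
        print("Logged \(audio.count)-byte false trigger for on-device tuning")
    }
}
```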

Taking this step alone would prevent many of the more egregious recordings Siri is said to have captured from ever reaching contractors.

I don’t want Siri to get better at listening to what I say when I don’t want it to listen. I just want it to get better at listening when I do make a request.

Voice technology is advancing rapidly. This raises the question: “Is it even necessary for human contractors to listen to Siri-related conversational snippets at all?”

I think in many cases, it simply isn’t.

In the first instance, the recordings should be run through Siri and other voice recognition technologies to automate the check for accuracy.

Only in those instances in which different voice recognition systems can’t agree on what was said should human ears be necessary.

I’m no AI expert, but this sort of analysis is exactly what Random Forest models and similar algorithms are built for – only when the technology can’t agree on why it got a request wrong should a human be necessary at all.
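
As a rough sketch of that agreement gate in Swift – with a hypothetical Transcriber protocol standing in for Siri and whichever second engine might be used; nothing here reflects Apple’s actual pipeline:

```swift
import Foundation

/// Any speech recognizer that can transcribe a clip (hypothetical).
protocol Transcriber {
    var name: String { get }
    func transcribe(_ audio: Data) -> String
}

/// Normalise transcripts so trivial differences in case or
/// punctuation don't force an unnecessary human review.
func normalise(_ text: String) -> String {
    text.lowercased()
        .components(separatedBy: CharacterSet.alphanumerics.inverted)
        .filter { !$0.isEmpty }
        .joined(separator: " ")
}

/// Run the clip through several engines; only disagreements need a human.
func needsHumanReview(_ audio: Data, engines: [Transcriber]) -> Bool {
    let transcripts = Set(engines.map { normalise($0.transcribe(audio)) })
    // One unique transcript means every engine agreed.
    return transcripts.count > 1
}
```

If every engine produces the same normalised transcript, the clip can be auto-verified and deleted without anyone ever hearing it; only genuine disagreements would reach a reviewer.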

Those are just a few suggestions I hope Apple takes on board in order to make Siri the most private voice assistant in the industry.

This stuff matters.

Think about it:

Not only are the words we use our own property, but as sensor-based technologies and AI enter different spheres of everyday life – from mapping to ubiquitous VR – the need for privacy becomes even more important, as so much more of our lives will become an open book.

The decisions we make around voice today will define every other privacy sphere.

There are some who think privacy is a price we should pay for computerized convenience, but I’ve never agreed with them.

We stand at a point in human history at which the decisions we make around voice assistant privacy will resonate in our future.

We need to sort this out effectively, because if we don’t, the implementations will be insecure and potentially ethically unsound.

And Apple must push forward with its demand for a digital bill of rights in this space, a bill that puts users at the center of privacy control.

Please follow me on Twitter, or join me in the AppleHolic’s bar & grill and Apple Discussions groups on MeWe.
