
Apple’s shock Siri surveillance demands a swift response


News that Siri records snippets of our conversations with the voice assistant isn’t new, but the claim that those short recordings are listened to by human agents is – particularly in light of the company’s big push on privacy.

These are bad optics for Apple

I’m a passionate believer in the importance of privacy.

It isn’t only important in terms of preserving hard-won liberties and protecting public discourse; it’s also of growing importance across every part of human existence — for every school, medical facility, or enterprise. History shows that the absence of privacy has a corrosive effect on society, turning family members against each other and dampening innovation.

Apple’s warnings around privacy – including Apple CEO Tim Cook’s warning that many of the tech firms we deal with daily are marching us into a surveillance state – are important, even if they are not always convenient to those who argue that privacy is just something we should sacrifice for the good of the robots we make.

However, the revelation that Apple has not made it clear that recordings of our private conversations are being shared with “contractors” for “quality control” and “training” purposes is a really bad look for the company.

What Apple is saying

Apple states:

“A small portion of Siri requests are analysed to improve Siri and dictation.” It also promises that “User requests are not associated with the user’s Apple ID. Siri responses are analysed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.”

That’s reassuring to some extent, but given that Apple’s reviewers are reported, in some instances, to have heard people sharing personal information, including their addresses, the move to divorce recorded sound from the relevant Apple ID may not be enough.

Apple will need to do a little more.

How Apple can remedy the situation

Start with transparency

Hidden in Apple’s terms and conditions you’ll find warnings that recordings made using Siri are sometimes used for quality control purposes.

You’ll also find a promise that those recordings – unlike those gathered by some competitors – are not linked in any way to your Apple ID or personal identity.

These promises aren’t very clear.

That’s why I say Apple should introduce much clearer, easier-to-understand privacy warnings around the use of Siri on its devices. It needs to move these warnings front and center and make it crystal clear not only that it sometimes uses these recordings, but also how it uses them, how it protects identity when doing so, and exactly how long those recordings are retained.

But give people control

Apple should (I think) make Siri recording an opt-out.

That is to say, when setting up Siri on a new device, the user should be given the chance to explicitly reject any use of their voice beyond fulfilling the original request.

You should be able to say that you don’t want your data used in training or quality control.
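In practice, that choice could come down to a single consent flag checked before any audio is retained or uploaded. Here’s a minimal sketch of the idea in Swift – every type and name in it is hypothetical, not part of any Apple SDK:

```swift
import Foundation

// A hypothetical consent model for a Siri setup flow. None of these types
// exist in Apple's SDKs; they only illustrate what an explicit opt-out
// could look like in code.
enum VoiceDataConsent {
    case requestOnly   // audio is used solely to answer the request
    case qualityReview // audio may also be sampled for grading and training
}

struct SiriPrivacySettings {
    // Privacy-preserving default: nothing is retained unless the user says so.
    var consent: VoiceDataConsent = .requestOnly

    // Any upload for human review is gated on the user's explicit choice.
    var mayRetainForReview: Bool { consent == .qualityReview }
}

// Setup flow: the user explicitly rejects secondary use of their voice.
var settings = SiriPrivacySettings()
settings.consent = .requestOnly
assert(settings.mayRetainForReview == false)
```

Defaulting to the most private option puts the burden on Apple to earn consent, rather than on users to discover a buried setting.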

“But then Apple won’t be able to train the assistant as swiftly,” some might complain. Perhaps not, but it will maintain its leadership in privacy protection.

Not only that, but it is arguable how useful these recordings actually are.

Become more accountable

Who are the “contractors” Apple, Google, Amazon, and all the others use to listen to and verify these short recordings of what we say?

How are they hired?

What are their job descriptions? Who else do they work for?

On what basis can they be trusted, and how can individuals obtain redress in the event that this trust is abused?

And why don’t Siri users already know the answers to all those questions?

That these contractors are outsourced makes no sense.

Apple should bring this work in-house, become completely accountable for what its voice workers and management do with these recordings, and ensure customers have some way to punish any infraction of their data privacy.

Make really certain you need the information

Apple says it records and uses only small snippets of what it hears, but does it even need to keep or verify as many samples as it chooses to take?

Think about it like this:

If Siri on your Watch or HomePod hears what it thinks is the “Hey Siri” command, but no request is subsequently made, then surely it should be smart enough to recognize that a false alarm took place.

In the event of such a false alarm, Siri on the device should surely be smart enough to learn what caused the accidental invocation and become less sensitive to that sound in the future.

(A Guardian report claimed that the sound of zips sometimes wakes Siri up, for example – surely Siri can learn to ignore that sound.)

The bottom line is that in the event Siri is invoked but no specific request is made, the system should be smart enough to ignore the interaction and delete any recording made as a result of that interaction, beyond (possibly) the first three seconds in which it thought it heard the trigger command.

Just taking this step would pretty much prevent some of the more egregious recordings Siri is said to have heard and shared with contractors.
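As a rough sketch of that delete-on-false-alarm rule – with hypothetical types and the (possible) three-second trigger window above as assumptions – the logic might look like this in Swift:

```swift
import Foundation

// A hypothetical record of a single wake-word invocation. The types,
// field names, and the three-second window are assumptions for illustration.
struct Invocation {
    let audio: Data             // captured audio buffer
    let bytesPerSecond: Int     // sample format of that buffer
    let requestFollowed: Bool   // did an actual request follow the trigger?
}

// If no request followed the trigger, treat the invocation as a false alarm:
// keep at most the first few seconds (to retune the wake-word detector)
// and discard everything else.
func retainedAudio(for invocation: Invocation,
                   maxFalseAlarmSeconds: Double = 3) -> Data? {
    guard !invocation.requestFollowed else {
        return invocation.audio // a real request: handled under normal consent rules
    }
    let keepBytes = min(invocation.audio.count,
                        Int(maxFalseAlarmSeconds * Double(invocation.bytesPerSecond)))
    return keepBytes > 0 ? invocation.audio.prefix(keepBytes) : nil
}
```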

I don’t want Siri to get better at listening to what I say when I don’t want it to listen. I just want it to get better at listening when I do make a request.

Are humans even necessary?

Voice technology is advancing rapidly. This raises the question: “Is it even necessary for human contractors to listen to Siri-related conversational snippets at all?”

I think in many cases, it simply isn’t.

The recordings should first be put through Siri and other voice recognition technologies to automate the check for accuracy.

Only in those instances where different voice recognition systems can’t agree on what was said should human ears be necessary.

I’m no AI expert, but this kind of analysis is the kind of thing ensemble methods such as Random Forest models are built for – only when the technology can’t agree on why it got a request wrong should a human be necessary at all.
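To make the agreement check concrete, here’s a minimal sketch in Swift. The SpeechRecognizer protocol and everything around it are assumptions for illustration, not a real API:

```swift
import Foundation

// A minimal interface any speech-to-text engine could hide behind.
// This protocol is hypothetical, not a real Apple or third-party API.
protocol SpeechRecognizer {
    func transcribe(_ audio: Data) -> String
}

// Normalize transcripts so trivial differences (case, punctuation)
// don't needlessly trigger human review.
func normalize(_ transcript: String) -> String {
    transcript.lowercased()
        .components(separatedBy: CharacterSet.alphanumerics.inverted)
        .filter { !$0.isEmpty }
        .joined(separator: " ")
}

// Run the clip through several independent recognizers and escalate to a
// human only when they fail to agree on what was said.
func needsHumanReview(_ audio: Data, engines: [SpeechRecognizer]) -> Bool {
    let transcripts = Set(engines.map { normalize($0.transcribe(audio)) })
    return transcripts.count > 1 // disagreement: a human ear may be needed
}
```

Any clip that passes the agreement check never needs a human ear; only genuine disagreements would ever reach a reviewer.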

Why it matters

Those are just a few of the suggestions I hope Apple takes up in order to make Siri the most private voice assistant in the industry.

This stuff matters.

Think about it:

Not only are the words we use our own property, but as sensor-based technologies and AI enter ever more spheres of everyday life – from mapping to ubiquitous VR – the need for privacy becomes even more important, because so much more of our lives will become an open book.

The decisions we make around voice today will define every other privacy sphere.

There are some who think privacy is a price we should pay for computerized convenience, but I’ve never agreed with them.

We stand at a point in human history at which the decisions we make around voice assistant privacy will resonate in our future.

We need to sort this out effectively, because unless we do, the implementations that follow will be insecure and potentially ethically unsound.

And Apple must push forward with its demand for a digital bill of rights in this space, a bill that puts users at the center of privacy control.
