Siri Privacy Upheaval: Apple Apologizes With New Audio Policy

Apple is overhauling how it handles Siri recordings, pledging to end human review of Siri requests by default after users voiced privacy concerns that their conversations with the assistant could be overheard for quality testing. The iOS and macOS assistant has been caught up in the ongoing controversy surrounding all of the major virtual assistants in recent weeks, with privacy advocates warning that human review and intervention are far more prevalent than many users might assume.


That includes snippets of audio being reviewed by human operators, who check how well Alexa, the Google Assistant, Siri, and other assistants are doing at interpreting intent and responding appropriately. While that can be a useful tool for tech firms looking to refine their AI, it also leaves users in the uncomfortable position of not knowing who might be listening to their interactions with their assistant of choice.

Siri audio grading is getting a privacy overhaul

Now, Apple has decided to change the way it does all that. The company suspended its human audio grading program at the beginning of August, after reports that contractors could be hearing snippets of users' conversations. Grading will resume in the fall, but in a very different way.


For a start, audio recordings of Siri interactions will no longer be retained by default. Instead, Apple will continue generating automatic transcriptions of Siri interactions, which involve no human review, and use those to improve the assistant. The transcripts will be saved for six months at most, and associated only with a random identifier rather than a user's Apple ID.

Users will be able to opt in and allow Apple to include their audio samples as it works on enhancing Siri. "We hope that many people will choose to help Siri get better, knowing that Apple respects their data and has strong privacy controls in place," the company says. "Those who choose to participate will be able to opt out at any time."


Even those audio recordings will be accessible to fewer people, however. Apple says that only its own employees will ever hear the audio samples, rather than external contractors. "Our team will work to delete any recording which is determined to be an inadvertent trigger of Siri," Apple promises.

What Siri audio grading is - and what it isn't

Audio grading has become an instrumental part of refining assistant technologies, though Apple insists Siri has always relied on less data sharing than most. For the most part, the assessments were done automatically: the audio was used to create a computer-generated transcript, and sometimes both were then fed into a machine learning process to better train Siri.


"Before we suspended grading, our process involved reviewing a small sample of audio from Siri requests – less than 0.2 percent – and their computer-generated transcripts, to measure how well Siri was responding and to improve its reliability," Apple says. That would help figure out things like whether Siri correctly understood the request, and whether the assistant responded appropriately. It also helped identify possible false activations of Siri.

Those inadvertent triggers have become something of a hot topic for assistants in general, not just Apple's, because of what audio they could accidentally record and upload. After all, if you knowingly issue a command or make a request to Siri, Alexa, or any of the other assistants, you do so understanding that the request will be seen by Apple, Amazon, or whichever company operates that assistant.


Inadvertent triggers, however, could end up recording audio users would never expect to have uploaded. That might be as benign as the sound of rustling in your pocket or bag, or as personal as whatever you get up to in the bathroom or bedroom. "We work hard to minimize false triggers and have updated the review process to limit graders' exposure to them," Apple says. "When we resume grading, our team will work to delete any recording which is determined to trigger Siri inadvertently."

If you want to go further, the only real option is to disable Siri and Dictation altogether. The spotlight is now on Amazon, Google, and others to see whether they follow Apple's lead, not only allowing users to opt out of audio sharing, but adopting the same, more stringent rules around how such data is handled overall.
