Apple's Federighi Admits "Hindsight" As iPhone Child Protection Fallout Lingers
A high-profile Apple executive has conceded that the company's big reveal of new child protection tools could've gone more smoothly, with software chief Craig Federighi admitting that "in hindsight" confusion was inevitable. Apple announced plans earlier in August to scan iCloud Photos uploads for illegal images of children, as well as to build new grooming protections into iMessage, and was met with instant pushback from privacy advocates and others.
The two systems, Apple explained, would both focus on protecting younger users, though they work in different ways. On the one hand, iCloud Photos would scan backed-up images for child sexual abuse material (CSAM), comparing them against hashes of known illegal content. If a match was spotted, the user could, after a human review, be reported to the authorities.
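In outline, that first check amounts to comparing each upload's fingerprint against a list of fingerprints of known material. Below is a minimal sketch of the idea, assuming a plain SHA-256 digest and an invented in-memory hash list; Apple's real pipeline uses a perceptual "NeuralHash" and cryptographic private set intersection, neither of which is reproduced here.

```swift
import CryptoKit
import Foundation

// Sketch only: SHA-256 stands in for Apple's perceptual NeuralHash,
// and every name here is invented for illustration.
func hexDigest(of data: Data) -> String {
    SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
}

// A hypothetical database of digests of known illegal images.
let knownDigests: Set<String> = [
    hexDigest(of: Data("stand-in-for-known-image".utf8))
]

// An upload is a candidate match only if its digest appears in the list;
// anything else reveals nothing about the photo.
func isCandidateMatch(_ upload: Data) -> Bool {
    knownDigests.contains(hexDigest(of: upload))
}

print(isCandidateMatch(Data("stand-in-for-known-image".utf8))) // true
print(isCandidateMatch(Data("ordinary-holiday-photo".utf8)))   // false
```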
In Apple's Messages app, meanwhile, parents would be able to turn on new filters for younger users. Should someone subsequently send the child an image the system flagged as explicit, that picture would be blurred out, and guidance offered on how to deal with potential grooming situations. If the child still opened the picture, an alert could optionally be sent to the parents.
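As a rough illustration of that Messages flow, and nothing more, the decision logic might look like the sketch below. The settings struct, the action enum, and the explicitness flag are all invented for the example; Apple has not published this API.

```swift
// Hypothetical types standing in for unpublished parental-control settings.
struct ChildSafetySettings {
    var communicationFiltersEnabled: Bool
    var notifyParentsOnView: Bool
}

enum IncomingImageAction {
    case showNormally
    case blurWithGuidance              // child sees guidance, not the image
    case blurAndAlertParentsIfOpened   // opening the image can notify parents
}

func handleIncomingImage(flaggedAsExplicit: Bool,
                         settings: ChildSafetySettings) -> IncomingImageAction {
    // Nothing changes unless the parent enabled filters AND the image was flagged.
    guard settings.communicationFiltersEnabled, flaggedAsExplicit else {
        return .showNormally
    }
    return settings.notifyParentsOnView ? .blurAndAlertParentsIfOpened
                                        : .blurWithGuidance
}

let settings = ChildSafetySettings(communicationFiltersEnabled: true,
                                   notifyParentsOnView: true)
print(handleIncomingImage(flaggedAsExplicit: true, settings: settings))
// blurAndAlertParentsIfOpened
```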
While Apple may have expected plaudits for the moves, what it received instead was a considerable degree of suspicion about the technology. Concerns were raised about whether governments could demand an expansion of the iCloud Photos scanning, introducing pictures other than CSAM that might be used to track political opposition, protesters, journalists, and more. The technology could be a slippery slope, it was argued, once Apple had demonstrated that such scanning was possible.
Apple pushed back on that earlier this week, with a new FAQ attempting to settle fears. It would resist pressure to expand the scanning beyond its current intended purpose, Apple insisted, and protections had been built in to avoid accidental flagging and reporting. Still, weeks after the announcement, it's clear that the furore isn't going to settle any time soon.
Speaking to the WSJ, Craig Federighi, Apple's senior vice president of software engineering, conceded that the launch could've gone better. Still, he frames the problem as one of misunderstanding among the audience, rather than Apple being primarily at fault.
"It's really clear a lot of messages got jumbled pretty badly in terms of how things were understood," Federighi said. "We wish that this would've come out a little more clearly for everyone because we feel very positive and strongly about what we're doing."
Contrary to concerns that this was a potential hit to user privacy, Federighi argues that the technology – which runs locally on users' devices rather than in the cloud, using "hashed" data derived from CSAM images provided by third parties rather than the images themselves – is in fact more beneficial to privacy overall. There will be "multiple levels of auditability," he says, and the system will be tuned so that, say, innocent photos a parent might take of a child in the bathtub aren't flagged as child pornography.
"If and only if you meet a threshold of something on the order of 30 known child pornographic images matching, only then does Apple know anything about your account and know anything about those images, and at that point, only knows about those images, not about any of your other images," he explains.
There will be multiple child-safety organizations responsible for collating the hashed image lists against which iCloud Photos uploads are compared, the software chief says. An independent auditor will also be verifying that the database contains those images alone. Even so, it's not just external observers voicing concerns; according to a Reuters report, some Apple employees have also been speaking out against the system internally.
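Apple hasn't spelled out exactly how lists from multiple organizations would be combined. One conservative approach, sketched here purely as an assumption, is to ship only the intersection of the providers' lists, so that no single provider can unilaterally insert a hash:

```swift
// Assumption for illustration: only hashes present in EVERY provider's
// list make it into the shipped database.
func shippedHashDatabase(from providerLists: [Set<String>]) -> Set<String> {
    guard var combined = providerLists.first else { return [] }
    for list in providerLists.dropFirst() {
        combined.formIntersection(list)
    }
    return combined
}

let orgA: Set<String> = ["aaa", "bbb", "ccc"]
let orgB: Set<String> = ["bbb", "ccc", "ddd"]
print(shippedHashDatabase(from: [orgA, orgB])) // ["bbb", "ccc"] (order varies)
```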
As close as Federighi gets to an admission that Apple's strategy could be problematic is a reference to the announcement, not its content. "In hindsight," he says, "introducing these two features at the same time was a recipe for this kind of confusion. By releasing them at the same time, people technically connected them and got very scared: What's happening with my messages? The answer is...nothing is happening with your messages."
The new features will go live later in the year, with the arrival of iOS 15, iPadOS 15, watchOS 8, and macOS Monterey. iCloud Photos upload scanning will only take place if users have enabled backups to the service; locally saved images that aren't marked for upload won't be scanned. However, any existing images marked for backup will be scanned too, Apple has confirmed.