Apple Child Safety CSAM Scanning To Expand On A Per-Country Basis
Apple's much-lauded privacy efforts hit a sour note a few days ago when it announced a new feature intended to protect children by reporting illegal content stored in a user's iCloud Photos account. While both sides of the argument agree that kids must be protected by cracking down on Child Sexual Abuse Material, or CSAM, critics argue that the system could be abused by other governments. Apple now clarifies that this won't be the case, since it will consider the CSAM detection's rollout on a case-by-case, per-country basis.
Privacy advocates have unsurprisingly labeled Apple's new CSAM scanning feature as spyware or surveillance software because of how it could violate a person's privacy, despite Apple's assurances. At the root of the contention is the method used to detect CSAM in photos, which relies on AI and machine learning so that humans don't have to scan photos manually. While that is itself a form of privacy protection, it also opens the door to abuse and potential privacy violations.
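To make the mechanics more concrete, the sketch below shows one way automated detection of known material can work: comparing a photo's perceptual hash against a database of hashes of already-identified images, and only flagging an account after multiple matches. This is a hedged illustration, not Apple's actual NeuralHash implementation; the function names, the Hamming-distance cutoff, and the 30-match review threshold (roughly the figure Apple has cited publicly) are illustrative assumptions.

```swift
import Foundation

// Illustrative only: a toy perceptual-hash matcher, NOT Apple's NeuralHash.
// Hashes are modeled as 64-bit integers; two images "match" when the Hamming
// distance between their hashes falls under a small threshold.

/// Number of differing bits between two 64-bit hashes.
func hammingDistance(_ a: UInt64, _ b: UInt64) -> Int {
    (a ^ b).nonzeroBitCount
}

/// Returns true if `photoHash` is close enough to any hash in `knownHashes`.
func matchesKnownDatabase(_ photoHash: UInt64,
                          knownHashes: Set<UInt64>,
                          maxDistance: Int = 4) -> Bool {
    knownHashes.contains { hammingDistance($0, photoHash) <= maxDistance }
}

/// Hypothetical account-level check: only flag for human review after the
/// number of matches crosses a threshold.
func shouldFlagForReview(photoHashes: [UInt64],
                         knownHashes: Set<UInt64>,
                         matchThreshold: Int = 30) -> Bool {
    let matches = photoHashes.filter { matchesKnownDatabase($0, knownHashes: knownHashes) }
    return matches.count >= matchThreshold
}

// Example with placeholder values, not real hashes.
let knownHashes: Set<UInt64> = [0xDEADBEEF12345678, 0x0123456789ABCDEF]
let userPhotoHashes: [UInt64] = [0xDEADBEEF12345679] // one bit away from a known hash
print(shouldFlagForReview(photoHashes: userPhotoHashes, knownHashes: knownHashes)) // false: only 1 match, below threshold
```

The point critics seize on is visible even in this toy version: the system's behavior is determined entirely by the hash database it is fed, which is what drives the abuse scenarios described below.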
Critics argue that the machine-learning system can be fed other data, intentionally or accidentally, causing it to detect and report content unrelated to CSAM. The technology could, for example, be used as a mass surveillance system against activists in countries with more repressive governments. Apple has already indicated its intention to expand its CSAM detection to iPhones and iPads around the world, adding fuel to the controversy.
Apple has now clarified that it won't be making a blanket rollout without considering the specifics of each market's laws. This could provide some comfort to citizens of China, Russia, and other countries with strict censorship laws. CSAM detection will roll out first in the US, where Apple has long positioned itself as a staunch privacy ally.
That might not satisfy privacy advocates, however, since they see the system itself as open to abuse. Apple has repeatedly denounced the creation of backdoors into strong security systems, yet it is now being criticized for creating exactly that, no matter how narrow or well-planned that backdoor may be.