Apple’s choices in this case, though, go in the opposite direction: instead of adding CSAM-scanning to iCloud Photos in the cloud that they own and operate, Apple is compromising the phone that you and I own and operate, without any of us having a say in the matter. Yes, you can turn off iCloud Photos to disable Apple’s scanning, but that is a policy decision; the capability to reach into a user’s phone now exists, and there is nothing an iPhone user can do to get rid of it.
A far better solution to the “Flickr problem” I started with is to recognize that the proper point of comparison is not the iPhone and Facebook, but rather Facebook and iCloud. One’s device ought be one’s property, with all of the expectations of ownership and privacy that entails; cloud services, meanwhile, are the property of their owners as well, with all of the expectations of societal responsibility and law-abiding which that entails. It’s truly disappointing that Apple got so hung up on its particular vision of privacy that it ended up betraying the fulcrum of user control: being able to trust that your device is truly yours.
Ben Thompson
Apple’s latest initiative to combat child sexual abuse has been met with widespread criticism – and I feel most of it is justified. As Ben Thompson points out in the article quoted above, the underlying law does not require companies to actively scan for CSAM on their services – and yet this is precisely what Apple plans to do.
The entire story combines several troubling tendencies in technology: Apple treating iPhones as if they remain under its permanent control; the blunt use of machine learning and inscrutable algorithms in complex contexts, where models can fail without anyone recognizing their errors; and companies making decisions that affect large sections of the population without any accountability or oversight from democratic governments. Some have even pointed out that Apple has not properly addressed the iPhone’s existing vulnerability to spyware, which leaves this new image-scanning system open to remote exploitation.
Could governments force Apple to add non-CSAM images to the hash list?
Apple will refuse any such demands. Apple’s CSAM detection capability is built solely to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC and other child safety groups. We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future. Let us be clear, this technology is limited to detecting CSAM stored in iCloud and we will not accede to any government’s request to expand it. Furthermore, Apple conducts human review before making a report to NCMEC. In a case where the system flags photos that do not match known CSAM images, the account would not be disabled and no report would be filed to NCMEC.
Expanded Protections for Children - Frequently Asked Questions
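For readers unfamiliar with what a “hash list” does in this context, here is a deliberately simplified sketch of the matching logic the FAQ describes: hashes of photos destined for iCloud are compared against a list of known-CSAM hashes, and only after a match threshold is crossed and a human review confirms the matches is a report made to NCMEC. Every name below (PerceptualHash, ConceptualCSAMScanner, the threshold value) is illustrative; Apple’s actual system relies on NeuralHash, blinded hash databases, and private set intersection, none of which is modeled here.

```swift
import Foundation

// Illustrative stand-in for a perceptual image hash (not NeuralHash).
struct PerceptualHash: Hashable {
    let bytes: [UInt8]
}

struct ScanResult {
    let matchedCount: Int
    let exceedsThreshold: Bool
}

// A conceptual sketch only: real on-device matching is done against a
// blinded database, so the device never learns which photos matched.
final class ConceptualCSAMScanner {
    private let knownHashes: Set<PerceptualHash>   // list supplied by NCMEC and other groups
    private let reportingThreshold: Int            // matches required before any review

    init(knownHashes: Set<PerceptualHash>, reportingThreshold: Int) {
        self.knownHashes = knownHashes
        self.reportingThreshold = reportingThreshold
    }

    // Count how many of the uploaded photo hashes appear in the known list.
    func scan(uploadedHashes: [PerceptualHash]) -> ScanResult {
        let matches = uploadedHashes.filter { knownHashes.contains($0) }.count
        return ScanResult(matchedCount: matches,
                          exceedsThreshold: matches >= reportingThreshold)
    }

    // Per the FAQ, a human reviewer confirms matches before any report is filed;
    // flagged content that does not match known CSAM results in no report.
    func decide(_ result: ScanResult, humanReviewConfirms: Bool) -> String {
        guard result.exceedsThreshold, humanReviewConfirms else {
            return "No report filed; account remains active."
        }
        return "Report filed with NCMEC after human review."
    }
}
```

The governance question in the FAQ is precisely about the `knownHashes` input: whoever controls that list controls what the system looks for, which is why “we will refuse” is a policy promise rather than a technical constraint.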
It is quite easy to uphold principles on paper – but when was the last time Apple said ‘No’ to a demand from the Chinese government?