Apple’s choices in this case, though, go in the opposite direction: instead of adding CSAM-scanning to iCloud Photos in the cloud that they own and operate, Apple is compromising the phone that you and I own and operate, without any of us having a say in the matter. Yes, you can turn off iCloud Photos to disable Apple’s scanning, but that is a policy decision; the capability to reach into a user’s phone now exists, and there is nothing an iPhone user can do to get rid of it.
A far better solution to the “Flickr problem” I started with is to recognize that the proper point of comparison is not the iPhone and Facebook, but rather Facebook and iCloud. One’s device ought to be one’s property, with all of the expectations of ownership and privacy that entails; cloud services, meanwhile, are the property of their owners as well, with all of the expectations of societal responsibility and law-abiding which that entails. It’s truly disappointing that Apple got so hung up on its particular vision of privacy that it ended up betraying the fulcrum of user control: being able to trust that your device is truly yours.
Ben Thompson
Apple’s latest initiative to combat child sexual abuse has been met with widespread criticism – and I feel the majority of it is justified. As Ben Thompson points out in the article above, the underlying law does not require companies to actively scan for CSAM on their services – yet this is precisely what Apple plans to do.
One of the basic problems with Apple's approach is that they seem desperate to avoid building a real trust and safety function for their communications products. There is no mechanism to report spam, death threats, hate speech, NCII, or any other kinds of abuse on iMessage.
— Alex Stamos (@alexstamos) August 7, 2021
Instead, we get an ML system that is only targeted at <13 year-olds (not the largest group of sextortion/grooming targets in my experience), that gives kids a choice they aren't equipped to make, and notifies parents instead of Apple T&S.
— Alex Stamos (@alexstamos) August 7, 2021
Reminder: Apple sells iPhones without FaceTime in Saudi Arabia, because local regulation prohibits encrypted phone calls. That's just one example of many where Apple's bent to local pressure.
— 🇪🇺🇫🇷Nadim Kobeissi🇫🇷🇪🇺 (@kaepora) August 5, 2021
What happens when local regulation mandates that messages be scanned for homosexuality?
I read the information Apple put out yesterday and I'm concerned. I think this is the wrong approach and a setback for people's privacy all over the world.
— Will Cathcart (@wcathcart) August 6, 2021
People have asked if we'll adopt this system for WhatsApp. The answer is no.
The fucked up thing here is that design decisions about a worldwide surveillance architecture of social control get made by a small clique of individual companies, with no accountability to the billions of people their decisions affect, and no consequences for getting it wrong. https://t.co/m8cgRYgVUN
— Pinboard (@Pinboard) August 6, 2021
The entire story combines several negative tendencies in technology: Apple treating iPhones as being under its permanent control; the blunt use of machine learning and inscrutable algorithms in complex contexts, where models can fail without anyone recognizing their errors; and companies making decisions that affect large sections of the population without any accountability or oversight from democratic governments. Some have even pointed out that Apple has not properly addressed the iPhone’s vulnerability to spyware, leaving this new image-scanning system open to remote exploitation.
Could governments force Apple to add non-CSAM images to the hash list?
Apple will refuse any such demands. Apple’s CSAM detection capability is built solely to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC and other child safety groups. We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future. Let us be clear, this technology is limited to detecting CSAM stored in iCloud and we will not accede to any government’s request to expand it. Furthermore, Apple conducts human review before making a report to NCMEC. In a case where the system flags photos that do not match known CSAM images, the account would not be disabled and no report would be filed to NCMEC.
Expanded Protections for Children - Frequently Asked Questions
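Stripped of the cryptography, the detection step Apple describes amounts to comparing a hash of each photo against a list of hashes of known images. The following is a minimal illustrative sketch only: Apple’s actual system uses NeuralHash (a perceptual hash), private set intersection, and a match threshold before human review, none of which appear here, and the blocklist contents are hypothetical.

```python
import hashlib

# Hypothetical blocklist of hashes of known images. In the real system
# this list would be supplied by NCMEC and other child-safety groups;
# here it is a stand-in for illustration.
BLOCKLIST = {
    hashlib.sha256(b"known-image-bytes").hexdigest(),
}

def matches_blocklist(image_bytes: bytes) -> bool:
    """Return True if this image's hash appears in the blocklist."""
    return hashlib.sha256(image_bytes).hexdigest() in BLOCKLIST

print(matches_blocklist(b"known-image-bytes"))  # True: hash is on the list
print(matches_blocklist(b"harmless-photo"))     # False: hash is not on the list
```

Note what the sketch makes plain: nothing in the matching logic itself constrains what goes into the blocklist. Whether it contains only CSAM hashes is purely a policy question about who populates the list – which is exactly the expansion concern the critics above are raising.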
It is quite easy to uphold principles on paper – but when was the last time Apple said ‘No’ to a demand from the Chinese government?