Apple Has Called Off Its Child Sexual Abuse Scan Campaign. For Now
We can breathe a sigh of relief.
Apple is pushing back the launch of its child protection features, including the one designed to scan consumers' private photos for child sexual abuse material (CSAM), after public outcry and criticism that emphasized the damage such measures would do to user privacy, according to an initial report from The Verge.
But the megacorporation still plans to release the child safety features after further tweaking.
Apple makes a significant pivot on child abuse scans
"Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material," said Apple in The Verge report. "Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features." This comes nearly a month after Apple's initial reveal of plans to scan every single iPhone in the U.S. for images of child sexual abuse, which raised substantial concerns about the prospect of entrusting private information to a private megacorporation that isn't exactly subject to public consent. If Apple had gone forward with its original plans, it could have given birth to a new sphere of legitimized surveillance on normal citizens.
Apple has also added a statement to the top of its original CSAM press release, which outlined three substantial changes underway. The first suggested that Search and Siri would point users toward relevant CSAM-prevention resources if they searched for information on the issue. The other two changes proved more controversial: one would notify parents if their kids received or sent sexually explicit photos, in addition to blurring those photos for the kids themselves. The third feature was designed to scan images backed up to a user's iCloud Photos for signs of CSAM and automatically report them to Apple moderators, who could then pass any findings to the National Center for Missing and Exploited Children (NCMEC).
Privacy standards may be eroding across multiple industries
Apple had argued that its iCloud Photos scanning campaign didn't subvert user privacy, providing a lengthy explanation of how photos destined for iCloud would be scanned on the user's iOS device and compared against a database of image hashes provided by the NCMEC and other organizations committed to child safety. But the response from security and privacy experts was a firm "no": they countered that this would amount to an on-device surveillance system that violates the trust consumers had placed in Apple's services. An August 5 statement from the Electronic Frontier Foundation warned of darker uses should the company's CSAM measures go forward, arguing that good intentions would be swiftly negated, breaking "key promises of the messenger's encryption itself" and opening "the door to broader abuses."
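For readers curious about the mechanics, the core idea Apple described is hash matching: derive a fingerprint from each photo and check it against a database of fingerprints of known abuse imagery. The sketch below is a deliberately simplified illustration of that idea, not Apple's implementation; the KNOWN_HASHES set, the file paths, and the use of a plain SHA-256 digest (rather than the perceptual-style hashing a production system would need so that resized or re-encoded copies still match) are all assumptions made purely for demonstration.

```python
import hashlib
from pathlib import Path

# Hypothetical stand-in for a database of hashes of known abuse imagery,
# which Apple said would be supplied by NCMEC and other child-safety groups.
KNOWN_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}


def image_hash(path: Path) -> str:
    """Return a hex digest for an image file.

    Illustrative only: this uses a plain SHA-256 of the file bytes, whereas a
    real matching system would use a perceptual hash so that visually
    identical but re-encoded copies still produce the same fingerprint.
    """
    return hashlib.sha256(path.read_bytes()).hexdigest()


def flag_matches(photo_dir: Path) -> list[Path]:
    """Return photos whose fingerprints appear in the known-hash database."""
    return [
        photo
        for photo in photo_dir.glob("*.jpg")
        if image_hash(photo) in KNOWN_HASHES
    ]


if __name__ == "__main__":
    # Hypothetical local folder standing in for a photo library.
    for match in flag_matches(Path("~/Pictures").expanduser()):
        print(f"Would be flagged for human review: {match}")
```

The privacy dispute turns less on this matching step itself than on where it runs: critics objected that performing the comparison on the user's own device, rather than on a server, builds a surveillance capability directly into hardware people own.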
"Apple is compromising the phone that you and I own and operate," argued Ben Thomson in his own critical piece at Stratechery, "without any of us having a say in the matter." Whether this is a stroke of luck or the first of a new, more democratic strategy from Apple that's more receptive to the will of the public, remains to be seen. But, since the company could still implement CSAM measures that are deemed a substantial overreach by public organizations and typical iPhone users, this could become nothing more than a short extension on the potential erosion of privacy standards.
This was a developing story and was regularly updated as new information became available.