A group of cybersecurity researchers recently stated that Apple’s planned iCloud photo scanning tech designed to detect images of child sexual abuse is “invasive, ineffective, and dangerous.”
The New York Times reports that a group of researchers has published a 46-page study criticizing Apple’s planned CSAM (Child Sexual Abuse Material) scanning tech, which would comb through iCloud users’ photos and videos for any possible abusive media.
According to the researchers, documents published by the European Union show that the EU’s governing body is considering implementing a similar program to scan encrypted mobile devices for child sexual abuse imagery as well as any possible organized crime and terrorism-related imagery.
The researchers stated that: “It should be a national-security priority to resist attempts to spy on and influence law-abiding citizens.” The researchers added that they published their findings recently in an effort to inform the European Union of the possible dangers of its plan.
The New York Times writes:
“The expansion of the surveillance powers of the state really is passing a red line,” said Ross Anderson, a professor of security engineering at the University of Cambridge and a member of the group.
Aside from surveillance concerns, the researchers said, their findings indicated that the technology was not effective at identifying images of child sexual abuse. Within days of Apple’s announcement, they said, people had pointed out ways to avoid detection by editing the images slightly.
“It’s allowing scanning of a personal private device without any probable cause for anything illegitimate being done,” added another member of the group, Susan Landau, a professor of cybersecurity and policy at Tufts University. “It’s extraordinarily dangerous. It’s dangerous for business, national security, for public safety and for privacy.”
Apple faced significant pushback against the program from privacy advocates such as Edward Snowden, security researchers, and even employees from within its own firm. Apple initially tried to calm fears over the issue by releasing detailed information on its plans but eventually delayed the rollout of the feature in order to give the firm time to make “improvements” to the CSAM system.
Lucas Nolan is a reporter for Breitbart News covering issues of free speech and online censorship. Follow him on Twitter @LucasNolan or contact via secure email at the address lucasnolan@protonmail.com