Thousands sign letter asking Apple to scrap plans to scan users' photos for child abuse images


An open letter signed by privacy advocates, security experts, technology companies and legal specialists has been sent to Apple, decrying the company's plans to scan the photos of Mac, iPad and iPhone users for evidence of child abuse.

While on the face of it Apple's "Expanded Protections for Children" plans are a good thing, they have also come in for heavy criticism. With the release of macOS Monterey, iOS 15 and iPadOS 15, the company is implementing CSAM (Child Sexual Abuse Material) detection, which will check image hashes against databases of known abuse images. The move has been likened to creating a backdoor to users' files, and it has horrified privacy experts.
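At its simplest, hash-based detection amounts to computing a fingerprint of each photo and checking it against a set of known fingerprints. The sketch below illustrates that lookup step only; the hash function, database contents and function names are placeholders for illustration, not Apple's actual NeuralHash pipeline.

```python
import hashlib

# Placeholder: a real CSAM system uses a perceptual hash (such as Apple's
# NeuralHash), not a cryptographic hash, so that re-encoded or resized
# copies of an image still match. SHA-256 stands in for illustration only.
def image_fingerprint(image_bytes: bytes) -> str:
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical database of known-image fingerprints (in the real system
# this is derived from NCMEC's database and shipped in blinded form).
KNOWN_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def is_known_image(image_bytes: bytes) -> bool:
    """Return True if the image's fingerprint appears in the database."""
    return image_fingerprint(image_bytes) in KNOWN_HASHES

print(is_known_image(b""))  # True: the empty input matches the sample hash
```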

In a post in the Child Safety section of its website, Apple says that it wants "to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material (CSAM)". It is with this in mind that the company plans to flag sensitive images sent via Messages and to scan photos uploaded to iCloud for "known CSAM images".

Or, as Apple puts it:

Apple is introducing new child safety features in three areas, developed in collaboration with child safety experts. First, new communication tools will enable parents to play a more informed role in helping their children navigate communication online. The Messages app will use on-device machine learning to warn about sensitive content, while keeping private communications unreadable by Apple.

Next, iOS and iPadOS will use new applications of cryptography to help limit the spread of CSAM online, while designing for user privacy. CSAM detection will help Apple provide valuable information to law enforcement on collections of CSAM in iCloud Photos.

Finally, updates to Siri and Search provide parents and children expanded information and help if they encounter unsafe situations. Siri and Search will also intervene when users try to search for CSAM-related topics.

There are several concerning elements here, not least the potential for false positives: perceptual image hashes are not unique, so an entirely innocent image could produce the same hash as a known abuse image.
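To see why collisions are possible, consider a classic perceptual hash such as difference hashing (dHash), which reduces an image to 64 bits; by the pigeonhole principle, many distinct images must share a hash. Apple's NeuralHash is a different, neural-network-based scheme, but the same compression argument applies. This sketch assumes the Pillow library and is illustrative only:

```python
from PIL import Image

def dhash(image: Image.Image, hash_size: int = 8) -> int:
    """Classic difference hash: shrink to (hash_size+1) x hash_size
    greyscale, then record whether each pixel is brighter than its
    right-hand neighbour. The result is a 64-bit fingerprint."""
    small = image.convert("L").resize((hash_size + 1, hash_size))
    pixels = list(small.getdata())
    bits = 0
    for row in range(hash_size):
        for col in range(hash_size):
            left = pixels[row * (hash_size + 1) + col]
            right = pixels[row * (hash_size + 1) + col + 1]
            bits = (bits << 1) | (left > right)
    return bits

# Only 2**64 fingerprints exist, so infinitely many distinct images
# collide; any two images whose shrunken brightness gradients agree
# (e.g. two different flat-colour images) hash identically.
a = Image.new("RGB", (400, 300), "red")
b = Image.new("RGB", (640, 480), "blue")
print(dhash(a) == dhash(b))  # True: different images, same hash
```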

Among those voicing worries about the plans is Edward Snowden. As well as expressing deep concerns about what Apple is proposing to do, he was appalled by the company's response to the backlash, in which it referred to those who disagreed with its plans as "screeching voices".

Privacy advocates at the EFF (Electronic Frontier Foundation) say it is a huge invasion of privacy:

Apple's plan for scanning photos that get uploaded into iCloud Photos is similar in some ways to Microsoft's PhotoDNA. The main product difference is that Apple's scanning will happen on-device. The (unauditable) database of processed CSAM images will be distributed in the operating system (OS), the processed images transformed so that users cannot see what the image is, and matching done on those transformed images using private set intersection where the device will not know whether a match has been found. This means that when the features are rolled out, a version of the NCMEC CSAM database will be uploaded onto every single iPhone. The result of the matching will be sent up to Apple, but Apple can only tell that matches were found once a sufficient number of photos have matched a preset threshold.

Once a certain number of photos are detected, the photos in question will be sent to human reviewers within Apple, who determine that the photos are in fact part of the CSAM database. If confirmed by the human reviewer, those photos will be sent to NCMEC, and the user's account disabled. Again, the bottom line here is that whatever privacy and security aspects are in the technical details, all photos uploaded to iCloud will be scanned.

Make no mistake: this is a decrease in privacy for all iCloud Photos users, not an improvement.
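Apple's actual protocol combines private set intersection with threshold secret sharing, so the server cryptographically learns nothing until enough matches accumulate. The full cryptography is well beyond a few lines, but the threshold behaviour the EFF describes can be reduced to plain logic. Everything below, including the threshold value, is illustrative, not Apple's implementation:

```python
from typing import Iterable

def review_account(match_flags: Iterable[bool], threshold: int = 30) -> bool:
    """Illustrative threshold logic only: in the real system each upload
    carries an encrypted 'safety voucher', and the server cannot decrypt
    any voucher until more than `threshold` vouchers match. Here that is
    reduced to counting boolean match results."""
    matches = sum(1 for flagged in match_flags if flagged)
    # Below the threshold, nothing is revealed or escalated.
    if matches < threshold:
        return False
    # At or above the threshold, the matching photos would be decrypted
    # and passed to human reviewers, per the EFF's description above.
    return True

# 29 matching photos: stays sealed. 30: escalated for human review.
print(review_account([True] * 29))  # False
print(review_account([True] * 30))  # True
```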

In a massive coming together of like minds, thousands have signed an open letter that complains that "Apple's proposal introduces a backdoor that threatens to undermine fundamental privacy protections for all users of Apple products".

The letter says:

Apple's current path threatens to undermine decades of work by technologists, academics and policy advocates towards strong privacy-preserving measures being the norm across a majority of consumer electronic devices and use cases. We ask that Apple reconsider its technology rollout, lest it undo that important work.

The signatories make two requests:

  1. Apple Inc.'s deployment of its proposed content monitoring technology is halted immediately.
  2. Apple Inc. issue a statement reaffirming their commitment to end-to-end encryption and to user privacy.

The full letter and a list of those who have signed it can be seen here.

