Apple Backs Down on Controversial Photo-Scanning Plan

The company has hit pause after weeks of backlash against its new system for detecting child sexual abuse material on users’ devices.

In August, Apple detailed several new features intended to stop the dissemination of child sexual abuse material (CSAM). Edward Snowden, privacy advocates, and cryptographers reacted swiftly to Apple’s decision not only to scan iCloud photos for CSAM but also to check for matches on your iPhone or iPad. After weeks of protest, Apple has finally relented. For now, at least.

In a statement released Friday, the company said it had announced plans for features to protect children from predators who use communication tools to recruit and exploit them, and to limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers, and other stakeholders, Apple said, it is taking additional time to gather input and improve these critical child safety features.

Apple did not offer any further guidance on what form those improvements might take or how they would work. Privacy advocates and security experts are cautiously optimistic about the pause.

Alex Stamos, former chief security officer at Facebook and a founder of the cybersecurity consultancy Krebs Stamos Group, says, “I believe this is an intelligent move by Apple.” The problem is extremely complex, he adds, and Apple needed to weigh many competing considerations before settling on a solution.

CSAM scanners work by generating cryptographic “hashes” of known abusive images and then searching large amounts of data for matches. Many companies already do this kind of scanning, and Apple itself does it for iCloud Mail. The company planned to expand that scanning to iCloud Photos, but it also proposed an additional step: checking photos against the hash list on your device itself, if you have an iCloud account.
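To make the mechanics concrete, here is a minimal Python sketch of the hash-list matching idea. It is illustrative only, not Apple’s implementation: Apple’s proposal used a perceptual hash (NeuralHash) plus a cryptographic private-set-intersection protocol, whereas this sketch uses plain SHA-256 exact matching, which fails if an image is so much as re-encoded or resized. The KNOWN_HASHES entry and the photos directory are hypothetical placeholders.

```python
import hashlib
from pathlib import Path

# Hypothetical stand-in for a database of known-CSAM fingerprints.
# A real list would come from NCMEC; production systems (including
# Apple's proposal) use perceptual hashes such as NeuralHash rather
# than SHA-256, so altered copies of an image still match.
KNOWN_HASHES = {"0" * 64}  # placeholder entry for illustration

def file_hash(path: Path) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def find_matches(photo_dir: Path) -> list[Path]:
    """Flag files whose digest appears in the known-hash set."""
    return [
        p for p in photo_dir.iterdir()
        if p.is_file() and file_hash(p) in KNOWN_HASHES
    ]

if __name__ == "__main__":
    hits = find_matches(Path("photos"))  # hypothetical directory
    print(f"{len(hits)} file(s) matched the known-hash list")
```

The design choice that drew scrutiny was not the matching itself, which is standard, but where it runs: moving the comparison from the server onto the phone is what raised the concerns described below.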

The introduction of that ability to compare images on your phone against a set of known CSAM hashes, provided by the National Center for Missing and Exploited Children, immediately raised concerns that the tool could someday be put to other uses. Riana Pfefferkorn, a research fellow at the Stanford Internet Observatory, says Apple would have been installing CSAM-scanning functionality onto every phone, functionality that governments could, and would, subvert to make Apple search for other material.

Apple has refused multiple requests from the United States government to build a tool that would let law enforcement unlock and decrypt iOS devices. But the company has also made concessions to countries like China, where customers’ iCloud data lives on state-owned servers. With legislators around the globe intensifying their efforts to undermine encryption, the CSAM tool arrived at an especially fraught moment.

Published Fri, 3 Sep 2021 17:22:22 +0000
