In August, Apple detailed several new features intended to stop the dissemination of child sexual abuse material (CSAM). Edward Snowden, privacy advocates, and cryptographers all reacted swiftly to Apple's decision to not only scan iCloud photos for CSAM but also to check your iPhone or iPad for matches. After weeks of protest, Apple has finally relented. For now, at least.
In a statement released on Friday, the company said it had planned features to help protect children from predators who use communication tools to recruit and exploit them, and to limit the spread of Child Sexual Abuse Material. It added that it is taking additional time to gather feedback and improve these critical child safety features, based on input from customers, advocacy groups, researchers, and other stakeholders.
Apple offered no further guidance on what form those improvements might take or how they would work. Privacy advocates and security experts were cautiously optimistic about the pause.
"I believe this is an intelligent move by Apple," says Alex Stamos, former chief security officer at Facebook and cofounder of the cybersecurity consultancy Krebs Stamos Group. The problem is extraordinarily complex, he adds, and Apple needed to weigh many factors before arriving at a solution.
CSAM scanners work by generating cryptographic "hashes" of known abusive images, a kind of digital fingerprint, and then combing through large amounts of data for matches. Many companies already do this in some form, including Apple for iCloud Mail. The company had planned to expand that scanning to iCloud Photos, but it also proposed the additional step of checking those hashes on your device, if you have an iCloud account.
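As a rough illustration of that matching workflow, here is a minimal Python sketch. It uses an ordinary cryptographic hash (SHA-256) for exact matching; Apple's NeuralHash is a perceptual hash designed to survive resizing and re-encoding, and the known-hash value below is purely a placeholder.

```python
import hashlib
from pathlib import Path

# Hypothetical set of hex digests of known flagged images. Apple's real
# system uses NeuralHash, a perceptual hash that tolerates resizing and
# re-encoding; SHA-256 is used here only to show the exact-match workflow.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",  # placeholder
}

def file_hash(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def scan_library(library: Path) -> list[Path]:
    """Return the files in a photo library whose hashes match the known set."""
    return [p for p in library.rglob("*.jpg") if file_hash(p) in KNOWN_HASHES]
```

A perceptual hash differs from this sketch in one key way: visually similar images produce similar (or identical) hashes even after compression or cropping, which is what makes collision behavior worth scrutinizing.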
The introduction of that ability to compare images on your phone against a set of known CSAM hashes, provided by the National Center for Missing and Exploited Children, immediately raised concerns that the tool could someday be put to other uses. Riana Pfefferkorn, a research fellow at the Stanford Internet Observatory, says that Apple would effectively have installed a CSAM-scanning capability on every phone, one that governments could, and would, subvert to make Apple search for other material.
Apple has repeatedly refused United States government demands to build a tool that would let law enforcement unlock and decrypt iOS devices. But the company has also made concessions in countries like China, where customer data lives on state-owned servers. The CSAM tool arrived at an especially fraught moment, as legislators around the world have intensified their efforts to weaken encryption.
"They clearly feel this is politically challenging," says Matthew Green, a cryptographer at Johns Hopkins University. If Apple feels it must scan, he argues, it should scan the unencrypted files stored on its servers, which is standard practice at companies like Facebook, which routinely scan for CSAM and terrorist content. Green also suggests that Apple make iCloud storage end-to-end encrypted, so that it couldn't view those images even if it wanted to.
Apple's plans were also controversial on technical grounds. Hashing algorithms can generate false positives, mistakenly identifying two images as a match when they are not. Those errors, called "collisions," are especially alarming in the context of CSAM. Shortly after Apple's announcement, researchers began finding collisions in the iOS "NeuralHash" algorithm Apple intended to use. Apple said at the time that the version of NeuralHash available to study was not exactly the one that would be used in the scheme, and that its system was accurate. Paul Walsh, founder and CEO of the security company MetaCert, says collisions may have little impact in practice, because Apple's system requires 30 matching hashes before sounding any alarms, after which human reviewers can tell the difference between CSAM and a false positive.
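In other words, a single collision does nothing on its own; matches accumulate, and only crossing the threshold triggers human review. A minimal sketch of that counting logic follows. The 30-match threshold comes from Apple's public description; Apple's actual design uses cryptographic threshold secret sharing so its servers learn nothing until the threshold is reached, which this simplification omits.

```python
MATCH_THRESHOLD = 30  # from Apple's public description of its system

def needs_human_review(image_hashes: set[str], known_hashes: set[str]) -> bool:
    """Flag an account for human review only once enough matches accumulate
    that stray hash collisions (false positives) are an unlikely cause."""
    return len(image_hashes & known_hashes) >= MATCH_THRESHOLD
```

The threshold is a deliberate trade-off: a higher value makes accidental flagging from collisions less likely, at the cost of tolerating more matched images before anyone is alerted.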
What changes Apple might make to satisfy its critics remain unclear. Both Green and Pfefferkorn suggest that the company could limit its scanning to shared iCloud albums rather than involving customers' devices. And Stamos says the NeuralHash episode underscores the need to involve the full research community from the start, especially for an untested new technology.
Others insist that the company should go beyond a temporary pause. Evan Greer, deputy director of the digital rights nonprofit Fight for the Future, called Apple's proposal to scan photos and messages on its devices one of the most dangerous in tech history, saying it is encouraging that backlash forced Apple to postpone its reckless surveillance plan, but that the plan is not safe and must be abandoned.
Apple's decision to delay its plans is an unusual concession from a company generally disinclined to make them. "I'm stunned, frankly," says Pfefferkorn. It would be hard for Apple to say it is dropping the plans entirely, she adds, but hitting pause is still an enormous deal.
Additional reporting from Andy Greenberg.
Published Fri, 3 Sep 2021 at 17:22:22 +0000