Apple has announced details of a new system that will detect child sexual abuse material (CSAM) in customers' photos.
The technology searches for matches to known CSAM before an image is stored in iCloud Photos.
Apple said that if a match is found, a human reviewer will assess the case and, if the material is confirmed, report it to law enforcement.
The possibility that the technology could be expanded to scan phones for other prohibited content, or even political speech, has raised privacy concerns.
Experts worry that authoritarian governments could misuse the technology to spy on their citizens.
Apple said the new versions of iOS and iPadOS will include "new applications of cryptography to help limit the spread of CSAM online, while designing for user privacy."
The system compares images against a list of known child sexual abuse images compiled by the National Center for Missing and Exploited Children (NCMEC) and other child-safety organisations.
Those images are translated into "hashes": numerical codes that can be matched against images on an Apple device.
Apple says the technology will also catch edited versions of the original images that remain visually identical.
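The matching described above relies on perceptual hashing: unlike a cryptographic hash, a perceptual hash changes little when an image is lightly edited, so near-duplicates can be found by comparing hashes rather than pixels. The sketch below illustrates the general idea with a toy "average hash" over a tiny grayscale image; it is not Apple's actual algorithm (Apple's NeuralHash is a neural network, and the function names and threshold here are illustrative assumptions).

```python
# Toy illustration of perceptual-hash matching. This is NOT Apple's
# NeuralHash; "average hash" merely stands in for the general technique.

def average_hash(pixels):
    """Hash a grayscale image (a list of 0-255 ints): one bit per
    pixel, set when the pixel is brighter than the image's mean."""
    mean = sum(pixels) / len(pixels)
    return tuple(1 if p > mean else 0 for p in pixels)

def hamming(h1, h2):
    """Count the bits that differ between two hashes."""
    return sum(a != b for a, b in zip(h1, h2))

def matches(candidate, known_hashes, threshold=2):
    """True if the candidate hash is within `threshold` bits of any
    hash in the known list (threshold chosen for illustration)."""
    return any(hamming(candidate, known) <= threshold
               for known in known_hashes)

# A 3x3 "image" and a lightly edited copy (one pixel brightened):
original = [10, 200, 30, 220, 15, 210, 25, 205, 20]
edited   = [10, 200, 30, 220, 15, 210, 25, 205, 90]
unrelated = [0] * 9

known = [average_hash(original)]
print(matches(average_hash(edited), known))     # True: edit survives
print(matches(average_hash(unrelated), known))  # False: no match
```

Because the hash depends on each pixel's brightness relative to the whole image, the small edit leaves the hash unchanged, while an unrelated image falls outside the matching threshold.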
Apple stated: "Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes."
According to the company, the system has an "extremely high level of accuracy", with less than a one-in-one-trillion chance per year of incorrectly flagging a given account.
Apple says it will manually review each flagged match. It can then disable the user's account and report the material to law enforcement.
The company says the new technology offers "significant" privacy benefits over existing techniques, because Apple only learns about a user's photos if there is a collection of known CSAM in their iCloud Photos account.
Privacy experts, however, have expressed concerns.
Regardless of Apple's long-term plans, said Matthew Green, a security researcher at Johns Hopkins University, the company has sent a clear message that it is safe to build systems that scan phones for banned content. Whether Apple turns out to be right or wrong on that point hardly matters, he argued: it will break the dam, and governments will demand the same from everyone.
Published Fri, 6 Aug 2021 at 00:06:22 (+0000).