Apple defends new photo scanning child protection

After a backlash from privacy advocates and customers, Apple has defended its new system that scans iPhones for child sexual abuse material (CSAM).

The technology searches for matches against known abuse imagery before a photo is stored in iCloud.

Critics warned it could be a “backdoor” to spy on people, and more than 5,000 people and organisations have signed an open letter against the technology.

Apple has pledged not to “expand the system” for any reason.

Last week, digital privacy campaigners warned that authoritarian governments could use the technology to bolster anti-LGBT surveillance or to clamp down on dissidents in countries where protests have been deemed illegal.

Apple has stated that it will not accede to any government request to expand the system.

It published a question-and-answer document, saying it had numerous safeguards in place to stop its systems from being used for anything other than the detection of child abuse imagery.

“We have repeatedly refused to comply with government mandates that would compromise the privacy rights of our users,” the company said, adding that it would continue to reject such demands.

Apple has, however, made concessions in the past in order to keep operating in certain countries.

Last New Year’s Eve, for example, the company removed 39,000 apps from its Chinese App Store amid a crackdown by the authorities on unlicensed games.

Apple says its anti-CSAM tool will not allow the company to scan users’ photo albums wholesale; only photos uploaded to iCloud will be checked.

The system will look for matches on the device itself, based on a list of hashes of known CSAM images provided by child safety organisations.
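
As a rough illustration of what hash-list matching looks like, here is a minimal Swift sketch. It is conceptual only: Apple’s actual system uses NeuralHash, a perceptual hash designed to match images even after resizing or re-encoding, together with cryptographic matching protocols, whereas this sketch substitutes an ordinary SHA-256 digest and an invented placeholder hash list.

```swift
import Foundation
import CryptoKit

// Placeholder hash list standing in for the database supplied by
// child safety organisations (this value is invented for illustration).
let knownBadHashes: Set<String> = [
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b"
]

// Hash the raw image bytes. A SHA-256 digest is used here purely as a
// stand-in; the real system uses a perceptual hash (NeuralHash).
func hashForImage(_ imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

// Check a photo against the known list before it is uploaded.
func matchesKnownMaterial(_ imageData: Data) -> Bool {
    knownBadHashes.contains(hashForImage(imageData))
}
```

The distinction matters: a cryptographic digest such as SHA-256 changes completely if a single pixel changes, which is why a real deployment relies on a perceptual hash that tolerates minor alterations.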

Apple claims there is almost no chance of innocent people being wrongly flagged, stating that the chance of incorrectly flagging a given account is less than one in one trillion per year. Positive matches are also subject to human review.
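
For scale, a back-of-the-envelope calculation shows what that figure would imply. The account count below is an assumption made for illustration; the article does not give one.

```swift
// Hypothetical check of the stated error rate at scale.
let perAccountRate = 1.0 / 1_000_000_000_000  // "one in one trillion" per account, per year
let accounts = 1_000_000_000.0                // assumed number of iCloud accounts
let expectedFalseFlags = perAccountRate * accounts
print(expectedFalseFlags)                     // 0.001 — roughly one false flag per thousand years
```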

Privacy advocates, however, argue that the only thing preventing the technology from being put to other uses is Apple’s promise that it will not be.

Digital rights group the Electronic Frontier Foundation, for example, said that “all it would take… is an expansion of the machine learning parameters to look for additional types of content”.

It warned that this was “not a slippery slope”, but “a fully-built system just waiting for external pressure to make the slightest change”.

Apple has also offered reassurances about a second feature, which will warn children and their parents using linked family accounts when sexually explicit photos are sent or received.

According to the company, that feature uses different technology, and Apple says it will not gain access to private messages.

Privacy advocates were outraged by Apple’s decision, but some lawmakers welcomed it.

Sajid Javid, the UK Health Secretary, said it was now time for social media platforms, including Facebook, to do the same.

Published at Mon, 9 Aug 2021 16:46:06 +0000
