Apple expresses regret at confusion over “iPhone scanning”

Image: A woman holding an iPhone (Reuters)

Apple says its announcement of automated tools to detect child sexual abuse material (CSAM) on the iPhone and iPad was “jumbled pretty badly”.

The company announced new image-detection software that alerts Apple if known illegal images are uploaded to its iCloud storage.

Privacy groups criticized the news, and some claimed that Apple had created a security backdoor in its software.

According to the company, its announcement was widely misunderstood.

“We wish that this had come out a little more clearly for everyone,” said Apple software chief Craig Federighi, in an interview with the Wall Street Journal.

In hindsight, he said that adding two elements at once was “a recipe to create confusion.”

What are the new tools?

Apple has announced two new tools designed to protect children. They will launch in the US first.

Image detection

When a user uploads images to iCloud storage, the first tool will check them against known child sexual abuse material.

The US National Center for Missing and Exploited Children maintains a database of known child abuse images. They are stored as “fingerprints” – digital hashes of the illegal material, rather than the images themselves.

Cloud service providers such as Microsoft, Google, and Facebook already check uploads against these fingerprints to ensure that people do not share CSAM.
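To make the fingerprint idea concrete, here is a minimal Python sketch of hash-based matching. The average_hash function and KNOWN_FINGERPRINTS set are illustrative assumptions, not Apple’s technology: Apple’s actual system uses its own NeuralHash algorithm with cryptographic matching, the details of which are not modeled here.

```python
# Minimal sketch: fingerprint matching compares hashes, never raw images.
# average_hash and KNOWN_FINGERPRINTS are illustrative stand-ins, not
# Apple's NeuralHash or the NCMEC database.
from PIL import Image

def average_hash(path: str) -> int:
    """Downscale to 8x8 greyscale; set one bit per pixel above the mean."""
    pixels = list(Image.open(path).convert("L").resize((8, 8)).getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (p >= mean)
    return bits

# In a real system this would be populated from the fingerprint database.
KNOWN_FINGERPRINTS: set[int] = set()

def matches_known_material(path: str) -> bool:
    """Flag the image only if its fingerprint is already in the database."""
    return average_hash(path) in KNOWN_FINGERPRINTS
```

The point of the design is visible even in this toy version: the check never inspects image content directly, only whether a hash is already present in a list of known material.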

Image: Craig Federighi (Reuters)

Apple planned a similar process, but said the matching would happen on the user’s own iPhone or iPad before the images are uploaded to iCloud.

Federighi said the iPhone would not be looking for pornography or for photos of your children in the bathtub.

He said that the system could only match “exact fingerprints” of specific known child sexual abuse images.

Apple will flag accounts that contain images matching child abuse fingerprints.

Federighi said a user would have to upload around 30 matching images before the feature was triggered.
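As a rough illustration of that threshold, the sketch below counts fingerprint matches per account and raises a flag only at 30. This is a deliberately naive model: Apple’s published design reportedly uses threshold secret sharing so that the server can learn nothing below the threshold, which a plain counter does not capture.

```python
# Toy model of "no alert below ~30 matches". A plain counter is an
# assumption for illustration only; Apple's design reportedly uses
# threshold secret sharing rather than a visible running count.
MATCH_THRESHOLD = 30  # the approximate figure Federighi cited

class AccountFlagger:
    def __init__(self) -> None:
        self.matches = 0

    def record_upload(self, fingerprint_matched: bool) -> bool:
        """Return True once the account should be flagged for review."""
        if fingerprint_matched:
            self.matches += 1
        return self.matches >= MATCH_THRESHOLD
```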

Message filtering

In addition to its iCloud tool, Apple announced a parental control feature that parents can choose to enable on their children’s accounts.

If activated, the system checks photographs sent by or to the child via Apple’s iMessage app.

An on-device machine-learning system warns the child if it determines that a photograph contains nudity.

Parents can also opt to be notified if the child chooses to view the photograph.
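A hedged sketch of that opt-in flow is below. The classifier stub stands in for Apple’s on-device model, which is not public, so every name here is a hypothetical assumption.

```python
# Hypothetical sketch of the iMessage flow described above. The
# classifier stub stands in for Apple's on-device ML model; all names
# here are illustrative assumptions, not Apple APIs.
def photo_contains_nudity(photo_bytes: bytes) -> bool:
    """Placeholder for the on-device machine-learning classifier."""
    return False

def handle_child_photo(photo_bytes: bytes, child_views_it: bool,
                       parent_opted_in: bool) -> list[str]:
    """Return the notifications this photo generates, if any."""
    events = []
    if photo_contains_nudity(photo_bytes):
        events.append("warn_child")          # the child is warned first
        if child_views_it and parent_opted_in:
            events.append("notify_parent")   # only if the parent opted in
    return events
```

Note the ordering the article describes: the warning goes to the child in every case, while the parent notification is conditional on both the child viewing the photo and the parent having opted in.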

Criticism

Privacy organizations have expressed concern that this technology could be used by authoritarian governments to spy on their citizens.

WhatsApp chief Will Cathcart called Apple’s decision “very concerning”, while US whistleblower Edward Snowden called the iPhone a “spyPhone”.

Federighi said the “soundbite” that spread after the announcement was that Apple was scanning iPhones for images.

He told The Wall Street Journal, “That’s not the reality.”

He added that Apple felt positive and passionate about the work, and could see that it had been widely misunderstood.

These tools will arrive in the next versions of iOS and iPadOS, which are expected later this year.

Published Fri, 13 August 2021 at 17:45:22 (+0000).
