Apple delays plans for CSAM detection in iOS 15
Apple has delayed plans to roll out the child sexual abuse material (CSAM) detection technology it controversially announced last month, citing feedback from customers and policy groups.

That feedback, if you recall, has been largely negative. The Electronic Frontier Foundation said this week that it has received more than 25,000 signatures from users. On top of that, close to 100 policy and rights groups, including the American Civil Liberties Union, also called on Apple to abandon plans to roll out the technology.

In a statement on Friday morning, Apple told TechCrunch:

Last month, we revealed plans to add features to protect children against predators who use communication tools to exploit and recruit them, and to limit the spread and abuse of Child Sexual Abuse Material. We have taken additional time to gather feedback and improve these critical child safety features based on the input of customers, advocacy groups, researchers, and other stakeholders.

Apple’s NeuralHash technology is designed to detect known CSAM on a user’s device without having access to the images or knowing the contents of those images. Because a user’s photos stored in iCloud are end-to-end encrypted so that even Apple can’t access the data, NeuralHash instead scans for known CSAM on the user’s device, which Apple claims is more privacy-friendly than the blanket scanning that cloud providers currently use.
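To make the idea of on-device matching concrete, here is a minimal, hypothetical sketch. It is not Apple's NeuralHash: NeuralHash is a perceptual hash designed to tolerate small edits to an image, whereas this toy example uses a plain SHA-256 of the file bytes; the hash set, directory, and function names are illustrative assumptions, not anything from Apple's design.

```python
# Illustrative sketch of on-device matching against a set of known image hashes.
# This is NOT Apple's NeuralHash: a real system uses a perceptual hash that is
# robust to resizing and re-encoding, and matches are verified before any action.
import hashlib
from pathlib import Path

# Hypothetical database of digests of known images, shipped to the device
# as opaque hash values (placeholder strings only).
KNOWN_DIGESTS = {
    "placeholder-digest-1",
    "placeholder-digest-2",
}

def digest(photo: Path) -> str:
    """Hash the raw image bytes; the image itself never leaves the device."""
    return hashlib.sha256(photo.read_bytes()).hexdigest()

def scan_photos(photo_dir: Path) -> list[Path]:
    """Return the photos whose digest matches a known entry."""
    return [p for p in sorted(photo_dir.glob("*.jpg")) if digest(p) in KNOWN_DIGESTS]

if __name__ == "__main__":
    matches = scan_photos(Path("~/Pictures").expanduser())
    print(f"{len(matches)} photo(s) matched the known-hash list")
```

The point of this kind of design, as Apple describes it, is that only the result of the comparison, not the photo itself, ever needs to leave the device.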

But security experts and privacy advocates have expressed concern that the system could be abused by highly resourced actors, like governments, to implicate innocent victims or to manipulate the system to detect other materials that authoritarian nation states find objectionable.


Published Fri, 3 Sep 2021 at 13:17:40 +0000
