
Apple says its new child safety feature will look for images flagged in multiple countries


Recently, Apple introduced a new child safety feature that scans users' iCloud Photos for images of child sexual abuse. The decision was met with severe criticism from privacy advocates.

Now, addressing the criticism, Reuters reports that the tech giant has clarified that it will use the feature only to scan for images that have "been flagged by clearinghouses in multiple countries". The automated scanning system will alert Apple only after an account crosses an initial threshold of 30 flagged images, at which point a human reviewer examines the case.
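For illustration, the reporting rule described above boils down to a simple threshold check. The sketch below is only a toy model of that behaviour, using a hypothetical should_alert helper; Apple's actual system relies on cryptographic threshold secret sharing, so the company learns nothing about an account's matches until the threshold is reached.

```python
# Toy illustration of the reporting threshold described in the article.
# This is NOT Apple's implementation: the real system cannot even count
# matches in the clear below the threshold. The helper name and logic
# here are purely hypothetical.

ALERT_THRESHOLD = 30  # initial threshold mentioned in the article


def should_alert(matched_image_count: int, threshold: int = ALERT_THRESHOLD) -> bool:
    """Return True once an account's matched-image count reaches the threshold."""
    return matched_image_count >= threshold


print(should_alert(29))  # False: below the threshold, no human review is triggered
print(should_alert(30))  # True: threshold reached, case is flagged for human review
```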

The company said that this threshold will be lowered over time. It also made clear that its list of image identifiers is universal and will be identical on every device the feature is applied to.

Apple further explained that its implementation ships an encrypted, on-device database of Child Sexual Abuse Material (CSAM) hashes sourced from at least two organizations operating under the jurisdiction of separate national governments.
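In practical terms, requiring two or more jurisdictions amounts to intersecting independently supplied hash lists. The snippet below is a hypothetical sketch of that idea, with made-up hash values and a build_shared_hash_list helper that does not appear in any Apple documentation; the real database is built from perceptual image hashes and stored on the device only in encrypted form.

```python
# Hypothetical sketch: keep only hashes flagged by clearinghouses in at
# least two separate jurisdictions. Names and values are illustrative only.

def build_shared_hash_list(lists_by_jurisdiction: dict[str, set[str]],
                           min_jurisdictions: int = 2) -> set[str]:
    """Return hashes that appear in lists from `min_jurisdictions` or more jurisdictions."""
    counts: dict[str, int] = {}
    for hashes in lists_by_jurisdiction.values():
        for h in hashes:
            counts[h] = counts.get(h, 0) + 1
    return {h for h, n in counts.items() if n >= min_jurisdictions}


# Example with made-up hash values: only "bbb" and "ccc" are flagged by
# clearinghouses in more than one country, so only they would be included.
lists = {
    "US": {"aaa", "bbb", "ccc"},
    "UK": {"bbb", "ccc", "ddd"},
    "EU": {"ccc", "eee"},
}
print(build_shared_hash_list(lists))  # {'bbb', 'ccc'}
```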

The tech giant did not say whether the backlash had any impact on its position, but it did acknowledge that there was "confusion" around its initial announcements and noted that the program is "still in development".

Source: Reuters
