Apple has decided to postpone the rollout of a child safety feature that would scan hashes of iCloud Photos uploads to determine whether users are storing child sex abuse material (CSAM).
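The mechanism at issue, comparing a fingerprint of each upload against a database of known-bad hashes, can be illustrated with a minimal sketch. To be clear, this is not Apple's implementation: the real system uses an on-device perceptual hash (NeuralHash) and a private set intersection protocol, neither of which is shown here. A plain SHA-256 set lookup stands in for the matching step, and the hash database is an empty placeholder.

```swift
import Foundation
import CryptoKit

/// Hex-encoded SHA-256 digest of an image's raw bytes. Apple's real system
/// computes a perceptual hash on-device instead of a cryptographic one.
func digest(of imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

/// Known-bad digests. In the real design the database ships inside the OS
/// in a blinded form the device cannot read; empty placeholder here.
let knownHashes: Set<String> = []

/// True if an upload's digest appears in the known-hash set.
func matchesKnownHash(_ imageData: Data) -> Bool {
    knownHashes.contains(digest(of: imageData))
}
```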
Apple has addressed privacy concerns about the scanning system by clarifying that the feature would flag an account only after at least 30 of its iCloud photos matched known CSAM.
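The threshold idea can also be sketched in a few lines. Again, this is only an illustration: Apple's stated design uses threshold secret sharing, so the server can learn nothing about individual matches until the count is reached, whereas a plain per-account counter stands in for that here. The cutoff of 30 is the figure Apple cited.

```swift
let reviewThreshold = 30  // the figure Apple cited

struct AccountMatchState {
    private(set) var matchCount = 0

    /// Records one matching upload and reports whether the account
    /// has now reached the human-review threshold.
    mutating func recordMatch() -> Bool {
        matchCount += 1
        return matchCount >= reviewThreshold
    }
}

// Example: the account is surfaced exactly once, on the 30th match.
var account = AccountMatchState()
for _ in 1...reviewThreshold {
    if account.recordMatch() {
        print("Flag for human review after \(account.matchCount) matches")
    }
}
```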
Apple has provided more details about its child safety photo scanning technologies that have been drawing some fire from critics. It has also described the end-to-end flow of its review process.
In an internal memo, Apple's software VP acknowledged that people are worried about the company scanning iCloud Photos for child sex abuse material, but attributed the concern to misunderstandings about how the feature works.