In a bid to shield children from predators, Apple has introduced new child safety features across its entire ecosystem. As part of the effort, Apple will scan content in the Messages app and iCloud Photos using on-device machine learning. The Cupertino company claims the tools are meant to detect Child Sexual Abuse Material (CSAM) and won't be used to access your messages and photographs.
These new features will roll out later this year to iPhone, iPad, and Mac. Once updated, the Messages app will warn children about sending or receiving explicit photos. For instance, upon receiving an explicit image, the phone will display a message along the lines of: "This could be sensitive to view. Are you sure?" If the child decides to open it anyway, the phone can notify their parents.
The new safety feature will also scan for child abuse imagery in your iCloud Photos. According to the company, photographs are matched using a "cryptographic technology called private set intersection, which determines if there is a match without revealing the result." This process relies on a database of known image hashes provided by the National Center for Missing and Exploited Children (NCMEC). Apple says it cannot view the contents of your iCloud Photos unless the system detects child abuse images, and claims a high level of accuracy, with the chance of incorrectly flagging an account at less than one in a trillion.
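To make the matching idea concrete, here is a deliberately simplified sketch. Apple's actual system uses a perceptual hash (NeuralHash) combined with private set intersection and threshold cryptography, none of which is reproduced here; this toy version just checks an image's exact digest against a list of known-bad hashes, and every function name below is hypothetical.

```python
import hashlib

def digest(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash: in reality Apple computes a
    NeuralHash that tolerates resizing/re-encoding; an exact SHA-256
    digest is used here purely for illustration."""
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_list(image_bytes: bytes, known_hashes: set[str]) -> bool:
    """Return True if the image's digest appears in the known-bad list.
    The real protocol performs this comparison cryptographically, so
    neither side learns the result of a single comparison."""
    return digest(image_bytes) in known_hashes

# Illustrative usage: one "flagged" sample and one innocent photo.
known = {digest(b"flagged-sample")}
print(matches_known_list(b"flagged-sample", known))   # True
print(matches_known_list(b"innocent-photo", known))   # False
```

The key design point the real system adds on top of this naive lookup is that the device never learns which of its photos matched, and Apple learns nothing at all until a threshold number of matches is crossed.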
With the upcoming update, Siri and Search will intervene if a user searches for inappropriate content related to children. The user will be told that "material depicting the sexual abuse of children is illegal" and that anonymous helplines and guidance exist for adults with at-risk thoughts and behavior.
This is a laudable initiative that uses cutting-edge technology to protect children from predators. However, with a few tweaks, the same machinery could be turned against people whose views on politics, healthcare, or the economy don't align with Apple's. For now, we simply have to trust Apple to use this technology ethically, which is a tough ask given that this is the same company that lobbied against a bill aimed at stopping forced labor in China.