Today, Apple announced three new child safety features that will launch with its operating system updates this fall. The implementation details are technically complex, so the full documentation is worth reading if you are concerned about how the features are accomplished.
The first feature is a tool for parents that will be built into Messages. According to Apple:
The Messages app will use on-device machine learning to warn about sensitive content, while keeping private communications unreadable by Apple.
The opt-in tool will “warn children and their parents when receiving or sending sexually explicit photos.”
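Apple has not published code for this, but the general shape of an opt-in, on-device check before a photo is displayed can be sketched. Everything below — the classifier protocol, the types, and the flow — is a hypothetical illustration, not Apple's Messages implementation:

```swift
import Foundation

// Hypothetical sketch of an opt-in, on-device check run before an incoming
// photo is shown. The classifier, types, and flow are illustrative
// assumptions, not Apple's actual Messages implementation.

protocol SensitiveContentClassifier {
    /// Runs entirely on-device; the image data never leaves the phone.
    func isLikelySensitive(_ imageData: Data) -> Bool
}

struct IncomingPhotoGate {
    let classifier: SensitiveContentClassifier
    let parentalWarningsEnabled: Bool   // the feature is opt-in for families

    /// Returns true if the photo should be blurred and a warning shown to the
    /// child (and, depending on settings, the parents notified).
    func shouldWarn(for imageData: Data) -> Bool {
        parentalWarningsEnabled && classifier.isLikelySensitive(imageData)
    }
}
```

The key point Apple emphasizes is that the classification happens on the device itself, so the content of the messages is never readable by Apple.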
The second feature applies to photos stored online in users’ iCloud Photos library. Apple says:
iOS and iPadOS will use new applications of cryptography to help limit the spread of CSAM online, while designing for user privacy. CSAM detection will help Apple provide valuable information to law enforcement on collections of CSAM in iCloud Photos.
The screening of iCloud Photos images happens on-device, matching against cryptographic hashes of known CSAM content. Only after a certain threshold of matches is reached does a human review process begin; if that review confirms the matches, the account is disabled and a report is made to the National Center for Missing and Exploited Children. The feature will be US-only at first.
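Apple's documentation describes a considerably more involved cryptographic design than simple set lookup, but the core "match against known hashes, escalate only past a threshold" idea can be sketched roughly. The names and values below are illustrative assumptions, not Apple's code:

```swift
import Foundation

// Illustrative sketch of threshold-based matching against a database of known
// hashes. Apple's actual system layers cryptographic protocols on top of this;
// the structure, names, and numbers here are hypothetical.

struct HashMatcher {
    let knownHashes: Set<String>   // hashes derived from known CSAM (supplied by NCMEC in practice)
    let reviewThreshold: Int       // number of matches required before human review is triggered

    /// Counts how many of an account's photo hashes appear in the known set.
    func matchCount(in photoHashes: [String]) -> Int {
        photoHashes.filter { knownHashes.contains($0) }.count
    }

    /// Only accounts whose match count crosses the threshold would be
    /// escalated to human review.
    func exceedsThreshold(photoHashes: [String]) -> Bool {
        matchCount(in: photoHashes) >= reviewThreshold
    }
}

let matcher = HashMatcher(knownHashes: ["a1b2", "c3d4"], reviewThreshold: 30)
print(matcher.exceedsThreshold(photoHashes: ["a1b2", "ffff"]))  // false — below the threshold
```

The threshold is what keeps a single false match from triggering any action: no human ever looks at an account until the match count passes it.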
Finally, Apple announced that:
[it] is also expanding guidance in Siri and Search by providing additional resources to help children and parents stay safe online and get help with unsafe situations. For example, users who ask Siri how they can report CSAM or child exploitation will be pointed to resources for where and how to file a report.
Siri and Search will also intervene when CSAM-related search requests are made.
To better understand how Apple is implementing these features, it's worth visiting its new child safety webpage. At the bottom of the page are links to additional resources that explain the technology underlying the features.