The Legal Examiner
Grewal Law, PLLC

Apple has announced that it will use on-device algorithms to identify images of child sexual abuse on its users’ phones. The new policy was announced in partnership with the National Center for Missing and Exploited Children (NCMEC).

The new process scans images on a user’s device before the photos are uploaded to Apple’s iCloud storage. If a photo is identified as a match against the database of known Child Sexual Abuse Material (CSAM), a human reviews the match before Apple shuts down the user’s account and notifies the National Center for Missing and Exploited Children. Law enforcement is not notified.

The tool can identify only images that are already saved in the CSAM database. Apple reports that the chance of a photo being falsely identified as CSAM is one in one trillion.
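The matching step described above can be illustrated with a simplified sketch. Note that this is not Apple’s implementation: the real system uses a perceptual hash called NeuralHash (which matches visually similar images, not just identical files) combined with cryptographic techniques, and the hash function and database entries below are hypothetical placeholders.

```python
import hashlib

# Hypothetical stand-in for a perceptual hash. Apple's actual system
# uses NeuralHash; SHA-256 is used here only to keep the sketch simple.
def image_hash(image_bytes: bytes) -> str:
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical database of hashes of known images (placeholder values).
known_hashes = {
    image_hash(b"known-image-1"),
    image_hash(b"known-image-2"),
}

def scan_before_upload(image_bytes: bytes) -> bool:
    """Return True if the image's hash matches the known database.

    In the real protocol the device does not learn the result of the
    comparison; a cryptographic "safety voucher" is attached to the
    upload instead, and only Apple's servers can evaluate it.
    """
    return image_hash(image_bytes) in known_hashes

print(scan_before_upload(b"known-image-1"))   # a known image matches
print(scan_before_upload(b"vacation-photo"))  # a new photo does not
```

Because matching is against a fixed list of known hashes, a photo that is not already in the database cannot match, which is why the tool cannot discover new images.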

The new protocol still ensures end-to-end encryption and user privacy. Apple does not learn anything about photos that do not match the known CSAM database. Additionally, Apple cannot access metadata or any other identifying information specific to a photo until the iCloud Photos account reaches a certain threshold of matches to the known CSAM database.
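The threshold behavior can be sketched as follows. This is a minimal illustration only: in Apple’s protocol the rule is enforced cryptographically with threshold secret sharing rather than a plain counter, and the threshold value used here is a hypothetical placeholder.

```python
# Hypothetical threshold; Apple has not committed to a specific public
# number in this simplified form.
THRESHOLD = 30

def account_flagged(match_count: int) -> bool:
    """Return True once an account's match count reaches the threshold.

    Below the threshold, Apple learns nothing about the account's
    matches; in the real system this is a cryptographic guarantee,
    not a policy check like this one.
    """
    return match_count >= THRESHOLD

print(account_flagged(5))   # below threshold: nothing revealed
print(account_flagged(30))  # threshold reached: review can begin
```

The design choice matters: a single false match never exposes an account, which is how the protocol balances detection against the privacy guarantees described above.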

Critics of the new protocol cite user privacy and security concerns. However, groups advocating for protections for children from sexual abuse say the new measure balances the imperative for child safety with tech security.
