
Apple has announced that it will use algorithms to search for and identify images of child sexual abuse on its users' phones. The new policy was announced in partnership with the National Center for Missing and Exploited Children.
The new process will scan images on user devices before photos are uploaded to Apple's iCloud storage. If a photo is identified as a match against the database of known Child Sexual Abuse Material (CSAM), it is reviewed by a human before Apple shuts down the user account and notifies the National Center for Missing and Exploited Children. Law enforcement will not be notified.
The tool can identify only images that are already catalogued in the known CSAM database. Apple reports that the chance of an account being incorrectly flagged is one in one trillion per year.
According to Apple, the new protocol preserves end-to-end encryption and user privacy. Apple does not learn anything about photos that do not match the known CSAM database. Additionally, Apple cannot access a matched photo's metadata or any other identifying information until the iCloud Photos account reaches a certain threshold of matches.
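To make the threshold idea concrete, here is a minimal, hypothetical sketch in Swift. It is not Apple's implementation: a plain SHA-256 digest stands in for Apple's perceptual NeuralHash, and the known-hash set, review threshold, and photo data are invented placeholders. The point is only that nothing is surfaced for human review until enough matches accumulate.

```swift
import Foundation
import CryptoKit

// Hypothetical sketch of threshold-based matching (not Apple's code).
// SHA-256 stands in for Apple's perceptual NeuralHash; the hash set,
// threshold, and photo list below are placeholders, not Apple's values.
let knownCSAMHashes: Set<String> = ["<known-hash-1>", "<known-hash-2>"]
let reviewThreshold = 30

/// Hex digest of a photo's raw bytes (stand-in for a perceptual hash).
func digest(of photo: Data) -> String {
    SHA256.hash(data: photo).map { String(format: "%02x", $0) }.joined()
}

/// Count how many queued photos match the known-hash set.
func matchCount(in photos: [Data]) -> Int {
    photos.filter { knownCSAMHashes.contains(digest(of: $0)) }.count
}

let queuedForUpload: [Data] = []   // photos about to be uploaded to iCloud
if matchCount(in: queuedForUpload) >= reviewThreshold {
    print("Threshold reached: matched photos become visible for human review.")
} else {
    print("Below threshold: Apple learns nothing about these photos.")
}
```

In Apple's published design, the comparison is done with cryptographic techniques (private set intersection and threshold secret sharing), so neither the device nor Apple learns which individual photos matched until the threshold is crossed; the sketch above omits that cryptography for brevity.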
Critics of the new protocol cite user privacy and security concerns. However, groups that advocate for protecting children from sexual abuse say the new measure balances the imperative of child safety with technological security.

Kelly R. McClintock joined Grewal Law in 2019 to help establish a human trafficking litigation division and to assist Grewal's already successful practices, including sexual assault litigation and family law.