Apple has run into a backlash after announcing that it intended to scan iPhone photo libraries for images of abused children.
The system uses hashing and cryptographic techniques to match images against a database of known images held by the National Center for Missing and Exploited Children in the US.
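The core idea of hash matching can be sketched in a few lines. This is a deliberately simplified illustration using plain SHA-256 set membership, not Apple's actual system, which relies on a perceptual hash (NeuralHash) and a private set intersection protocol so that neither side learns about non-matches; the function names and sample data below are hypothetical.

```python
import hashlib

def digest(image_bytes: bytes) -> str:
    """Reduce raw image bytes to a fixed-length hex digest."""
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical database of digests of known images. In the real system
# this would be derived from NCMEC's database, and digests would come
# from a perceptual hash robust to resizing and re-encoding.
known_hashes = {digest(b"known-image-1"), digest(b"known-image-2")}

def is_match(image_bytes: bytes) -> bool:
    """True only if the candidate image's digest is in the known set."""
    return digest(image_bytes) in known_hashes

print(is_match(b"known-image-1"))  # matches a known digest
print(is_match(b"holiday-photo"))  # unrelated image, no match
```

Because only digests are compared, the matching side never needs the original images; the trade-off is that an exact hash like SHA-256 misses even trivially altered copies, which is why perceptual hashing is used in practice.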
Apple said: “CSAM [child sexual abuse material] detection will help Apple provide valuable information to law enforcement on collections of CSAM in iCloud photos.”
The feature is coming later this year in updates to iOS 15, iPadOS 15, and macOS Monterey.
The plan has been criticised over privacy concerns.
Former National Security Agency computer security expert Edward Snowden said on his Twitter feed: “No matter how well-intentioned, Apple is rolling out mass surveillance to the entire world with this.
“Make no mistake: if they can scan for kiddie porn today, they can scan for anything tomorrow. They turned a trillion dollars of devices into iNarcs – without asking.”
Apple claims that its technology is more private than the systems Google and Microsoft use to detect illegal child abuse images on their servers. It also said governments cannot force it to add non-CSAM images to the system.