Apple has announced details of a system to find child sexual abuse material (CSAM) on the devices of US customers.
Before an image is saved to iCloud Photos, the technology will check it for matches against known CSAM.
Apple said that if a match is found, a human reviewer will evaluate it and report the user to law enforcement.
However, there are privacy concerns that the technology could be extended to scan phones for banned content or even political speech.
Experts worry that the technology could be used by authoritarian governments to spy on their citizens.
Apple said that new versions of iOS and iPadOS, due to be released later this year, will include “new applications of cryptography to help limit the spread of CSAM online, while designing for user privacy.”
The system works by comparing the images to a database of child sexual abuse images compiled by the US National Center for Missing and Exploited Children (NCMEC) and other child safety agencies.
These images are translated into numeric codes, or “hashes,” that can be matched against images on an Apple device.
Apple says the technology will capture edited but similar versions of the original images.
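Apple has not published its matching algorithm, but the idea of a hash that survives minor edits can be illustrated with a toy “average hash”: each pixel contributes one bit, set when it is brighter than the image's mean, so a uniform brightness change leaves the hash unchanged. The images and functions below are invented for illustration only.

```python
def average_hash(pixels):
    """Toy perceptual hash: one bit per pixel, set when the pixel
    is brighter than the image's overall mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return tuple(1 if p > mean else 0 for p in flat)

def hamming_distance(h1, h2):
    """Number of bits on which two hashes differ."""
    return sum(a != b for a, b in zip(h1, h2))

original = [
    [200, 200, 10, 10],
    [200, 200, 10, 10],
    [10, 10, 200, 200],
    [10, 10, 200, 200],
]
# A re-encoded, brightened copy: every pixel shifted up by 30.
edited = [[p + 30 for p in row] for row in original]
# An unrelated image with a different pattern.
other = [
    [10, 200, 10, 200],
    [200, 10, 200, 10],
    [10, 200, 10, 200],
    [200, 10, 200, 10],
]

# The edited copy still hashes identically; the unrelated image does not.
print(hamming_distance(average_hash(original), average_hash(edited)))  # 0
print(hamming_distance(average_hash(original), average_hash(other)))   # 8
```

Real perceptual hashes (and Apple's NeuralHash in particular) are far more sophisticated, but the principle is the same: visually similar images should produce identical or near-identical codes, while a cryptographic hash would change completely under any edit.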
“Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes,” Apple said.
The company claimed that the system has “an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account.”
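A per-account figure that low presumably rests on requiring several matches before an account is flagged, since thresholding drives the false-positive rate down exponentially. The sketch below makes that point with a binomial tail calculation; the per-image false-match rate, threshold, and photo count are all assumed numbers for illustration, not figures published by Apple.

```python
from math import comb

def prob_flagged(n, p, t, terms=30):
    """Probability that at least t of n independent photos falsely match,
    i.e. the binomial tail P(X >= t). The sum is truncated after `terms`
    terms, which is ample here because later terms are vanishingly small."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(t, t + terms))

per_image_fp = 1e-6       # assumed chance one innocent photo matches a hash
photos_per_year = 10_000  # assumed photos uploaded per account per year

# With no threshold (flag on a single match), a false flag is plausible;
# requiring 10 matches makes it astronomically unlikely.
print(prob_flagged(photos_per_year, per_image_fp, 1))
print(prob_flagged(photos_per_year, per_image_fp, 10))
```

Under these assumed inputs the single-match flagging probability is about one percent, while the 10-match probability falls far below one in a trillion, which is the shape of argument behind Apple's stated error rate.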
Apple says it will manually review each report to confirm there is a match. It can then take steps to disable a user’s account and report it to law enforcement.
The company says the new technology offers “significant” privacy benefits over existing techniques, since Apple only learns about users’ photos if they have a collection of known CSAM in their iCloud Photos account.
However, some privacy experts have expressed concerns.
“Regardless of what Apple’s long-term plans are, they’ve sent a very clear signal. In their (very influential) opinion, it is safe to build systems that scan users’ phones for prohibited content,” said Matthew Green, a security researcher at Johns Hopkins University.
“Whether they turn out to be right or wrong on that point hardly matters. This will break the dam: governments will demand it from everyone.”