- Apple plans to roll out software on US iPhones that will detect child sexual abuse images, the Financial Times reported.
- Human reviewers would then alert law enforcement if they think the images are illegal.
- Security experts warn that this could open the floodgates to extensive surveillance.
Apple is planning to roll out software that will scan photos on US iPhones for images of child sexual abuse, the Financial Times reported on Thursday.
Apple could announce more about the software in the coming week, according to the report, which cited security researchers familiar with Apple’s plans.
The software, reportedly called neuralMatch, is designed to look through images stored on iPhones and uploaded to iCloud storage. According to the Financial Times, if the software detects child sexual abuse imagery in a photo, it will pass the material on to human reviewers, who will alert law enforcement if they think the images are illegal.
However, security experts warned that such scanning could expand well beyond searching for child sexual abuse images.
“Whether they turn out to be right or wrong on that point hardly matters. This will break the dam – governments will demand it from everyone,” Matthew Green, a cryptographer at Johns Hopkins University, said on Twitter.
— Matthew Green (@matthew_d_green), August 5, 2021
An Apple spokesperson did not immediately respond to Insider’s request for comment, and the company declined to comment to the Financial Times.
Apple makes privacy a selling point, at times frustrating law enforcement
This new software, if implemented, would likely please law enforcement and government agencies, but risks potential backlash from privacy activists. Apple has made privacy features a cornerstone of its marketing in recent years, advertising that “what happens on your iPhone stays on your iPhone.”
But there are limits to this promise, and tradeoffs. Apple already monitors images sent from Apple devices for child abuse imagery, using a technique called “hashing,” and alerts law enforcement when the algorithm and an Apple employee detect suspected child abuse material. It also cooperates with law enforcement on lawful requests for information.
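The hash-matching approach described above can be illustrated with a deliberately simplified sketch: an image's bytes are reduced to a fingerprint, which is checked against a database of fingerprints of known abuse material. Production systems such as Microsoft's PhotoDNA use perceptual hashes that survive resizing and re-encoding, not the plain cryptographic hash shown here, and the database contents and names below are purely illustrative.

```python
import hashlib

# Hypothetical database of fingerprints of known abuse images,
# of the kind maintained by child-safety organizations.
# (This entry is just the SHA-256 of b"foo", for illustration.)
KNOWN_IMAGE_HASHES = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def hash_image(image_bytes: bytes) -> str:
    """Return a hex-digest fingerprint of the raw image bytes."""
    return hashlib.sha256(image_bytes).hexdigest()

def flag_for_review(image_bytes: bytes) -> bool:
    """Flag an image for human review if its fingerprint is known."""
    return hash_image(image_bytes) in KNOWN_IMAGE_HASHES
```

Note that a match is only an automated signal: under the process Apple describes, a human reviewer still decides whether material is reported to law enforcement.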
“Our legal team reviews requests to ensure that the requests have a valid legal basis,” Apple writes on its website. “If they do, we comply by providing data responsive to the request. If a request does not have a valid legal basis, or if we consider it to be unclear, inappropriate, or overly broad, we challenge or reject the request. We report on the requests every six months.”
In the past, Apple has resisted government agencies’ requests for the company to install a back door that would allow law enforcement to access encrypted messages. New York City police and prosecutors have criticized Apple’s encryption technology for aiding criminals in hiding information from law enforcement.
Other tech companies like Facebook have also been caught between protecting users' privacy and requests from law enforcement and government agencies. Government officials in multiple countries have criticized Facebook's encryption of its Messenger service for making it more difficult to detect content depicting child sexual exploitation.
Researchers told the Financial Times that Apple’s decision could pressure other companies into implementing similar kinds of monitoring and could later expand into monitoring of images beyond child sexual abuse, like anti-government signs held at protests.