Apple is delaying a controversial plan to scan users’ photos for child pornography after widespread outcry from privacy and civil liberties advocates.
The tool, which Apple calls “NeuralHash” (early reports referred to it as “neuralMatch”), is designed to scan images on Apple users’ devices before they’re uploaded to iCloud. The company also said it planned to scan images sent and received in children’s Messages accounts for sexually explicit content.
After Apple announced the effort in August, privacy advocates hit back at the company.
The Electronic Frontier Foundation racked up more than 25,000 signatures on a petition against the tool, while the American Civil Liberties Union said in a letter that the tool would “censor protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children.”
Critics say the tool could easily be misused by repressive governments to track and punish users for all kinds of content — not just child pornography. Some have pointed to Apple’s seemingly accommodating relationship with the Chinese government as evidence that the company would allow the tool to be abused.
Now, Apple appears to be listening to its critics.
“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material,” Apple said in a statement to multiple media outlets. “Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”
It’s unclear when the company plans to release the features or what changes will be made.
Apple has said that the tool will only flag images that match a database of known child pornography, meaning that, for example, parents’ photos of their children bathing would not be flagged.
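In simplified terms, that database-matching approach amounts to a membership check: a fingerprint (“hash”) of each photo is compared against a list of fingerprints of known abuse images, and only exact matches are flagged. The sketch below illustrates the idea only — the hash values and function names are invented, and Apple’s actual system uses a perceptual hashing technique it calls NeuralHash combined with cryptographic protocols, not a plain lookup like this:

```python
def flag_if_known(photo_hash: str, known_hashes: set[str]) -> bool:
    """Flag a photo only if its hash appears in the known-image database.

    A brand-new photo (e.g., a parent's own picture) produces a hash that
    is not in the database, so it is never flagged by this check.
    """
    return photo_hash in known_hashes


# Hypothetical database of fingerprints of known images.
known = {"deadbeef01", "cafebabe02"}

print(flag_if_known("deadbeef01", known))  # matches the database
print(flag_if_known("0123456789", known))  # a novel photo: no match
```

This is why, as described above, only copies of already-catalogued images can trigger a match.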