Apple will scan all U.S. iPhones for illegal child abuse imagery, prompting privacy concerns
Apple plans to scan all iPhones in the U.S. for potential child abuse imagery.
The move announced Thursday generated shock waves among security experts who say it could allow the company to surveil many millions of phones for reasons unrelated to images of child abuse.
"This sort of tool can be a boon for finding child pornography in people's phones. But imagine what it could do in the hands of an authoritarian government," tweeted Johns Hopkins professor and cryptographer Matthew Green.
The Big Tech company says the new scanning technology will be a part of iOS 15, which is set for release this month. It is part of a number of new child protection programs being initiated by the company that will "evolve and expand" over time.
In a blog post, the company wrote, "This innovative new technology allows Apple to provide valuable and actionable information to [the National Center for Missing and Exploited Children] and law enforcement regarding the proliferation of known CSAM" – an acronym for child sexual abuse material.
"And it does so while providing significant privacy benefits over existing techniques since Apple only learns about users’ photos if they have a collection of known CSAM in their iCloud Photos account. Even in these cases, Apple only learns about images that match known CSAM."
The tech giant says it will use breakthrough cryptography and AI to locate abusive material stored in iCloud Photos. Users' images will be matched against a database of known illegal images, and a review by the company will be triggered once a certain number of matching images is uploaded. If those images are deemed illegal, the company will report them to the National Center for Missing and Exploited Children.
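The matching-and-threshold logic described above can be sketched roughly as follows. This is an illustrative toy only: Apple's actual system uses a perceptual hash (NeuralHash) compared via cryptographic private set intersection, and the threshold value and function names here are hypothetical, not Apple's.

```python
# Toy sketch of threshold-based hash matching (NOT Apple's real protocol).
# Apple matches perceptual hashes privately on-device; here we use plain
# set membership to illustrate the reporting logic only.

REVIEW_THRESHOLD = 30  # hypothetical match count before human review


def count_matches(photo_hashes, known_csam_hashes):
    """Count how many of a user's photo hashes appear in the known database."""
    return sum(1 for h in photo_hashes if h in known_csam_hashes)


def needs_review(photo_hashes, known_csam_hashes, threshold=REVIEW_THRESHOLD):
    """Flag an account for human review only once matches reach the threshold."""
    return count_matches(photo_hashes, known_csam_hashes) >= threshold
```

The key design point the quote in the next paragraph alludes to is the threshold: a single accidental match reveals nothing, and the company learns about an account's photos only when the number of matches against known material crosses the limit.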
"The reality is that privacy and child protection can coexist," said John Clark, president and CEO of the National Center for Missing and Exploited Children, who called the company's new program a "game-changer."