
Apple plans to scan US iPhones for child abuse imagery

The iPhone 2020 range. From left to right: iPhone 12 Pro Max, iPhone 12 Pro, iPhone 12, iPhone SE and iPhone 12 mini.

Apple intends to install software on U.S. iPhones to search for images of child abuse, according to people briefed on its plans, alarming security researchers who warn it could open the door to surveillance of the personal devices of millions of people.

Apple detailed its proposed system – known as “neuralMatch” – to some US academics earlier this week, according to two security researchers briefed on the virtual meeting. The plans could be made public more widely as early as this week, they said.

The automated system would proactively alert a team of human reviewers if it believes illegal imagery has been detected; the reviewers would then contact law enforcement if the material could be verified. The program will initially be rolled out only in the United States.

Apple declined to comment.

The proposals are Apple’s attempt to strike a compromise between its own promise to protect customer privacy and continued demands from governments, law enforcement and child safety activists for more help in criminal investigations, including terrorism and child pornography.

The tension between law enforcement and tech companies such as Apple and Facebook, which have championed the growing use of encryption in their products and services, has only intensified since the iPhone maker clashed with the FBI in 2016 over access to the iPhone of a terrorism suspect following a shooting in San Bernardino, California.

Security researchers, while supportive of efforts to fight child abuse, fear Apple could be allowing governments around the world to seek access to their citizens’ personal data, potentially far beyond its original intent.

“This is an absolutely appalling idea, because it will lead to distributed mass surveillance of … our phones and laptops,” said Ross Anderson, professor of security engineering at the University of Cambridge.

Although the system is currently trained to detect child sexual abuse, it could be adapted to scan for other targeted imagery and text, for example terrorist beheadings or anti-government signs at protests, the researchers said. Apple’s precedent could also increase pressure on other tech companies to use similar techniques.

“It will break the barrier – governments will demand it from everyone,” said Matthew Green, a security professor at Johns Hopkins University, who is believed to have been the first researcher to tweet about the issue.

Alec Muffett, a security researcher and privacy activist who previously worked at Facebook and Deliveroo, said Apple’s move was “tectonic” and “a huge, regressive step for individual privacy.”

“Apple is walking back privacy to enable 1984,” he said.

Cloud-based photo storage systems and social networking sites already scan for images of child abuse, but that process becomes more complicated when it extends to data stored on a personal device.

Apple’s system is less invasive in that the screening is done on the phone, and “only if there is a match is a notification sent back to those searching,” said Alan Woodward, professor of computer security at the University of Surrey. “This decentralized approach is about the best approach you can adopt if you go down this route.”

Apple’s neuralMatch algorithm will continuously scan photos stored on a US user’s iPhone that have also been uploaded to its iCloud backup system. Users’ photos, converted into a string of numbers through a process known as “hashing”, will be compared with those in a database of known child sexual abuse images.
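How the matching works has not been published in detail; reports describe a perceptual hash (“NeuralHash”) that tolerates minor edits such as resizing and recompression. The Python sketch below shows only the general shape of the idea, an on-device hash checked against a database of known-image hashes; it substitutes an ordinary cryptographic hash, which matches only byte-identical files, and every name in it is invented for illustration.

```python
import hashlib
from pathlib import Path

# Hypothetical stand-in for the database of hashes of known abuse images.
# A real deployment would ship this as an opaque, vendor-supplied blob.
KNOWN_IMAGE_HASHES: set[str] = {
    "placeholder-hash-1",
    "placeholder-hash-2",
}

def image_fingerprint(path: Path) -> str:
    """Convert a photo into a fixed-length string of numbers.

    SHA-256 is used here purely for illustration: a cryptographic hash
    matches only byte-identical files, whereas the reported "NeuralHash"
    is perceptual and also survives resizing and recompression.
    """
    return hashlib.sha256(path.read_bytes()).hexdigest()

def matches_known_image(path: Path) -> bool:
    # On-device check: is this photo's fingerprint in the known database?
    return image_fingerprint(path) in KNOWN_IMAGE_HASHES
```

In the reported design, a perceptual hash would take the place of image_fingerprint so that trivially edited copies still match; the database lookup itself stays the same.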

The system was trained on 200,000 images of sexual abuse collected by the United States National Center for Missing and Exploited Children.

According to those briefed on the plans, every photo uploaded to iCloud in the US will be given a “safety voucher” saying whether or not it is suspect. Once a certain number of photos are marked as suspect, Apple will decrypt all the suspect photos and, if they appear to be illegal, pass them to the relevant authorities.
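Neither the threshold nor the voucher format has been disclosed; under the reported design, Apple cannot read a voucher’s contents until the number of matches crosses the threshold. The sketch below models only the counting logic, with an invented threshold value and voucher structure.

```python
from dataclasses import dataclass

# Invented threshold; Apple has not published the real value.
REVIEW_THRESHOLD = 10

@dataclass
class SafetyVoucher:
    photo_id: str
    matched: bool  # did the photo's hash match the known-image database?

def should_trigger_review(vouchers: list[SafetyVoucher]) -> bool:
    """Flag an account for human review only once enough uploads match.

    In the reported design the match flags are cryptographically hidden
    from Apple below the threshold; this sketch models only the counting.
    """
    flagged = sum(1 for v in vouchers if v.matched)
    return flagged >= REVIEW_THRESHOLD
```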

© 2021 The Financial Times Ltd. All rights reserved. Must not be redistributed, copied or modified in any way.
