Apple responds to critics of CSAM scan plan with FAQs - says it'd block governments subverting its system

Apple's announcement last week that it will soon be scanning photos on iPhones and iPads that sync to iCloud for child sexual abuse material (CSAM) prompted pushback from thousands of security and privacy professionals and a response from the company that attempts to mollify its critics.

The iDevice biz revealed two child safety initiatives that are initially being rolled out in the US and later in other countries, depending on regulatory approval.

One is a system to alert children and their parents when the Messages app sends or receives pictures deemed explicit (but not necessarily illegal) by an on-device machine learning algorithm.

The other is a system to scan photos on iOS and iPadOS devices that sync to iCloud Photos, checking whether the hashes (identifiers) of on-device images match hashes of known CSAM (which is illegal); Apple will use matches to flag iCloud accounts for cancellation and reporting to the National Center for Missing and Exploited Children (NCMEC).
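
Conceptually, the matching flow Apple describes looks roughly like the sketch below, written in Python purely for illustration. The real system uses Apple's NeuralHash perceptual hash and a private set intersection protocol rather than a plain SHA-256 lookup, and the hash list, photo data, and threshold here are all placeholder assumptions.

```python
import hashlib

# Hedged sketch only: Apple's system uses a perceptual "NeuralHash" and a
# cryptographic private-set-intersection protocol, not a plain SHA-256 lookup.
# Every value below is an illustrative assumption.

# Hashes of known CSAM images (supplied by NCMEC in the real system).
known_hashes = {hashlib.sha256(b"known-illegal-image-bytes").hexdigest()}

FLAG_THRESHOLD = 5  # assumed; accounts are only flagged past some match threshold


def image_hash(image_bytes: bytes) -> str:
    """Stand-in for the on-device hashing step."""
    return hashlib.sha256(image_bytes).hexdigest()


def match_count(photos: list[bytes]) -> int:
    """Count photos whose hashes appear on the known-CSAM list."""
    return sum(1 for p in photos if image_hash(p) in known_hashes)


if match_count([b"holiday-photo", b"cat-photo"]) > FLAG_THRESHOLD:
    print("flag account for human review and reporting to NCMEC")
```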

Apple published a technical summary [PDF] of its systems and said that for its CSAM scheme there's only "a one in one trillion chance per year of incorrectly flagging a given account." It, however, provided no way to verify that figure, according to Princeton professor Jonathan Mayer.
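
Apple's figure describes the chance of an account, rather than a single photo, being wrongly flagged, which depends on how a per-image false-match rate combines with a match threshold. The sketch below shows one way such a number could be modelled under a simple binomial assumption; the false-match rate, photo count, and threshold are invented for illustration, since Apple has not published the parameters behind its claim.

```python
from math import exp, lgamma, log

# Every number below is an assumption; Apple has not published its parameters.
per_image_fp = 1e-3       # assumed false-match rate for a single photo
photos_per_year = 10_000  # assumed number of photos an account syncs per year
threshold = 40            # assumed match count before an account is flagged


def log_binom_pmf(k: int, n: int, p: float) -> float:
    """log P(X = k) for X ~ Binomial(n, p), computed in log space to avoid overflow."""
    return (lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)
            + k * log(p) + (n - k) * log(1 - p))


# P(at least `threshold` false matches in a year) under this toy model.
p_flag = sum(exp(log_binom_pmf(k, photos_per_year, per_image_fp))
             for k in range(threshold, photos_per_year + 1))
print(f"Modelled chance of wrongly flagging an account per year: {p_flag:.3e}")
```

With these invented inputs the toy model happens to land on the order of one in a trillion, which underlines Mayer's point: the headline figure depends entirely on parameters Apple has not disclosed.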

Went about as well as you'd expect

The announcement [PDF] elicited a swift reaction in the form of an open letter opposing the move for its potential harm to privacy and security.

"Apple's current path threatens to undermine decades of work by technologists, academics and policy advocates towards strong privacy-preserving measures being the norm across a majority of consumer electronic devices and use cases," said the letter, which currently lists more than 6,000 signatures. "We ask that Apple reconsider its technology rollout, lest it undo that important work."

Technical experts, privacy advocates, academics, and others have spent the past weekend debating the issue via social and online media. Pretty much everyone agrees that CSAM is a problem.

The question is whether a company that has said, "Privacy is a human right" – even if it doesn't offer that in China – should be attempting to tackle child safety by running its own scanning code – with consent obtained via the decision to use iCloud Photos rather than explicit permission – on its customers' devices.

Alex Stamos, director of the Stanford Internet Observatory and former CSO of Facebook, attempted to stake out the middle ground between those horrified by Apple's approach and those who would give the fight against CSAM priority over any other considerations.

"In my opinion, there are no easy answers here," wrote Stamos in a

Twitter thread

, insisting it's okay to have nuanced opinions on these issues. "I find myself constantly torn between wanting everybody to have access to cryptographic privacy and the reality of the scale and depth of harm that has been enabled by modern comms technologies."

He said he's happy to see Apple finally take some responsibility for the impact of its massive platform but is also frustrated with its approach. "They both moved the ball forward technically while hurting the overall effort to find policy balance."

Stamos has speculated this system could allow Apple to introduce end-to-end encryption for iCloud backups by preempting the inevitable concern about CSAM that would come up if it did so. Apple, however, has not publicly stated any intention to deploy full iCloud encryption.

All down to Apple

On Monday, Eric Rescorla, CTO of Mozilla, published a technical analysis of Apple's system that suggests the security of the company's CSAM scanning effort depends on Apple behaving in a trustworthy manner. "It's important to realize that there's nothing in the system that prevents Apple from scanning photos that never leave the device; they've just chosen not to do so," he wrote.

Apple's "child safety" initiative represents a major shift for the company that just a few years ago cited the importance of "personal safety" by rejecting the FBI's request that it modify its software

to decrypt the iPhone of a terror suspect

.

In an open letter to its customers in 2016, Apple explained its defense of privacy by stating, "we believe the contents of your iPhone are none of our business."

"Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation," the company said. "In the wrong hands, this software – which does not exist today – would have the potential to unlock any iPhone in someone’s physical possession."

Starting with the forthcoming operating system updates iOS 15, iPadOS 15, watchOS 8, and macOS Monterey, the contents of your iPhone will be Apple's business if you sync images to iCloud.

Security experts are concerned Apple's system will allow government authorities to demand that the company add non-CSAM image hashes to its detection list to ferret out photos deemed unacceptable for political, religious, or other reasons unrelated to child safety.

Five years ago, Apple said the government could be expected to demand that sort of technical intervention in a legal filing [PDF] opposing the FBI's request to modify its software.

"Here, if Apple is forced to create software in this case, other law enforcement agencies will seek similar orders to help them hack thousands of other phones, as FBI Director Comey confirmed when he said he would 'of course' use the All Writs Act to 'return to the courts in future cases to demand that Apple and other private companies assist . . . in unlocking secure devices,'" explained the company's legal representatives.

Would Apple cave?

Yet in the FAQs published on Monday, Apple attempts to allay concerns that authorities could demand access to its CSAM system for other surveillance purposes by stating that the company would simply resist. "Could governments force Apple to add non-CSAM images to the hash list?" Apple asks, and then answers, "Apple will refuse any such demands."

Legal experts have not been impressed. "So basically: all that stands between users and governments demanding [the addition] of non-CSAM images to the hash list is Apple's firm refusal?" said Elizabeth Joh, law professor at UC Davis, via Twitter.

To which Daphne Keller, Platform Regulation Director at the Stanford Cyber Policy Center and former Associate General Counsel at Google, replied: "Speaking as someone who has litigated and lost on this exact issue in three countries (UK, Germany, France), I feel confident in saying the firm refusal to filter for new things beyond CSAM doesn’t mean much in the face of state power."

Indeed, when China directed Apple to enforce its ban on VPN software, the company complied.

But there's a more basic concern about Apple's use of its customers' devices in a way that might be turned against them: ownership and control. Ben Thompson, a business analyst who writes the Stratechery blog, described Apple's approach as a mistake.

"One’s device ought to be one’s property, with all of the expectations of ownership and privacy that entails; cloud services, meanwhile, are the property of their owners as well, with all of the expectations of societal responsibility and law-abiding which that entails," he wrote.

"It’s truly disappointing that Apple got so hung up on its particular vision of privacy that it ended up betraying the fulcrum of user control: being able to trust that your device is truly yours." ®
