
EU plans to police child abuse raise fresh fears over encryption and privacy rights


Internet service providers and communications companies could face fines of up to 6% of their turnover if they fail to comply with court orders requiring them to electronically scan communications and social media to identify possible child abuse.

A draft regulation due to be released by the European Commission today will require tech companies to deploy algorithms to identify child abuse images and attempts at grooming on communications and social media services.

The proposals have raised concerns among civil society groups, some MEPs and technologists that they will weaken end-to-end encrypted services, such as WhatsApp and Signal, undermining the privacy rights of innocent people.

MEP and civil rights activist Patrick Breyer described the proposed regulation as a step towards Chinese-style state surveillance. “With its plans to break secure encryption, the EU Commission is putting the overall security of our private communications and public networks, trade secrets and state secrets at risk to please short-term surveillance desires,” he said.

Voluntary action has failed

A leaked draft of the regulation, expected to be published today by the commissioner for home affairs, Ylva Johansson, argues that although some tech companies use technology to detect, report and remove online child abuse, voluntary measures have failed to eradicate the problem.

“Measures taken by providers vary widely – with the vast majority of reports coming from a handful of providers – and a significant number take no action,” the document states.

The draft regulation introduces targeted detection orders that will legally require communications and social media companies to introduce “automated detection technology” to identify possible child abuse, or risk heavy fines.

Under the proposals, communications and social media companies can be ordered to install technology to compare photographs on their services against databases of known child abuse images.

They may also be required to deploy algorithms that can identify previously unseen child abuse images and to detect possible grooming attempts by scanning the content of personal communications.
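The first of these mechanisms, matching photos against databases of known images, typically relies on perceptual hashing. The sketch below is a minimal illustration of that idea only: the 64-bit hash values, the threshold and the function names are placeholder assumptions, not any real system's data or API.

```python
# Minimal sketch of matching photos against a database of known-image
# hashes. Assumes each image has already been reduced to a 64-bit
# perceptual hash; the hash function itself is out of scope here, and
# the database entries below are invented placeholders.

KNOWN_HASHES = {
    0x9F3A5C7E12B4D608,  # placeholder hash of a known image
    0x4410FFAA90356721,
}

def hamming_distance(a: int, b: int) -> int:
    """Count the differing bits between two 64-bit hashes."""
    return bin(a ^ b).count("1")

def matches_known_image(photo_hash: int, threshold: int = 8) -> bool:
    """Flag a photo whose hash is within `threshold` bits of a known hash.

    A small threshold tolerates re-encoding or resizing while limiting
    false positives; the value 8 is an assumption for illustration.
    """
    return any(hamming_distance(photo_hash, h) <= threshold
               for h in KNOWN_HASHES)
```

Unlike exact cryptographic hashing, perceptual hashes are designed so that visually similar images produce nearby hashes, which is why a distance threshold, rather than an equality check, is used.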

The leaked draft regulation acknowledges that scanning emails or text messages to detect grooming is highly intrusive.

But it states that detection technologies for grooming have reached a “high degree of accuracy”, with Microsoft claiming an 88% success rate. “The technology used does not ‘understand’ the content of the communications, but rather looks for known, pre-identified patterns that indicate potential grooming,” it says.

Any messages or images that appear to indicate child abuse will be reported for manual review, according to the proposal.
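As a rough illustration of that scan-flag-review pipeline, the sketch below matches messages against pre-identified patterns and queues hits for human review. The patterns, threshold and function names are invented for illustration and stand in for the far more sophisticated trained classifiers the draft refers to.

```python
import re
from collections import deque

# Hypothetical pre-identified patterns; a deployed system would use
# trained models rather than literal regexes.
PATTERNS = [
    re.compile(r"\bhow old are you\b", re.IGNORECASE),
    re.compile(r"\bdon'?t tell your parents\b", re.IGNORECASE),
]

review_queue: deque[str] = deque()

def scan_message(text: str, min_matches: int = 1) -> bool:
    """Queue a message for manual review if enough patterns match.

    The scanner does not 'understand' the text; it only counts hits
    against pre-identified patterns, as the draft describes.
    """
    hits = sum(1 for p in PATTERNS if p.search(text))
    if hits >= min_matches:
        review_queue.append(text)  # human reviewers make the final call
        return True
    return False
```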

Agency oversight

The leak reveals plans by the European Commission to create an agency, the EU Centre on Child Sexual Abuse (EUCSA), to oversee the policy by 2029.

The EUCSA will have close links to the European police agency, Europol, which will provide it with HR, IT and cyber security services, as well as access to experts.

The EUCSA will also play a key role in developing technologies to detect child sexual abuse material (CSAM), which will be made available free of charge to technology companies.

Its role will be to assess the algorithm-generated reports of abuse submitted by internet and communications service companies and to determine whether they merit further investigation.


Those that do will be passed to Europol and law enforcement authorities in member states to take action.

Age verification

The draft regulation requires service providers to carry out a risk assessment and introduce technology and operational measures to reduce the risks of online child abuse.

According to leaked documents, this includes introducing age verification and age assessment technologies to identify children using online services.

App stores, which feature on Android and Apple mobile phones and desktop computers, will be required to vet apps that could be used for grooming and to restrict children’s access to them.

Detection orders

Courts will be able to serve detection orders on tech companies, requiring them to introduce abuse detection technology and other safeguards where there is a significant risk of a service being used for abuse.

Providers will not be required to use any specific technology but will have the right to install and operate detection technologies made available by the EUCSA free of charge.

EU member states will have new powers to carry out on-site inspections of tech companies to ensure they are complying with the planned regulation. Employees of tech companies can also be required to disclose any failures.

Courts will be able to temporarily restrict users’ access to any services that fail to comply with the regulation.

Threat to end-to-end encryption

Campaign group Access Now wrote to Ylva Johansson this week urging the European Commission not to use the proposed regulation to undermine end-to-end encryption (E2EE).

The letter follows widespread concern that the draft regulation could effectively mandate client-side scanning (CSS), a technology that automatically scans the contents of messages for illegal content before they are encrypted.

“In circumventing E2EE, client-side scanning enables third parties to discern the contents of any text message or media file. This undermines the rights to privacy, data protection, security and free expression, and violates human rights,” the letter said.
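To make the mechanism concrete: in client-side scanning, the check runs on the device, against the plaintext, before encryption is applied. The minimal sketch below assumes a hash-based blocklist; the function names and parameters are illustrative, not any vendor's API.

```python
import hashlib

# Hypothetical blocklist of digests of known illegal content.
BLOCKLIST: set[str] = set()

def send_message(plaintext: bytes, encrypt, transmit, report) -> None:
    """Scan, then encrypt: the check runs on the plaintext, so the
    strength of the encryption is irrelevant to what the scanner sees."""
    digest = hashlib.sha256(plaintext).hexdigest()
    if digest in BLOCKLIST:
        report(digest)  # a third party learns something about the content
    transmit(encrypt(plaintext))  # message is still E2E-encrypted in transit
```

Because the scan precedes encryption, the usual E2EE guarantee, that only the sender and recipient can learn anything about a message, no longer holds for anything the scanner matches.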

It would also contravene the 2020 Council of the European Union resolution on encryption, which found that encryption plays a key role in protecting individuals, industry and government by ensuring the privacy, confidentiality and integrity of communications and personal data.

Research has shown that client-side scanning results in false positives, wrongly singling out the communications of innocent people for scrutiny by law enforcement, it said.

Rejo Zenger, policy advisor at the Dutch lobbying group Bits of Freedom, wrote in a blog post that the proposed regulation would require the mass monitoring of communications and was incompatible with EU law.

“Enforcing that providers of communications services continuously look over the shoulders of their users is simply a ‘general monitoring obligation’. That is contrary to European rules and, sooner or later, European judges will declare such a law invalid,” he said.

Coercive measures

European Digital Rights (EDRi) said the proposed “regulation laying down rules to prevent and combat child sexual abuse” will incentivise service providers to take the most intrusive measures possible to avoid severe legal penalties.

Ella Jakubowska, policy advisor to EDRi, said the commission’s proposals would inevitably lead to tech companies introducing client-side scanning technologies, which would make mobile phones vulnerable to attack from malicious actors – or may incentivise them to abandon encryption entirely.

Once CSS is in widespread use, repressive governments may put service providers under pressure to broaden CSS algorithms to identify political dissidents, protest organisers, whistleblowers, journalists and civil rights defenders.

“The European Commission is opening the door for a vast range of authoritarian surveillance tactics,” she said. “Today, companies will scan our private messages for CSAM content. But once these methods are out there, what’s stopping governments forcing companies to scan for evidence of dissidence or political opposition tomorrow?”

Computer Weekly reported in October 2021 that 14 of the world’s top computer scientists had warned that plans by Apple to introduce client-side scanning to detect child abuse on its products were dangerously flawed, citing multiple ways that states, malicious actors, or targeted abusers could turn the technology around to cause harm to others or society.

“CSS neither guarantees efficacious crime prevention nor prevents surveillance. The effect is the opposite... CSS, by its nature, creates serious security and privacy risks for all society,” they said in a scientific paper published by Columbia University.

Apple has since postponed its plans.
