EU push for child safety stalls as ePrivacy derogation expires, age verification app is hacked and CSA Regulation remains stuck in trilogue



Summary: Europe’s efforts to protect children online have collided with its own privacy architecture. The ePrivacy derogation allowing voluntary CSAM scanning expired on April 3 after Parliament rejected its extension by a vote of 311-228, the EU’s new age verification app announced on April 15 was hacked in less than two minutes, and the CSA Regulation (“Chat Control”) remains in trilogue with a July deadline. The ECtHR has ruled that encryption backdoors violate fundamental rights, while the GDPR, the DSA and the proposed CSA Regulation each require knowing whether a user is a child, which in turn requires collecting exactly the data about children that privacy law restricts.

On April 3, the European Parliament voted 311 to 228 against extending the temporary derogation from the ePrivacy Directive. The derogation allowed platforms such as Meta, Google and Microsoft to voluntarily scan private messages for child sexual abuse material without breaching EU privacy law. With its expiry, those scans no longer have any legal basis. Twelve days later, the European Commission announced a new privacy-preserving age verification app designed to protect children online. Researchers cracked it in less than two minutes. Between the lapsed law and the broken software lies the whole problem: Europe wants to protect children from online exploitation, but every tool it develops to do so collides with the privacy architecture it has spent a decade building. The result is a regulatory system at war with itself, in which the mechanisms needed to find abused children require exactly the collection of data about children that EU law prohibits.

The scanning gap

The ePrivacy derogation was introduced as a stopgap in 2021. The European Commission had proposed the Child Sexual Abuse Regulation, known formally as the CSA Regulation and informally as Chat Control, which would require platforms to detect and report CSAM in private messages, including end-to-end encrypted ones. The regulation was supposed to replace the voluntary framework within three years. It didn’t happen. Trilogue talks between the Parliament, the Council and the Commission have been running since 2022, with the next meeting on May 4 and the goal of reaching a political agreement by July. Meanwhile, the grace period has expired. The National Center for Missing & Exploited Children in the US, which processes the majority of global CSAM reports, warned that the lapse would lead to a measurable drop in referrals from European platforms. Meta has confirmed that it has stopped voluntary scanning in the EU. Parliament’s position is that the derogation is incompatible with the fundamental right to confidentiality of communications. The position of child safety organizations is that Parliament has simply made it legal for platforms to ignore abusive material on their systems.

The CSA Regulation as proposed by the Commission would require platforms to act on detection orders issued by a new EU Center, scanning messages for known CSAM, new CSAM and grooming behavior. Parliament stripped out the most controversial elements: it rejected the scanning of end-to-end encrypted messages, limited detection to known material identified through hash-matching technology, and excluded real-time communications. The Council, steered by a rotating presidency that has pushed for stronger law enforcement access, wants broader scanning powers covering unknown material and grooming. The distance between the two positions is not a drafting detail. It is a fundamental disagreement over whether private communications can be systematically monitored to protect children, and the European Court of Human Rights has already signaled where it stands.
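
Hash matching, the one detection method Parliament’s position retains, is conceptually simple: the service computes a fingerprint of a file and checks it against a database of fingerprints of previously identified material, without interpreting the content itself. The sketch below is a minimal illustration in Python under assumed names; real deployments use perceptual hashes such as PhotoDNA, which survive re-encoding and cropping in ways plain cryptographic hashes do not.

```python
import hashlib

# Hypothetical digest list; under the proposal the database of indicators
# would be maintained by the EU Center. SHA-256 stands in for the
# perceptual hashes (e.g. PhotoDNA) that production systems actually use.
KNOWN_HASHES: set[str] = {hashlib.sha256(b"previously identified file").hexdigest()}

def matches_known_material(attachment: bytes) -> bool:
    """Return True if the attachment's digest appears in the hash database."""
    return hashlib.sha256(attachment).hexdigest() in KNOWN_HASHES

print(matches_known_material(b"previously identified file"))  # True: would trigger a report
print(matches_known_material(b"any other content"))           # False: never inspected further
```

The property that makes this the least intrusive option is visible in the code: a file that does not match is never analyzed beyond its digest, which is precisely what classifier-based detection of new material or grooming cannot promise.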

Encryption wall

In February, the ECtHR ruled in Podchasov v. Russia that requiring platforms to weaken or backdoor end-to-end encryption violates Article 8 of the European Convention on Human Rights, the right to respect for private life and correspondence. The decision concerned a Russian law compelling messaging services to provide decryption keys to the FSB, but its rationale applies directly to the detection orders proposed in the CSA Regulation. If a platform cannot scan encrypted messages without weakening the encryption, and if weakening the encryption violates fundamental rights, then the regulation cannot mandate what its authors intended it to mandate. Signal’s president, Meredith Whittaker, has said Signal would leave the EU rather than comply with any law requiring it to break its encryption protocol. Apple has removed Advanced Data Protection for users in the United Kingdom after the British government issued a technical capability notice under the Investigatory Powers Act demanding backdoor access to iCloud data. The encryption debate is no longer theoretical. Companies already make jurisdictional decisions based on where governments demand access to private communications.

The European Data Protection Board and the European Data Protection Supervisor have both warned that the CSA Regulation as drafted by the Commission is disproportionate and incompatible with EU fundamental rights. The EDPS specifically noted that client-side scanning, proposed as an alternative to breaking encryption because it scans content on the device before encryption, still constitutes mass surveillance, since it processes every message to identify the illegal ones. The distinction between pre-encryption and post-encryption scanning is technically meaningful but legally irrelevant if the result is that every private message is analyzed by an automated system. Parliament’s negotiating position reflects this analysis. The Council’s does not.
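
The EDPS objection is easiest to see in code. In a client-side scanning design the check runs on the device against the plaintext, before encryption, so the encryption protocol itself is never weakened, and yet every message still passes through an automated inspection. The sketch below is schematic; scan() and encrypt() are placeholders, not any real messenger’s API.

```python
from dataclasses import dataclass

@dataclass
class Outcome:
    ciphertext: bytes
    flagged: bool  # result of the pre-encryption inspection

def scan(plaintext: bytes) -> bool:
    """Placeholder classifier; a real system would match hashes or run a model."""
    return False

def encrypt(plaintext: bytes) -> bytes:
    """Placeholder for the messenger's E2EE layer (e.g. the Signal protocol)."""
    return plaintext[::-1]  # stand-in transform, not real cryptography

def send(plaintext: bytes) -> Outcome:
    flagged = scan(plaintext)        # every message is inspected here...
    ciphertext = encrypt(plaintext)  # ...before encryption ever happens
    return Outcome(ciphertext, flagged)
```

The encryption in this flow is intact end to end; the surveillance has simply moved upstream of it, which is the EDPS’s point.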

The age verification paradox

While the CSA Regulation stalls, individual member states have moved forward with age-based restrictions. France prohibits children under 15 from accessing social networks without parental consent. Spain has set the limit at 16. Greece will ban social media for under-15s from 2027. Austria’s limit is 14. Norway plans to ban social media for under-16s and is developing a national age verification system to enforce it. The acceleration of age restrictions across Europe has created a patchwork of national laws with no common enforcement mechanism, a problem the EU’s age verification app was meant to address.

The Commission’s app, announced on April 15, is designed to verify a user’s age with a zero-knowledge system that confirms someone is over a given age limit without revealing the user’s actual age to the platform, or passing on their date of birth, name or any other personal information. It was presented as a technical solution to the paradox of verifying age without collecting age data. Security researchers demonstrated that the app’s verification process could be bypassed within two minutes of its launch, undermining the credibility of a tool the Commission had put forward as proof that privacy-preserving child safety is technically feasible. The app was intended to demonstrate that the conflict between child protection and data minimization could be resolved through engineering. Its immediate failure demonstrated otherwise.
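
What the app promises is best understood as an interface: the platform receives a signed over-or-under attestation and nothing else, never a birth date or identity. The sketch below illustrates that interface only; the HMAC stands in for the issuer’s signature, the key and function names are invented for illustration, and the real app is specified to use zero-knowledge proofs, which this deliberately does not implement.

```python
import hashlib
import hmac
import json
from datetime import date

ISSUER_KEY = b"hypothetical-issuer-key"  # e.g. held by a national eID scheme

def issue_attestation(birth_date: date, threshold: int) -> dict:
    """Issuer side: sees the birth date, emits only a signed boolean claim."""
    age = (date.today() - birth_date).days // 365  # rough age, fine for a sketch
    claim = {"over": age >= threshold, "threshold": threshold}
    tag = hmac.new(ISSUER_KEY, json.dumps(claim).encode(), hashlib.sha256).hexdigest()
    return {"claim": claim, "tag": tag}

def platform_verify(attestation: dict) -> bool:
    """Platform side: learns whether the user is over the threshold, nothing more."""
    expected = hmac.new(ISSUER_KEY, json.dumps(attestation["claim"]).encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, attestation["tag"]) and attestation["claim"]["over"]

print(platform_verify(issue_attestation(date(2000, 1, 1), 18)))  # True
```

Even this toy version exposes the hard part: with a symmetric key the platform must share a secret with the issuer, and a zero-knowledge construction exists precisely to remove that trust.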

Legal conflict

The Digital Services Act, which came into full force in 2024, requires platforms under Article 28 to assess and mitigate systemic risks to minors, including exposure to harmful content, manipulation through interface design, and processing of personal data in ways that exploit children’s vulnerabilities. The DSA’s guidelines instruct platforms to implement age-appropriate protections but do not specify how platforms must determine a user’s age. The GDPR sets the age of digital consent at 16, with member states allowed to lower it to 13, and requires parental consent for processing children’s data below that threshold. GDPR fines increasingly target violations involving children’s data, and European regulators treat children’s privacy as an enforcement priority. But to apply age protections, platforms must first determine who is a child, and determining who is a child requires collecting or inferring personal information about every user, including adults who have the right not to be age-verified.

This is the circularity at the heart of the European child safety framework. The GDPR says you cannot process children’s data without enhanced protection. The DSA says you must protect children from harmful content. The CSA Regulation says you must detect abusive material in private messages. Each obligation requires knowing whether a given user is a child. Knowing whether a given user is a child requires processing their personal data. Processing their personal data to determine their age may violate the data minimization principle set out in the GDPR. The age verification app was supposed to cut this knot. It arrived broken. The ePrivacy derogation was supposed to buy time for the CSA Regulation. It expired without a replacement. The CSA Regulation was supposed to create a harmonized framework. It is stuck between a Parliament that will not accept mass surveillance and a Council that will not accept a regulation without scanning powers.

July target

Trilogue negotiators have set July as the deadline for a political agreement on the CSA Regulation. The compromise proposals circulating in Brussels would limit mandatory detection to unencrypted platforms and to known CSAM identified through hash matching, with a review clause that could expand the scope if the technology improves. Encrypted platforms would face an obligation to report when CSAM surfaces through user reports or metadata analysis, rather than through content scanning. The EU Center for the Prevention of Child Sexual Exploitation would coordinate cross-border cases and maintain the hash databases. Whether this compromise holds is uncertain. Law enforcement agencies across Europe have lobbied hard for wider scanning, arguing that encrypted messaging is a key distribution channel for abusive material and that its exclusion makes the regulation largely symbolic. Privacy advocates argue that once any mandatory scanning infrastructure is built, it will inevitably expand to other categories of illegal content, a slippery slope the ECtHR’s ruling in Podchasov was designed to prevent.
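
The compromise amounts to a routing rule: unencrypted services face mandatory hash matching against the EU Center’s database, while encrypted services report only on the basis of user reports or metadata, never content scans. The sketch below encodes that split; the names are illustrative and not taken from any draft text.

```python
from enum import Enum, auto

class Basis(Enum):
    HASH_MATCH = auto()   # content scan against the hash database (unencrypted only)
    USER_REPORT = auto()  # a recipient flagged the message
    METADATA = auto()     # traffic patterns only, no content access
    NONE = auto()

def detection_basis(e2e_encrypted: bool, user_reported: bool,
                    metadata_flagged: bool) -> Basis:
    """Which basis, if any, a report would rest on under the compromise."""
    if not e2e_encrypted:
        return Basis.HASH_MATCH
    if user_reported:
        return Basis.USER_REPORT
    if metadata_flagged:
        return Basis.METADATA
    return Basis.NONE
```

The law enforcement objection is visible in the last branch: on an encrypted service where no user reports anything and nothing in the metadata stands out, the answer is NONE.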

An honest assessment is that Europe has not resolved the tension between child safety and privacy, because the tension cannot be resolved through regulation alone. The tools of child protection, scanning messages for abusive material, verifying ages before granting access, monitoring interactions for grooming patterns, all require surveillance capabilities that EU law was built to prevent. Member states that have acted unilaterally with age bans have done so without a credible enforcement mechanism. The Commission’s age verification technology failed its first public test. Parliament killed the legal instrument that allowed voluntary scanning. And the regulation that is supposed to replace them all remains a document no one can agree on after four years of negotiation, because the two things it is trying to protect, the safety of children and the privacy of everyone, require opposite things from the same infrastructure.


