The Regulation to Prevent and Combat Child Sexual Abuse, nicknamed the “Chat Control” law, has ignited a fierce debate across the digital world. Supporters hail it as a vital step to protect children from exploitation online, while opponents warn it could usher in an era of mass surveillance. At the heart of the controversy is a difficult question: can Europe safeguard its children while still abiding by the fundamental right to individual privacy?
Child abuse rises both online and offline
Nowadays, child abuse, like many other crimes, isn’t limited to physical contact. The internet is an open window for everyone, including predators, who operate everywhere: on mainstream websites used by ordinary people, but also on the dark web, where hackers and criminals thrive. Child sexual abuse is a serious issue that has been plaguing the internet, and it’s no longer just about photos. It’s about manipulation and bribery, all of which has been made easier by AI, deepfakes, and many other tools.
Children are more vulnerable than adults, yet they, too, have been given access to the digital world, which makes them easy prey. According to a recent report from the Childlight Institute, “300 million children are affected every year by technology-facilitated abuse”.
First proposed in 2022 by then-EU Commissioner Ylva Johansson, the Regulation to Prevent and Combat Child Sexual Abuse (CSAR), now widely known as the Chat Control law, aims to better protect children from sexual abuse online. According to 2024 data from the Internet Watch Foundation (IWF), 291,273 reports were confirmed to contain child sexual abuse imagery, and 729,696 were considered criminal.
Chat Control Law: how the EU plans to protect children
The idea is to facilitate screening of digital communications, even in seemingly private environments protected by end-to-end encryption, like WhatsApp, Telegram, or Signal. How would it work? Authorities would use client-side scanning, which, according to the Electronic Frontier Foundation, “scans for specific content on a device before it’s sent. In practice, Chat Control is chat surveillance and functions by having access to everything on a device with indiscriminate monitoring of everything”.
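To make the mechanism concrete, here is a deliberately simplified sketch of what hash-based client-side scanning looks like in principle. It is an illustrative assumption, not the EU’s actual design: real systems use perceptual hashes (which tolerate image edits) rather than cryptographic ones, and the hash list, function names, and sample data below are all hypothetical.

```python
import hashlib

# Hypothetical list of known-illegal content hashes. In a real deployment,
# an authority or clearinghouse would supply and update this database.
KNOWN_ILLEGAL_HASHES = {
    # This example entry is simply the SHA-256 of empty input, used as a stand-in.
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def scan_before_send(attachment: bytes) -> bool:
    """Return True if the content may be sent, False if it is flagged.

    The crucial point: this check runs ON THE USER'S DEVICE, before
    encryption, which is why critics call it surveillance rather than
    a breach of encryption itself.
    """
    digest = hashlib.sha256(attachment).hexdigest()
    return digest not in KNOWN_ILLEGAL_HASHES

photo = b"holiday photo"
if scan_before_send(photo):
    print("sent")  # content leaves the device only after local inspection
else:
    print("flagged and reported")
```

The design choice under debate is exactly this ordering: because the match happens before end-to-end encryption is applied, the encryption itself stays mathematically intact while the privacy guarantee it was meant to provide does not.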
What governments are saying
A campaign website against the law showcases timelines and up-to-date voting information, making it useful if you want to stay on top of developments. As of October 21st, 12 member states supported the proposal, 9 opposed it, and 6 remained undecided. Germany has been voicing its disagreement, along with Poland and Slovenia. Slovenia’s Digital Transformation Minister, Ksenija Klampfer, considers the proposal “problematic” and a “disproportionate intrusion into the privacy of communication”.
Meanwhile, France has publicly advocated for its approval, with former Interior Ministers Bruno Retailleau and Laurent Nuñez stating that “police should be able to access private messages sent by French citizens”.
The latest version, presented by Denmark’s representative in October, introduced a few safeguards: “only images and links — not text messages — would be subject to scanning” and “the system would only be activated following a decision by an independent judicial or administrative authority”. The EU didn’t vote on the matter this time, though.
What communications companies are saying
Telegram’s CEO, Pavel Durov, warned users about the dangers of this legislation: “It was set to vote on a law that would force apps to scan every private message, turning everyone’s phone into a spying tool”.
Signal’s founder, Meredith Whittaker, also spoke against the law’s implementation, going as far as saying that Signal would leave the EU market if that ever happened: “Let there be no doubt: we will leave the EU market rather than undermine our privacy guarantees. (…) This proposal — if passed and enforced against us — would require us to make this choice.”
Ethereum co-founder Vitalik Buterin also spoke against Chat Control. Replying to an entrepreneur on X, he said one shouldn’t “make society secure by making people insecure. We all deserve privacy and security for our private communications without inevitably hackable backdoors”.
This has raised, and continues to raise, serious concerns about the fundamental right to individual privacy. Opponents argue that regulations should always weigh both sides: if, on one hand, scanning helps protect children by monitoring and flagging suspicious content, on the other, it can also be used to surveil people.
Violation of individual privacy: yes or no?
Once in force, the law would require online communications platforms to scan messages before they are sent, thus preventing threats from reaching the other end. Some things, however, don’t add up: for starters, the scanning technology the EU plans to use isn’t 100% accurate and can flag false positives (at an estimated rate of 50% to 75%).
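The scale problem behind those false positives is worth spelling out. The following back-of-the-envelope calculation uses entirely assumed numbers (the message volume, prevalence, and accuracy figures are illustrative, not official EU statistics), but it shows why even a seemingly accurate scanner produces mostly false alarms when the content it hunts for is rare.

```python
# Illustrative base-rate calculation; all figures below are assumptions.
daily_messages = 10_000_000_000   # hypothetical EU-wide daily message volume
abusive_rate = 1e-6               # assume 1 in a million messages is abusive
true_positive_rate = 0.99         # scanner catches 99% of abusive content
false_positive_rate = 0.001      # and wrongly flags 0.1% of innocent content

abusive = daily_messages * abusive_rate          # 10,000 abusive messages
innocent = daily_messages - abusive

true_flags = abusive * true_positive_rate        # ~9,900 correct flags
false_flags = innocent * false_positive_rate     # ~10,000,000 false flags

share_false = false_flags / (true_flags + false_flags)
print(f"{false_flags:,.0f} innocent messages flagged per day")
print(f"{share_false:.1%} of all flags are false positives")
```

Under these assumptions, roughly 99.9% of all flags would point at innocent messages, which is the arithmetic behind the warning that scanning would “flood police with false reports”.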
If platforms like WhatsApp and Signal are required to give governments access to everyone’s private messages, there’s really no way to determine what information they’re collecting, and if they’re actually only looking at suspicious content.
Jurist and activist Patrick Breyer believes the Chat Control law would make it easier for governments to surveil people, even without apparent justification. And this makes sense: even if authorities are only supposed to flag content related to child abuse, what stops them from going after other content? Those at risk could include journalists and minorities. Breyer also believes that implementing this legislation won’t stop criminals: “This is a Big Brother attack on our private messages — like the post office opening and scanning every letter,” he says. “It won’t stop criminals, but it will flood police with false reports and criminalize young people for consensual sexting. The fate of digital privacy in Europe now rests with just a few undecided governments.”
There are two big ironies to underline here:
– First, the legislation seems to contradict the path the EU is taking toward becoming a global leader in digital sovereignty, with regulations such as the NIS2 Directive, the AI Act, or the European Digital Identity Wallet Regulation. Democratizing the use of digital spaces requires trust, something undeniably linked to encryption. If this scanning method is approved, what guarantees it won’t go beyond chat service providers?
– Secondly, CSAR seems to exempt intelligence, police, and military personnel from scanning, which shows governments are, indeed, aware of the risks of surveillance and data leakage. Does that mean those risks only matter when state secrets are potentially affected?
What’s the way to go?
It’s not as if the EU has no measures in place to protect minors. Currently, interim norms allow service providers to voluntarily flag suspicious material; CSAR is meant to replace them.
We can all agree that children should, indeed, be shielded from abuse — both online and offline. Perhaps legislation like that recently approved in Australia and New Zealand would go a long way: both countries have restricted social media to users aged 16 and over, making these some of the strictest laws of their kind. This doesn’t protect younger kids from talking to abusers on communications platforms, but it does shield them from potentially harmful content on social media, such as videos inciting suicide and dangerous pranks.
Also, the discussion should go beyond sexual abuse, as there have also been incidents where kids perpetrated crimes with the online help of adults, for example. According to CBS News, at least six lawsuits have been filed in the USA against the popular gaming platform Roblox “on behalf of children who were allegedly groomed or exploited by adults on the gaming platform”.
Can Web3 play a role in this?
When it comes to Web3-based chat and communication apps, the type of software they’re built on may not matter, since the legislation applies to any provider offering those services. Web3-based apps could be useful in resisting such legislation and the scanning methods themselves, but that might also backfire: authorities could make those companies’ lives difficult by imposing sanctions or penalties.
Perhaps Web3 could be used in another way: applying encryption and smart contract principles to scan for suspicious content across chats without actually reading people’s conversations. Maybe it doesn’t have to be one or the other. Maybe there’s a way to help protect children from sexual abuse online without going through everything each person writes to friends and family.
• • •
About Integritee
Integritee is the most scalable, privacy-enabling network with a Parachain on Kusama and Polkadot. Our SDK solution combines the security and trust of Polkadot, the scalability of second-layer Sidechains, and the confidentiality of Trusted Execution Environments (TEE), special-purpose hardware based on Intel Software Guard Extensions (SGX) technology, inside which computations run securely, confidentially, and verifiably.
Community & Social Media:
Join Integritee on Discord | Telegram | Twitter | Medium | Youtube | LinkedIn | Website
Products:
L2 Sidechains | Trusted Off-chain Workers | Teeracle | Attesteer | Securitee | Incognitee
Integritee Network:
Governance | Explorer | Mainnet | Github
