The European Commission may have formally withdrawn the controversial Chat Control proposal, but the most ambitious digital surveillance project in the history of the European Union remains very much alive. Under the Danish presidency of the Council, the plan has returned in a new guise of “voluntariness”: governments or platforms that wish to do so will be able to deploy message-scanning systems in the name of “risk mitigation.” For privacy experts, this formula amounts to nothing less than opening the back door to mass surveillance in Europe.
Chat Control was originally presented as a measure to combat child abuse online. The idea: to mandate that messaging platforms—from WhatsApp and Telegram to Signal or Messenger—automatically scan all user messages, images, and videos for illegal content. Opposition was swift. Thousands of lawyers, engineers, and MEPs warned that such a measure violated the right to privacy, the presumption of innocence, and the very principle of proportionality enshrined in the EU Charter of Fundamental Rights.
Due to the lack of political consensus, the proposal was officially withdrawn this autumn. Yet the revised text introduced by Denmark keeps the same spirit intact. Under the concept of “voluntary implementation” and “risk mitigation measures,” platforms are given the option to activate scanning tools “to protect users.” In practice, this creates a legal loophole that allows private communications to be monitored without a judicial warrant.
In technical terms, this means introducing a backdoor: a client-side scanning mechanism that reads or analyzes messages on the user’s own device, before they are encrypted. End-to-end encryption is designed so that only the sender and the recipient can read what is sent. Pre-encryption scanning negates that protection entirely.
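To make the mechanism concrete, here is a minimal sketch in Python of where such scanning would sit inside a messaging client. Everything in it is invented for illustration: the function names, the toy hash blocklist, and the placeholder “encryption” step stand in for the perceptual-hash databases and real end-to-end protocols (such as Signal’s) that an actual deployment would use.

```python
import hashlib
from typing import Optional

# Illustrative blocklist: SHA-256 hashes of "known illegal" content.
# Real systems use perceptual hashes and far larger, opaque databases.
BLOCKLIST = {
    hashlib.sha256(b"example-banned-content").hexdigest(),
}

def scan_before_encryption(plaintext: bytes) -> bool:
    """Check the *unencrypted* message against the blocklist.

    The crucial point: this runs on the sender's device, before any
    end-to-end encryption is applied, so the content is inspected in the clear.
    """
    return hashlib.sha256(plaintext).hexdigest() in BLOCKLIST

def report_to_authority(plaintext: bytes) -> None:
    """Stand-in for forwarding flagged content outside the encrypted channel."""
    print("flagged and forwarded before encryption:", plaintext)

def encrypt(plaintext: bytes) -> bytes:
    """Placeholder for real end-to-end encryption; NOT actual cryptography."""
    return plaintext[::-1]

def send(plaintext: bytes) -> Optional[bytes]:
    if scan_before_encryption(plaintext):
        report_to_authority(plaintext)   # content leaves the end-to-end channel
        return None
    return encrypt(plaintext)            # only messages that pass the scan are encrypted

if __name__ == "__main__":
    send(b"hello")                     # passes the scan, gets encrypted
    send(b"example-banned-content")    # intercepted on-device, before encryption
```

The promise of end-to-end encryption is only as strong as the step immediately before it; once a scanner sits in that position, the guarantee that only sender and recipient can read the message no longer holds.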
Criminals won’t be the ones affected
Beyond the political debate, the real problem is technical. The criminals this law claims to target will remain untouched: anyone with basic computer skills can bypass these systems by using VPNs, the Tor network, or alternative messaging services outside the EU’s jurisdiction.
Chat Control does not target criminals—it targets the average citizen: those with little or no technical knowledge who use the most common apps and trust the system to protect them. In reality, the system will be watching them.
Artificial intelligence tools used for automated message scanning also suffer from high error rates—between 5% and 30% false positives, especially when analyzing images and videos. This means millions of legitimate conversations could be flagged, blocked, or even reviewed by mistake. And once such an infrastructure of surveillance exists, nothing prevents it from being expanded to other purposes: political control, censorship, or the simple mass collection of data.
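A rough back-of-the-envelope calculation shows why those error rates matter at scale. The message volume, the prevalence of genuinely illegal content, and the detection rate below are assumptions chosen purely for illustration; only the 5% figure comes from the lower end of the range cited above.

```python
# All numbers are illustrative assumptions, not official statistics.
messages_scanned_per_day = 1_000_000_000   # assumed volume of scanned items across the EU
false_positive_rate = 0.05                 # lower bound of the 5%-30% range cited above
prevalence = 1 / 100_000                   # assumed share of messages that are actually illegal
detection_rate = 0.90                      # assumed true-positive rate of the scanner

true_hits = messages_scanned_per_day * prevalence * detection_rate
false_alarms = messages_scanned_per_day * (1 - prevalence) * false_positive_rate

print(f"true hits per day:     {true_hits:,.0f}")       # roughly 9,000
print(f"false alarms per day:  {false_alarms:,.0f}")    # roughly 50,000,000
print(f"share of flags that are false: {false_alarms / (false_alarms + true_hits):.2%}")
```

Under these assumptions, roughly fifty million legitimate messages would be flagged every day, and well over 99% of all flags would be false, simply because genuinely illegal content is rare relative to the volume of ordinary conversation.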
Supporters of the measure insist it’s about “security.” But breaking encryption is like leaving your front door half-open and trusting that only “the good guys” will come in. Every backdoor or scanning system introduces new vulnerabilities that can be exploited by hackers, mafias, or foreign governments. In the name of security, the EU is undermining its own digital ecosystem.
Moreover, this so-called “voluntary” approach creates an additional risk: it encourages platforms to compete to appear “secure” by scanning even more, effectively privatizing surveillance. Under political and media pressure, private companies end up assuming functions that properly belong to the judiciary.
A tool against the ordinary citizen
At its core, Chat Control was never about child protection—it was about power. A system of mass supervision presented as a moral shield but designed to monitor the behavior of millions of citizens.
The comparison with gun control is inevitable. Tougher laws rarely deter determined killers; they only restrict those who follow the law. The same applies in the digital realm: those who want to commit crimes will continue doing so, while ordinary citizens lose their privacy, freedom, and trust in the system.
The real goal is not to stop crime but to tame the user—to keep everyone under constant observation “for the common good,” until people think twice before expressing an uncomfortable opinion. The question is no longer whether the European Union will adopt a law of mass surveillance, but when—and under what pretext.


