The European Parliament has finally reached an agreement on the Commission’s controversial ‘Chat Control’ proposal—originally conceived as a mass surveillance instrument and criticized as the “end of privacy in digital correspondence”—turning it into a “privacy-friendly” piece of legislation, the Parliament’s civil liberties (LIBE) committee announced at a press conference on Thursday, October 26th.
However, the position comes just as Home Affairs Commissioner Ylva Johansson, responsible for drafting the file, finds herself caught up in an extensive conflict-of-interest scandal, having failed to dispel MEPs’ concerns during her parliamentary hearing on Wednesday, October 25th.
As we have also reported extensively, the official purpose of the Commission’s ‘Child Sexual Abuse Regulation’ (CSAR) proposal was to curb the dissemination of child pornography by automatically monitoring all digital correspondence of European citizens using AI technology. Messages and pictures would be scanned for certain keywords and patterns, flagged, and sent to a central database for further inquiry and, if needed, prosecution.
Originally, the Commission aimed to make the regulation mandatory for all popular email and messaging apps, including those protected by end-to-end encryption (such as WhatsApp, Signal, or Telegram). It even wanted to require service providers to introduce age verification systems that would effectively exclude any anonymous use. However noble in purpose, the call for mandatory searches prompted critics all over Europe to label it an outrageous attack on the fundamental right to privacy.
A balanced approach
After months of hard-fought negotiations, “the spirit of compromise has prevailed” in the LIBE committee, Javier Zarzalejos (EPP), the rapporteur on the file, said on Thursday.
“There is no massive scanning or general monitoring of the web, no indiscriminate scanning of private communications, and no backdoors to weaken encryption,” Zarzalejos said. He explained that the Parliament limited detection orders to specific users already under suspicion, removed end-to-end encrypted services and all text messages from the scope of the measures, and dropped mandatory age-verification systems altogether.
The Commission’s original proposal was “highly divisive,” even among child protection agencies and survivors of child abuse, added MEP Patrick Breyer, a member of the European Pirate Party, whom The European Conservative has previously interviewed about Chat Control. “We decided to go for a new and consensual approach by removing the contested and problematic points, … and instead added more effective, court-proof, and rights-respecting measures.”
The MEPs also urged the Council to move forward with its own negotiations, while recognizing that this will not be easy, since member states still have the Commission’s original proposal on the table. Nonetheless, the Parliament is expected to confirm its position in a plenary vote next month and hopes to finalize the law by early next year.
Transparency at its best
As the parliamentary negotiations were ongoing, the Commission’s own position was undermined by an investigation published last month by Balkan Insight, stirring up an ever-growing scandal—dubbed ‘Chat Control Gate’—about high-level collusion behind the scenes. This prompted MEPs to call Home Affairs Commissioner Johansson, the official directly in charge of drafting Chat Control, for cross-examination in the Parliament.
“We have been transparent from the beginning,” Johansson insisted repeatedly during the hearing on Wednesday, barely giving any direct answer to the MEPs about the allegations brought forward. Instead, she stressed several times that it was important to remember that the legislation is about the protection of children.
But the evidence suggests that Chat Control was about anything but children. Based on documents finally released in response to freedom-of-information requests, leaked correspondence, and dozens of interviews, the investigation “connects the dots between the key actors bankrolling and organizing the advocacy campaign in favor of Johansson’s proposal and their direct links with the commissioner and her cabinet,” Balkan Insight wrote.
What emerges is a picture of certain stakeholders, AI firms, and advocacy groups with significant financial backing enjoying a questionable level of influence over the crafting of EU policy.
During the hearing, Apostolis Fotiadis, one of the journalists behind the investigation, told MEPs that he had had to submit a complaint to the European Ombudsman after his requests for the documents went unanswered for months, and that some of them remain undisclosed by the Commission to this day.
Furthermore, “Since the investigation was published, we experienced repeated attempts by the Commissioner [Johansson] to discredit it, characterizing it as ‘a collection of insinuations and conspiracy theories looking for a home,’” Fotiadis added. “It’s obvious that since no empirical fact in the investigation could be questioned, the decision was to discredit the investigation as a whole by questioning its foundations and its intentions.”
When asked by MEPs why she would not take the journalists to court if she describes the allegations as “disinformation,” Johansson said that would be unthinkable because, ironically, “we need a public debate.”
“A web of influence”
The available documents indicate that the Commission engaged with some very peculiar stakeholders during the proposal’s consultation period, many of which are child protection NGOs in name only and in fact make money by selling user-scanning AI technology.
Furthermore, one particular organization, an allegedly independent foundation called the WeProtect Global Alliance, appears to be at the center of the elaborate lobby network around the file. WeProtect, which was co-founded by the EU, the U.S., and the UK, received nearly €1 million in EU funds between 2020 and 2023 as a “central organization for coordinating and streamlining global efforts and regulatory improvements” in fighting child exploitation online.
In short, WeProtect’s job is to bring together and lobby dozens of stakeholders interested in getting this proposal passed, including governments, NGOs, tech firms, and law enforcement organizations—all of which have members sitting on the foundation’s board of directors.
But the most interesting member of the board is Labrador Jimenez, a Commission official, “who played a central role in drafting and promoting Johansson’s regulation, the same proposal that WeProtect is actively campaigning for with EU funding,” Balkan Insight wrote.
After being confronted with all this in Parliament, the Commissioner only repeated that everything had been done in line with the regulations and that WeProtect is a “special case” because it is an EU-founded organization—ignoring how the ‘foundation’ appears to serve as a cash-for-lobby hub for half a dozen other stakeholders that were not founded by the EU.
“So far, we don’t have clear answers on the allegations concerning [the] privileged access of AI companies and other stakeholders to the commissioner and staff drafting the regulation,” Green MEP Saskia Bricmont said, clearly annoyed after several rounds of non-answers from Johansson.
“Commercial interests or economic revenues should never govern and orient the decision-making,” Bricmont continued. “Particularly when it comes to influencing such sensitive legislation, that very vulnerable victims are concerned, and that other organizations having tried many times to warn you about the adverse effects of your proposal seem to have not been listened to as much in this case.”
Indeed, while several tech firms and lobby groups had preferential access to Johansson for most of the past three years, the Commission’s impact assessment largely ignored any scientific input from outside, despite cryptographers and security analysts warning countless times about the dangers of meddling with encrypted communication.
Johansson, however, insisted that she did listen to stakeholders with contrary opinions but chose to discard their input because they proposed methods other than scanning private messages. “That will not address the problem; that is part of the problem,” she told MEPs. “One way or another, this illegal content has to be detected.”
Microtargeting and shadowbans
Furthermore, when it became clear that many EU member states remained skeptical of the proposal in the Council, the Commission launched a Twitter/X ad campaign specifically targeting those countries. The campaign used misleading language, emotional blackmail, and deliberately skewed statistics suggesting high public support for Chat Control, while hiding the drawbacks and the objective polling numbers that would shatter the Commission’s narrative.
The controversial campaign, which was pulled a day after a Dutch journalist sent an access-to-documents request and is now under investigation for a possible breach of ethical conduct, also employed “microtargeting” to ensure that people who care about privacy (those following Julian Assange, for example), Eurosceptics (following Nigel Farage or Viktor Orbán), or even Christians (for no apparent reason) were excluded from its reach.
Incidentally, the journalist, Danny Mekic, was suspiciously shadowbanned on X right after exposing the campaign. “I have no idea why [X] made this decision,” Johansson told MEPs after they repeatedly asked if she had a hand in that, suggesting they should ask Elon Musk instead.
The MEPs, however, were not impressed. “Let me congratulate you on your job as a monitoring influencer on the internet, using targeted disinformation, and putting pressure on reluctant member states,” Patrick Breyer commented, before asking Johansson bluntly, “Don’t you have any respect for democracy and the legislative procedure?”
“License to print money”
Needless to say, Johansson did not convince anyone in the room.
“The commissioner refused to give precise answers, which is unacceptable in her position,” French MEP Patricia Chagnon (ID) told The European Conservative after the hearing. “In politics, transparency is a moral duty,” she said. “The Commission is paid from public money; its dealings must also be public.”
For now, doubts about Johansson’s intentions with Chat Control will persist, but they won’t stop the Parliament’s much more privacy-friendly position from progressing further toward becoming law, MEPs insisted on Thursday.
“About the procurement of software, … we submitted some questions to Ms. Johansson yesterday about who exactly we will purchase it from,” MEP Cornelia Ernst (The Left) said. “I mean, this is a license to print money, practically,” she added, calling for total transparency before any decision is made.