Outgoing EU Ombudsman Emily O’Reilly has ruled that the way Europol handled a former official’s move to an AI company heavily involved in the drafting of the EU’s digital surveillance proposal, dubbed ‘Chat Control’ by critics, constitutes “maladministration”, which is EU-speak for a guilty verdict.
The ombudsman’s investigation was launched in response to a complaint submitted in October 2023 by former Pirate Party MEP Patrick Breyer, the EU Parliament’s most prominent critic of the proposal.
The probe targeted two former Europol officials who moved to the American lobby group Thorn (an NGO in name only), which develops and sells its AI products to governments. Thorn was one of the primary ‘stakeholders’ influencing the EU Commission’s legislative work on the so-called ‘Child Sexual Abuse Regulation’ (CSAR), or Chat Control, which proposed mandatory surveillance of all digital correspondence in the EU, including encrypted messages, to detect child sexual exploitation material (CSEM).
The ombudsman’s ruling details how, in the case of at least one of the former staffers, Europol failed to take the necessary precautions to avoid conflicts of interest. The official, Cathal Delaney, worked on Europol’s own AI pilot project on CSEM detection right up until moving to Thorn, a registered lobbyist, and went on to present the company’s product in meetings with Europol and the EU Commission.
“When a former Europol employee sells their internal knowledge and contacts for the purpose of lobbying personally known EU Commission staff, this is exactly what must be prevented,” Breyer commented on the ruling.
It’s easy to see why Thorn has “billions of reasons” to push for the adoption of the law, but Europol also has its own rationale. Freedom of information requests revealed that the agency was secretly pushing for unlimited access to all personal data harvested through the scanning process. Moreover, it lobbied against any regulatory boundaries on how the data could be used, meaning it could have used it to detect other types of crime as well as to train its own AI algorithms.
The scandal, dubbed ‘Chat Control Gate’, goes far deeper than that, however. A separate probe was opened by the EU Parliament against former Home Affairs Commissioner Ylva Johansson and her team over a shadowy network of EU-funded NGOs “that granted certain stakeholders, AI firms, and advocacy groups—which enjoy significant financial backing—a questionable level of influence over the crafting” of the original Chat Control legislation.
“Since the revelation of ‘Chat Control Gate,’ we know that the EU’s chat control proposal is ultimately a product of lobbying by an international surveillance-industrial complex. To ensure this never happens again, the surveillance lobbying swamp must be drained,” Breyer explained.
What’s more, the EU Commission was also caught up in a ‘micro-targeting’ scandal: it illegally advertised Chat Control to select groups of citizens in certain member states in the hope of pressuring their governments to adopt the legislation in the EU Council, and then silenced and intimidated the journalists who brought these highly unethical practices to light.
The ombudsman was unable to investigate the full extent of this scandal because the Commission continues to withhold several key documents to protect the “commercial interests” of the parties involved. Nonetheless, the revelations were enough for the EU Parliament to significantly water down its own version of Chat Control in late 2023: it dropped end-to-end encrypted messages and multimedia (such as pictures and videos) from the scope, proposed targeted scanning instead of blanket surveillance of all citizens, and removed the mandatory age verification from the draft.
But this one breakthrough in the Parliament doesn’t mean the digital privacy of EU citizens is safe. The final legislation will depend on the interinstitutional negotiations with the Commission and the Council, and the latter still hasn’t been able to agree on its own draft. Currently, there is no consensus among member states either to exclude encrypted messaging providers (such as WhatsApp or Signal) from the scope or to drop blanket surveillance in favor of targeting only those suspected of possessing child abuse material.