European Ombudsman Emily O’Reilly confirmed that her office will investigate a potential conflict of interest involving Europol and a U.S.-based non-profit that coordinates software developers under the banner of campaigning against child sexual exploitation.
The group presents itself as “Thorn: Digital Defenders of Children”, yet it stands to benefit from its role in shaping forthcoming EU laws on child sexual exploitation, according to an article published in Euractiv on Monday, January 8th. Thorn has been involved in high-level EU consulting since at least 2020. Critics such as Arda Gerkens, the director of EOKM (Europe’s oldest hotline for reporting online child abuse), have described it as a tech company ‘pretending’ to be an NGO.
Among the targets of the ombudsman’s probe are two former Europol officials who now work for Thorn, both under scrutiny following an official complaint submitted by MEP Patrick Breyer of the European Pirate Party (interviewed here). The ombudsman has opened an inquiry into the Europol-Thorn connection while waiting for the European Commission to disclose its own communications with the organization.
When the European Commission invited this celebrity-led group to advise on the forthcoming legislation as a primary stakeholder, one proposed technical solution was to deploy artificial intelligence (AI) to scan every European user’s digital correspondence for child sexual exploitation material (CSEM) through an immense surveillance system. Claims of maladministration arose because of the identity of a key supplier of such technology: Thorn itself.
Europol also has its own reasons for backing the legislation. Freedom of information requests revealed that the agency was secretly pushing for unlimited access to all personal data harvested through the scanning process. Moreover, it lobbied against any regulatory limits on how the data could be used, which would have allowed it to detect all other types of crime and to train its own AI algorithms.
Opponents of the project include data watchdogs, who nicknamed the technology “Chat Control” and warned of “the end of privacy” in Europe. In response, the European Parliament decided to dilute the legislation, seeking some balance between child protection and the fundamental right to privacy, although vigilance remains necessary.
Pressure grew further after investigative journalists at Balkan Insight uncovered signs of a major conflict of interest.
In a series of lengthy analyses, the outlet drew on the small share of documents disclosed after months of delays to describe a shadowy network around the Commission “that granted certain stakeholders, AI firms, and advocacy groups—which enjoy significant financial backing—a questionable level of influence over the crafting of EU policy.”
The Commission still refuses to release several key documents related to one of the biggest stakeholders, saying that “disclosure would undermine the commercial interests of the organization.”
The ombudsman’s inquiry will delve into the close ties between Thorn, Europol, and the Commission.
“As a first step, I have decided that it is necessary to inspect certain documents held by Europol related to these post-service activities. I expect to receive these documents by 15 January 2024,” O’Reilly wrote to Breyer, promising to deliver updates on any progress.
Forcing the EU to treat Thorn as an AI supplier—not an NGO—would be a step in the right direction.