The first draft report on the EU’s upcoming Child Sexual Abuse Regulation (CSAR), a proposal critics have dubbed ‘Chat Control’, prompted a lively debate among the members of the civil liberties-focused LIBE committee on Wednesday, April 26th. At issue was whether the proposal strikes the right balance between effectively preventing the online dissemination of illegal material linked to child sexual abuse and respecting the fundamental right to privacy.
While the European People’s Party rapporteur tried to reassure his colleagues that the legislation would not breach personal privacy rights, some MEPs remained unconvinced. Those who spoke out against Chat Control not only criticized its implications for data privacy but also questioned its effectiveness in achieving what it set out to accomplish.
The Commission’s original proposal indicated that the purpose of the CSAR was to curb the dissemination of child sexual exploitation material (CSEM) by automatically monitoring all digital correspondence of European citizens. Messages and pictures would be scanned for certain keywords, then flagged and sent to a central database for further inquiry and, if needed, prosecution. By making such scanning mandatory for all email and messaging services (including WhatsApp, Signal, and Telegram), the proposal would essentially eliminate anonymous use.
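To make concrete what this kind of blanket scanning would involve, here is a deliberately simplified sketch in Python of the keyword-based flagging pipeline the proposal describes. Everything in it is hypothetical: the keyword list, the reporting endpoint, and the function names are placeholders, and any real detection system would be far more sophisticated than plain substring matching.

    # Hypothetical, heavily simplified sketch of client-side message scanning.
    # The keyword list and reporting endpoint are illustrative placeholders.

    FLAGGED_KEYWORDS = {"placeholder_term_a", "placeholder_term_b"}
    REPORTING_ENDPOINT = "https://central-database.example/report"  # hypothetical

    def scan_message(sender_id: str, text: str) -> dict | None:
        """Return a report for the central database if the message matches."""
        hits = [kw for kw in FLAGGED_KEYWORDS if kw in text.lower()]
        if not hits:
            return None  # message was still inspected, just not reported
        return {
            "sender": sender_id,      # the sender is identified, not anonymous
            "matched_terms": hits,
            "content": text,          # full content forwarded for further inquiry
            "endpoint": REPORTING_ENDPOINT,
        }

Even this toy version shows why critics speak of the end of anonymous use: every message is inspected on the user’s side, and a match forwards identifying data and content to a central authority regardless of context.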
The members of the European Parliament are in complete agreement that online child sexual exploitation is a real issue that must be addressed effectively, but opinions on what approach to take differ widely.
As a step in the right direction, last week’s report introduced a series of amendments to the proposal, some of which were clearly aimed at addressing the more problematic privacy concerns. One of the amendments recommends limiting the scope of detection orders to certain segments of each service, such as specific chat rooms.
According to critics, however, this is still not enough to prevent innocent people from being monitored or to keep the system from being flooded with false positive reports. Another amendment makes the proposal even more questionable by recommending that users not be notified when they are flagged for monitoring on suspicion of being linked to CSEM.
Nothing to Worry About
MEP Javier Zarzalejos (EPP/Spain), the rapporteur of the file, opened the debate by claiming it is irresponsible to even challenge this legislation by framing it as anything other than an honest attempt to help children.
“Labelling this proposal as Chat Control is a regrettable way to play down the importance of our debate,” Zarzalejos said, adding, “I think it’s seriously misleading and it’s the irresponsible distortion of all this—that’s what this is about.”
Addressing the privacy concerns raised earlier, Zarzalejos tried to reassure his colleagues that the legislation runs no risk whatsoever of violating the privacy of the general population. As he explained:
There is no one-size-fits-all approach … but nuanced, specific mitigation measures for each service, using technologies that will have to be cleared by the competent authorities. Technologies that will operate without having access to the content of the communication, and without weakening the integrity of the encryption.
The detection orders and mitigation measures are limited in time and decided upon by the relevant judicial authority [while also being] limited to an identifiable part or component of a service, such as a specific channel of communication or a specific group of users.
After adding that the CSAR has the opportunity to set global standards in combating CSEM, as EU legislation has done many times before, Zarzalejos finished his speech by saying that
What is illegal offline, shall be illegal online too. And I think this is the underlying duty [of the Parliament], to make the internet a safe space for children.
Alternative Views
While most MEPs wholeheartedly welcomed the proposal, some expressed serious dissatisfaction with where the text stands now and presented alternative solutions and recommendations.
MEP Cornelia Ernst (The Left/Germany) said that Parliament should make sure that the amendments designed to protect data privacy “are not only cosmetics” and sharply criticized the idea that users would not be notified if they fell under surveillance.
Moritz Körner, a German MEP from Renew, also spoke out against potential privacy abuses and drew attention to a major pitfall that could seriously hinder the proposal’s overall effectiveness. Körner explained:
A study we commissioned shows that the proposal will not be effective in preventing child sexual abuse because the technology used to detect grooming and abusive materials is not good enough. It is going to overwhelm authorities with thousands of false positive reports every single day, actually [leading to] fewer [meaningful] investigations.
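Körner’s point about false positives is, at bottom, a base-rate problem, and a rough back-of-the-envelope calculation makes it tangible. The numbers below are assumptions chosen purely for illustration, not figures from the study he cites:

    # Illustrative base-rate arithmetic; every figure here is an assumption,
    # not data from the study Körner references.

    messages_per_day = 1_000_000_000   # assumed daily message volume scanned
    abusive_rate = 1e-6                # assumed fraction of messages that are abusive
    false_positive_rate = 1e-5         # assumed error rate on innocent messages

    true_reports = messages_per_day * abusive_rate
    innocent_messages = messages_per_day - true_reports
    false_reports = innocent_messages * false_positive_rate

    print(f"Genuine reports per day: {true_reports:,.0f}")   # ~1,000
    print(f"False reports per day:   {false_reports:,.0f}")  # ~10,000

Under these assumptions, even an error rate of one in a hundred thousand buries each genuine report under roughly ten false ones, which is the mechanism behind the warning that authorities would be overwhelmed and meaningful investigations crowded out.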
The most serious criticism came from Green MEP Patrick Breyer, who began by saying that Chat Control is “unprecedented in the free world,” since it manages to divide every group of stakeholders, including child protection organisations, political groups, and even the victims themselves. At every level, some would like to implement as much of it as possible, while others would rather see none of it pass, not only out of concern about mass surveillance but because they genuinely believe it would harm children.
Breyer therefore called for getting everyone on the same page first, “keeping only the parts of the proposal on what we all agree on and consensually adding new, meaningful approaches” in order to truly protect children.
But to do that, a whole new approach is needed, primarily in finding ways to leave the privacy of innocent Europeans untouched while also making the whole process more efficient.
We need to strictly limit the problematic scanning orders to persons presumably linked to CSEM. That’s the only way to avoid annulment in court and achieving nothing at all for children.
Generally, the draft report goes in the right direction, Breyer said, but it fails to exempt persons who have nothing at all to do with child sexual exploitation, as most independent experts recommended. The same goes for “turning our personal devices into scanners,” as well as the proposed metadata collection, which even the Commission recognized as unsuitable for identifying CSEM.
Instead, Breyer recommended a different path: addressing the issue at its roots rather than at the end user, and making the services “safe by design.”
Rather than trying and failing to block material via access providers or search engines, we should make it mandatory to remove CSEM at its source … It’s hard to believe that the proposal actually fails to mandate removal.
“This is not an easy issue,” Zarzalejos admitted in the end, expressing gratitude for the constructive criticism. He also reassured the MEPs that he and the shadow rapporteurs would take all the discussed points into account in further negotiations, trying to strike a balance between protecting children and putting in place the necessary privacy safeguards. In principle, he was even open to the suggestion of further narrowing the scope of detection orders.
Now all that remains is to wait for further amendment proposals from each party, the deadline for which is May 17th. If an agreement is reached in the committee after that, the legislation would go before the plenary by September and could be finalised in the Council by the end of the year.