EU Agrees on Digital Services Act

by Bridget Ryder

The European Union has finally reached an agreement on the long-awaited Digital Services Act. The European Commission announced that the Council of Ministers and the European Parliament had struck a political agreement on the legislation following drawn-out negotiations that ended in the early hours of April 23rd.

The agreed-upon text will still have to be formally adopted by both the Council and the Parliament, but the stage is set for new single-market legislation that will affect the practices of Google, Facebook, Twitter, and Amazon, as well as smaller internet platforms and other operators in the internet landscape.

It is expected to come into force by the end of the year.

“The DSA [Digital Services Act] will upgrade the ground-rules for all online services in the EU. It will ensure that the online environment remains a safe space, safeguarding freedom of expression and opportunities for digital businesses. It gives practical effect to the principle that what is illegal offline, should be illegal online. The greater the size, the greater the responsibilities of online platforms,” EU Commission President Ursula von der Leyen said in a statement.

The DSA will “apply to all digital services that connect consumers to goods, services, or content,” and includes “new procedures for faster removal of illegal content as well as comprehensive protection for users’ fundamental rights online,” the European Commission said in a press release summarising the draft law.

It regulates “intermediary services,” including internet providers and domain name registrars, and “hosting services,” including cloud services and web hosting. Social media, app stores, and online marketplaces are categorised as “online platforms” and divided by size. Very Large Online Platforms (VLOPs) are those reaching at least 45 million monthly active users in the EU—10% of its 450 million consumers. Micro and small enterprises form the lowest tier, and all other platforms fall between these two parameters. Search engines are also included in the scope of the law.

One of the key elements of the DSA is the set of additional requirements it imposes on VLOPs. The likes of Google, Facebook, and Amazon will have to conduct an annual risk assessment; failure to comply will be punishable by hefty fines. Euractiv reports that the assessments must cover the potential risk of disinformation, deceptive content, and revenge porn. The platforms must also show that they have implemented mitigation measures, and they will be subject to independent audits.

During the negotiations, a crisis mechanism was added for “the context of the Russian aggression in Ukraine and the particular impact on the manipulation of online information,” according to the European Commission. When triggered, it gives the EU power to analyse the impact of the activities of VLOPs and very large search engines on the crisis in question, and to decide on “proportionate and effective measures to be put in place for the respect of fundamental rights.” It can be activated by the European Commission on the recommendation of the board of national digital services coordinators with a simple majority vote. 

The crisis status automatically expires after three months and requires the Commission to report to Parliament and the Council on any actions taken during the crisis.

The draft of the new rules also delineates a process for flagging and removing illegal content. It creates the position of ‘trusted flaggers,’ who are essentially specialised EU bureaucrats. Platforms will have to provide a mechanism for users to flag illegal content and then work with the ‘trusted flaggers’ to follow up on user notifications.

Additionally, platforms will have to allow users to challenge “content moderation decisions” that remove or block content or users.

Platforms will also have to be more transparent about the algorithms they use to recommend content, explaining how they personalise what users see. VLOPs will additionally have to offer a recommendation option not based on profiling users.

Very large online marketplaces will have to take additional steps to verify and provide traceability for the traders selling through them. 

To ensure that search engines don’t slip through the cracks, the final text provides for a case-by-case assessment of the responsibilities of Google and other search engines for illegal content, to be clarified later by a legal review.

The final agreement also includes a ban on dark patterns: interface designs that trick users into doing things they did not intend, such as clicking through a page or adding items to a virtual cart. The DSA also gives consumers the right to seek compensation from online marketplaces for damages caused by dark patterns or by a platform’s lack of diligence in verifying traders.

Under the DSA, both individual countries and the EU will have supervisory and enforcement powers over online platforms. National authorities will supervise smaller platforms, and the Commission will oversee the large ones. To finance this added role, the EU will charge platforms a supervisory fee proportional to the size of the service, capped at 0.05% of its worldwide annual net income.

Micro and small enterprises are exempted from many obligations, such as the traceability of traders, notification of criminal offences, transparency requirements, and the provision of a complaint-handling system. If they grow into medium-sized companies, they will have a one-year grace period before the additional regulations apply.

Predictably, big tech companies have said nothing in reaction to the new regulations. 

Bridget Ryder is a Spain-based writer. She has written on politics, the environment, and culture for American and international publications. She holds degrees in Spanish and Catholic Studies.