No. 012 · Analysis
EUROPEAN POLITICS ANALYSIS

Europe's Disinformation Problem Has No Easy Fix

Two years after the EU's flagship regulation took effect, the information environment is more fragmented than ever.

The Digital Services Act was supposed to be Europe's answer to the disinformation crisis. Two years after its major provisions entered into force, a clear-eyed assessment is warranted.

What the DSA has achieved

The DSA has established a legal framework treating large platforms as infrastructure subject to public interest obligations. Mandatory risk assessments, researcher data access, and algorithmic accountability reports have produced evidence about how platforms operate that was previously unavailable to regulators.

The DSA has created the conditions for better enforcement. Whether that enforcement follows is a political question, not a legal one.

— Prof. Joris van Hoboken, Vrije Universiteit Brussel

What it has not done

The underlying economics of attention platforms remain unchanged. Platforms comply with the letter of DSA transparency requirements while continuing to operate recommendation systems that amplify divisive content.

FALSE

Claim: \"The DSA has reduced disinformation on European platforms by 40%.\"

Verdict: No such measurement exists. This figure appears in no official document.

How the DSA works

The DSA creates a tiered system of obligations for online platforms based on their size and the risks they pose. Very large online platforms (VLOPs), those with more than 45 million monthly active users in the EU, face the most demanding requirements: algorithmic audits, researcher data access, crisis protocols, and annual risk assessments. Smaller platforms face lighter obligations.
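As a rough illustration of that tiering, the sketch below classifies a platform against the 45 million user threshold cited above. It is a sketch only: the function name, tier labels, and obligation strings are this article's paraphrase, not the regulation's text.

```python
# Illustrative sketch of the DSA's size-based tiering. The 45 million
# EU monthly-active-user threshold is the figure cited in the article;
# the obligation lists are paraphrased, not quoted from the regulation.

VLOP_THRESHOLD_EU_MAU = 45_000_000  # "more than 45 million" triggers VLOP status

def dsa_tier(eu_monthly_active_users: int) -> dict:
    """Return a simplified obligation tier for a platform (hypothetical helper)."""
    if eu_monthly_active_users > VLOP_THRESHOLD_EU_MAU:
        return {
            "tier": "VLOP",
            "obligations": [
                "algorithmic audits",
                "researcher data access",
                "crisis protocols",
                "annual risk assessments",
            ],
        }
    return {"tier": "standard platform", "obligations": ["lighter baseline duties"]}

print(dsa_tier(50_000_000)["tier"])  # -> VLOP
print(dsa_tier(2_000_000)["tier"])   # -> standard platform
```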

DSA TIMELINE
2022
DSA adopted after 14 months of negotiation between Council, Parliament and Commission
2023
VLOP obligations begin: 19 platforms designated; algorithmic audits required
2024
Full application: all platforms covered; first enforcement actions launched
2025
First major fines: three platforms fined for DSA violations; appeals ongoing

Where enforcement has failed

The Commission has issued violation findings against three platforms and is investigating several more. But the enforcement process is slow (investigations that began in 2023 remain unresolved), and the fines, while nominally capped at 6% of global turnover, have not yet been levied at anything approaching that level. Platforms have learned that compliance theatre (publishing reports, creating trust and safety teams, commissioning audits) is sufficient to manage regulatory risk without fundamentally changing their business models.
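To give a sense of scale for that 6% ceiling, here is a back-of-the-envelope sketch; the turnover figure is invented purely for illustration and describes no actual platform.

```python
# Hypothetical arithmetic only: the DSA caps fines at 6% of global
# annual turnover. The turnover figure below is invented to show the
# scale of the ceiling, not any platform's actual finances.

FINE_CAP = 0.06                        # DSA ceiling: 6% of global turnover
global_turnover_eur = 100_000_000_000  # hypothetical €100bn annual turnover

max_fine_eur = FINE_CAP * global_turnover_eur
print(f"Theoretical maximum fine: €{max_fine_eur:,.0f}")
# -> Theoretical maximum fine: €6,000,000,000
```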

OVERSTATED
Claim check
"The DSA has significantly reduced disinformation on major platforms"
Platform transparency reports show modest reductions in state-sponsored manipulation networks. But independent researchers find that disinformation volume, particularly of AI-generated content, has increased significantly since 2023. The DSA addresses distribution mechanisms, not content creation.

The AI problem the DSA didn't anticipate

The DSA was designed for a world of human-generated content distributed at scale. It was not designed for a world of AI-generated content that is cheap to produce, difficult to detect, and can be targeted with unprecedented precision. Generative AI has changed the economics of disinformation: what previously required state-level resources can now be produced by a motivated individual with a €20 API subscription.

We wrote the DSA for the disinformation problem of 2020. By the time it came into force, the problem had already changed shape.

What would actually work

The evidence suggests that the most effective interventions against disinformation are not regulatory but epistemic: media literacy programmes, trusted source labelling, and friction that slows the sharing of unverified content. The one regulatory measure the research consensus supports is real algorithmic transparency beyond what the DSA currently requires. None of these interventions require the DSA. But none of them are as politically satisfying as the idea of large fines levied against American technology companies.

Regulation can constrain the infrastructure of disinformation. It cannot address the demand for it.