Europe's Disinformation Problem Has No Easy Fix
Two years after the EU's flagship regulation took effect, the information environment is more fragmented than ever.
The Digital Services Act was supposed to be Europe's answer to the disinformation crisis. Two years after its major provisions entered into force, a clear-eyed assessment is warranted.
What the DSA has achieved
The DSA has established a legal framework treating large platforms as infrastructure subject to public interest obligations. Mandatory risk assessments, researcher data access, and algorithmic accountability reports have produced evidence about how platforms operate that was previously unavailable to regulators.
The DSA has created the conditions for better enforcement. Whether that enforcement follows is a political question, not a legal one.
– Prof. Joris van Hoboken, Vrije Universiteit Brussel
What it has not done
The underlying economics of attention platforms remain unchanged. Platforms comply with the letter of DSA transparency requirements while continuing to operate recommendation systems that amplify divisive content.
Claim: "The DSA has reduced disinformation on European platforms by 40%."
Verdict: No such measurement exists. This figure appears in no official document.
How the DSA works
The DSA creates a tiered system of obligations for online platforms based on their size and the risks they pose. Very large online platforms (VLOPs), those with more than 45 million monthly active users in the EU, face the most demanding requirements: algorithmic audits, researcher data access, crisis protocols, and annual risk assessments. Smaller platforms face lighter obligations.
Where enforcement has failed
The Commission has issued violation findings against three platforms and is investigating several more. But the enforcement process is slow: investigations that began in 2023 remain unresolved, and the fines, while nominally capped at 6% of global turnover, have not yet been levied at anything approaching that level. Platforms have learned that compliance theatre (publishing reports, creating trust and safety teams, commissioning audits) is sufficient to manage regulatory risk without fundamentally changing their business models.
The AI problem the DSA didn't anticipate
The DSA was designed for a world of human-generated content distributed at scale. It was not designed for a world of AI-generated content that is cheap to produce, difficult to detect, and can be targeted with unprecedented precision. Generative AI has changed the economics of disinformation: what previously required state-level resources can now be produced by a motivated individual with a €20 API subscription.
What would actually work
The evidence suggests that the most effective interventions against disinformation are not regulatory but epistemic: media literacy programmes, trusted source labelling, and friction that slows the sharing of unverified content. None of these requires the DSA. But none is as politically satisfying as the prospect of large fines levied against American technology companies.