Policy researchers Minna Aslama Horowitz (NORDIS) and Madalina Botan (BROD) argue that the EU needs stronger, independent oversight of platforms to curb disinformation.
“Over the past twenty years, digital transformation has profoundly reshaped our information ecosystems,” write Madalina Botan and Minna Aslama Horowitz, arguing that control has shifted “from traditional journalism to a handful of dominant online platforms.” The result is a tangle of information disorders, from accidental misinformation to deliberate disinformation and harmful malinformation, that distorts debate on topics such as climate, war and migration, the two researchers write in a longer article published by Global Policy.
Against this backdrop, the EU has tried to push platforms towards responsibility. In 2018 it launched the Code of Practice on Disinformation, later reinforced and revised, and in 2025 the Code effectively moved into the EU’s regulatory terrain under the Digital Services Act. Yet a structural tension remains: platforms still control the data, design, and visibility of the measures meant to reduce manipulation, leaving “an accountability gap between regulatory ambition and operational reality.”
To test how the system works in practice, they point to an evidence-based benchmark from the European Digital Media Observatory (EDMO), which reviewed major platforms’ transparency reports for January to June 2024. The key question, they stress, is whether platform claims are verifiable and can be confirmed by external sources.
The picture that emerges is uneven. Platforms advertise media literacy campaigns, but the efforts often look “largely unaccountable, lacking transparency,” with little data on reach or results. Tools meant to help users identify disinformation are described, but impact is hard to judge when reporting is thin and evaluation rare, they write.
On research access, Horowitz and Botan highlight how “access to non-personal, anonymized data remains poorly implemented,” with selective programmes, vague criteria and weak governance. They single out Meta’s decision to end CrowdTangle as a blow to scrutiny, noting that its replacement offers only restricted access through a complex application process.
Fact-checking, another pillar, appears similarly opaque. Partnerships are cited, but platforms do not demonstrate the true effects of the cooperation, and they often leave out basic details on funding, methods and outcomes. The authors conclude that transparency remains fragmented and performative, and warn that without stronger enforcement the shift to DSA-linked co-regulation could become largely symbolic.
The conclusion from Horowitz and Botan is clear: “Sustained, verifiable, and inclusive oversight is essential if platform transparency and accountability are to match the scale and urgency of today’s information disorders.”
Madalina Botan (EDMO Hub BROD) is a senior researcher and Associate Professor at the National University of Political Studies and Public Administration, Bucharest.
Minna Aslama Horowitz (DECA Research Consortium/EDMO Hub NORDIS) is a Docent at the University of Helsinki, a researcher at the Nordic Observatory for Digital Media and Information Disorder (NORDIS), and a Fellow at the Media and Journalism Research Center (Estonia/Spain) and St. John’s University, New York.