
Risk Assessment a Good Practice for Curbing Disinformation? EU Candidate Advocates Still Say Yes
Too many laws attempting to regulate disinformation end up suppressing legitimate speech. A risk assessment and mitigation approach is the new way forward to curb mis- and disinformation from a policy perspective, argue the authors of this blog post.
This entry first appeared on Tech Policy Press on 26 May 2025.
A conference centre in Brussels was packed to the brim on 7 May 2025, when the European Commission hosted over 200 stakeholders for a workshop on assessment and mitigation of systemic risks as identified under the EU’s Digital Services Act (DSA). This first-of-its-kind legislation was created to reduce harms and bolster user protections online throughout the EU.
The DSA has raised high hopes among many seeking to hold Big Tech accountable with its special obligations for designated “Very Large Online Platforms” and “Very Large Online Search Engines” (VLOPSEs). Its systemic risk assessment and mitigation framework, in particular, has kept stakeholders busy on all sides. This European Commission workshop was just one of many occasions since the legislation came into effect where civil society representatives, academics, and these major tech companies have been invited to engage with each other.
The risk assessment and mitigation framework requires the biggest platforms and search engines in the EU to assess, at least once a year, the potential threats their services might pose to society. By addressing overarching systems and processes rather than specific cases, the exercise is considered more future-proof. The DSA has created a flexible and adaptable regulatory framework, able to evolve with technological advancements, that protects democratic processes from disinformation and other online threats.
Embracing the approach
While recognizing its potential flaws, we argue that a risk assessment and mitigation approach is the new way forward to curb mis- and disinformation from a policy perspective while also protecting human rights. Too many laws attempting to regulate disinformation have proven to inadvertently suppress legitimate speech through subjective definitions and, in the wrong hands, to intentionally silence dissent and chill press freedom under the guise of combating falsehood. We therefore choose to embrace this legislation as a promising EU initiative.
EU candidate countries have a particular incentive to align their regulatory landscapes with the DSA as a condition for accession. Most are taking a combination approach: developing their own DSA-like legislation or assessing national legislation's readiness for harmonization, while simultaneously building bridges of cooperation with VLOPSEs, relevant EU country-level bodies, and the European Commission to promote systemic risk mitigation.
The more democratic EU neighbors, such as Ukraine and Moldova, have come to understand the complexity of copy-pasting the existing regulation. In close cooperation with the European Commission, the two are in the throes of introducing DSA-like provisions without excessively harming political discourse, while also trying to ensure sufficient protection from Russian aggression.
On the other hand, in countries on the path towards authoritarianism, like Georgia, the adoption of DSA-like legislation could too easily be misused as a tool for censorship rather than platform accountability. This is where responsible proponents of free speech are instead getting creative with the law, scoping possibilities to take advantage of existing DSA provisions. If the case is built that information integrity risks outside the EU do indeed pose systemic risks within the Union, mitigation measures would be required from VLOPSEs under the DSA: a big win for digital rights advocates everywhere.
Motivation stemming from the East
Central to EU candidate advocates’ calls for a risk assessment and mitigation approach are regulatory blunders and blatant abuses of state power, such as Russia’s 2022 ‘fake news’ law, which criminalizes independent reporting following the country’s full-scale invasion of Ukraine. This troubling legislation is a potent example of how regulating disinformation on a content basis can gravely threaten civic discourse. Under the law, any contradiction of Russia’s official narrative describing the invasion as a “special military operation” is punishable by up to 15 years in prison; the European Court of Human Rights has since found it to violate freedom of expression.
By contrast, recognizing the pitfalls of identifying expressions that must be prohibited, the DSA answers the demands of modern times far better than the straightforward application of blocking measures. The DSA helps preserve online platforms as the agora for societally important discussions by requiring them to strengthen internal safeguards for freedom of expression. And while the DSA recognizes illegal content, the division of powers within its design bars the European Commission from drawing up new content rules, thereby preventing ad hoc regulation.
The DSA is indeed a second-generation law—one that promotes transparency, platform accountability, and multi-stakeholder participation in mitigating information risks without compromising democratic values—and it is the basket we are putting our eggs in.
Reservations remain
This endorsement, however, comes on the condition that appropriate rule-of-law safeguards are in place in each respective country. Reporting from the obliged companies must also be sufficiently comprehensive and transparent.
Access to relevant data is integral to understanding the risks and the measures to mitigate them, a point raised repeatedly throughout this most recent European Commission workshop as a criticism of the annual process. Some civil society organizations are even contemplating pulling out of meetings with Big Tech altogether until the companies hand over the hard evidence needed to assess the accuracy of the analysis they have provided.
Related to these stipulations are significant rollbacks in commitments to combat disinformation from some of the major platforms and search engines, which have increasingly been shown to change their policies when expedient. Notably, according to Democracy Reporting International, platforms have reduced the number of measures committed to in the recently integrated DSA Code of Conduct on Disinformation by almost one-third. As political instability rises across Europe and beyond, the challenges to stringent enforcement of the DSA, or any similar legislation, are apparent.
Even with these obstacles, those based outside the EU in the business of intermediary liability have taken an interest in this proactive and co-regulatory approach to promoting information integrity and societal resilience. Some are seeking to mimic a larger part of its regulatory framework in their own jurisdictions, while others are exploring opportunities to benefit from its standing EU-based mechanisms. Civil society groups from around the world have also come together to leverage its framework, and UNESCO has produced guidelines supporting a risk assessment and mitigation methodology to promote better digital governance globally.
As the press continues to make the most of the DSA, we advocate for the global digital rights community to take the cue and push forward as a united front to fight the spread of disinformation. Though contingent on sufficient transparency, standing behind the call for consistent regulation in the form of a risk assessment and mitigation framework will foster the more accountable digital environment the rights community seeks.
IMS’ High Level Expert Group for Resilience Building in Eastern Europe (HLEG) comprised 14 experts, seven from the Eastern European region and seven with global profiles, tasked with offering recommendations on the best way forward for a tailored implementation of DSA-inspired legislation in the candidate countries Georgia, Moldova, and Ukraine. The project was set in the context of the “super election” year (2024) and was supported by the Danish Ministry of Foreign Affairs.
About the authors

Colette Wahlqvist is an international human rights lawyer and advisor at IMS (International Media Support), a media development organization working to reduce conflict, strengthen democracy and facilitate dialogue in countries of armed conflict, political transition or human insecurity. She leads research, evidence-based advocacy and the advancement of strategic partnerships at the intersection of journalist safety and digital rights. From 2024 to 2025, Colette served as lead of the IMS Secretariat for the High-Level Expert Group for Resilience Building in Eastern Europe, tasked with offering recommendations on the best way forward for a tailored implementation of EU Digital Services Act-inspired legislation in Georgia, Moldova, and Ukraine. She also fulfils a leadership role within the Coalition Against Online Violence and represents IMS on the Consultative Network of the Media Freedom Coalition.

Maksym Dvorovyi is a Kyiv-based digital rights and media law and policy expert currently serving as Senior Legal Counsel at Digital Security Lab Ukraine (DSLU), a non-governmental organisation dedicated to advocating for a human-rights-centred digital environment. He is also a PhD student at the National University of Kyiv-Mohyla Academy, researching the implications of the Brussels effect on platform regulation in Ukraine. Since 2015, he has actively contributed to media law and policy reforms in Ukraine, focusing on areas such as public service broadcasting, audiovisual and online media, and platform regulation. He was one of the drafters of Ukraine’s Law on Media and currently assists local authorities in the DSA implementation. Internationally, he has been working on making platforms more accountable in their content moderation efforts in times of conflict and ensuring a common approach to the DSA transposition throughout the EU Candidate States.

Dr. Nina Shengelia, a UK-qualified lawyer based in Tbilisi, Georgia (currently a Policy Leader Fellow at the European University Institute in Florence), combines her expertise in academia, legal practice, and media literacy with a focus on disinformation. Her research centers on platform governance, digital rights, and the extraterritorial impact of the EU’s Digital Services Act (DSA), with a specific focus on Georgia, Ukraine, and Moldova. Her work includes analysis of trusted flagger mechanisms and systemic risk assessments under the DSA framework. Dr. Shengelia serves on the High-Level Expert Group at International Media Support (IMS) that explores the extraterritorial application of the DSA with a particular focus on Georgia, Moldova, and Ukraine.