MVI Methodology: Monitoring Framework for Code of Conduct on Disinformation
Published Wednesday 14 January 2026 at 18:00
Executive Summary
Research partners in Bulgaria and Romania, together with the European Digital Media Observatory (EDMO), have created a first-of-its-kind methodology (Materiality, Verifiability, Impact, or MVI) for monitoring the compliance of Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs) with the EU’s strengthened Code of Conduct on Disinformation.
This report is situated within the evolving regulatory landscape of the Digital Services Act (DSA), which, as of July 2025, elevated the Code from a voluntary commitment to a formal compliance benchmark. The primary objective of this document is to finalize a robust assessment model and validate it through a pilot analysis of the September 2025 transparency reports submitted by Meta, Google, TikTok, and LinkedIn.
The core of this methodology is the MVI Framework, which evaluates platform performance across three distinct axes:
- Materiality (M): Assessing the existence, deployment, and enforcement of required systems, tools, and policies.
- Verifiability (V): Evaluating the quality, precise definition, and auditability of the evidence provided.
- Impact (I): Measuring the scope and granularity of quantitative outcome metrics, particularly at the Member State level.
By applying a Granularity Cap, the methodology ensures that platforms providing only global or EU-aggregate data—thereby obscuring national performance—are structurally penalized in their final scores.
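The Granularity Cap can be expressed as a simple scoring rule. The sketch below is illustrative only: the `MVIScore` type, the 0-3 scale (consistent with the M=3 and V≤1, I≤1 values cited in this report), and the function name are assumptions for demonstration, not the report's actual implementation.

```python
from dataclasses import dataclass

@dataclass
class MVIScore:
    materiality: int    # 0-3: existence, deployment, enforcement of systems
    verifiability: int  # 0-3: quality and auditability of evidence
    impact: int         # 0-3: scope and granularity of outcome metrics

def apply_granularity_cap(score: MVIScore, has_member_state_data: bool) -> MVIScore:
    """Cap V and I at 1 when a platform reports only global or
    EU-aggregate figures, per the report's V<=1, I<=1 rule."""
    if has_member_state_data:
        return score
    return MVIScore(
        materiality=score.materiality,          # M is unaffected by the cap
        verifiability=min(score.verifiability, 1),
        impact=min(score.impact, 1),
    )

# Example: strong infrastructure, but only global statistics disclosed
raw = MVIScore(materiality=3, verifiability=2, impact=3)
capped = apply_granularity_cap(raw, has_member_state_data=False)
print(capped)  # MVIScore(materiality=3, verifiability=1, impact=1)
```

This structure makes the report's central finding mechanical: a platform can score the maximum on Materiality yet still end up with minimal Verifiability and Impact scores if it withholds Member State-level data.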
The framework is specifically tailored to the regulatory and media environments of Bulgaria and Romania, regions characterized by high vulnerability to coordinated disinformation and structural deficits in media literacy. In addition, a comparative case study applying the methodology in Slovenia, Croatia, Bulgaria, and Romania was conducted shortly after the pilot. A paper presenting the results and the methodology itself will be published by Taylor & Francis Group as a chapter in a special edition book called “Big Tech, Society and Systemic Risks.”
Key Findings
1. The Materiality-Transparency Paradox
A recurring theme across all six platform audits (Facebook, Instagram, Google Search, YouTube, TikTok, and LinkedIn) is a high level of Materiality (M=3) coupled with critically low Verifiability and Impact scores. This indicates that while platforms have built the required infrastructure—such as ad repositories, researcher APIs, and fact-checking partnerships—they systematically fail to provide the localized, quantitative evidence necessary for national regulators in Bulgaria and Romania to audit their effectiveness.
2. Universal Impact of the Granularity Cap
The most significant constraint on compliance scores is the systemic absence of Member State-level data. For example, Meta reported actioning 1.1–1.4 billion fake accounts globally, and Google reported enforcing policies against 37.4 million domains, yet neither provided the specific breakdown for Bulgaria or Romania. Under the MVI rules, this triggers a mandatory score cap (V≤1, I≤1), preventing platforms from using large global statistics to mask a lack of national accountability.
3. Strategic Withdrawal from Commitments
The H1 2025 reports revealed a troubling trend of platforms retracting from previously held commitments. Google formally withdrew from all fact-checking and political advertising chapters, while LinkedIn executed a broad withdrawal across research access and fact-checking. TikTok also withdrew from all political advertising commitments, likely in response to the stringent requirements of the new EU Regulation on the Transparency and Targeting of Political Advertising (TTPA).
4. Exceptional vs. Consistent Transparency
The pilot analysis identified rare instances where platforms successfully bypassed the Granularity Cap, demonstrating that localized reporting is technically feasible.
- YouTube provided a specific language breakdown for Romania, noting the termination of 80 Romanian-language channels as part of coordinated influence operations.
- TikTok provided substantial quantitative data during the Romanian Election Crisis, detailing 2 million Election Center visits and 45 million video views, resulting in the highest impact scores in the study.
- However, these instances remain isolated exceptions rather than standard practice, with Bulgaria consistently receiving less granular data than Romania.
5. Regional Vulnerabilities and Moderation Gaps
Bulgaria and Romania remain highly susceptible to foreign malign narratives, often amplified by local political actors. A critical weakness identified is the under-investment in local-language moderation. As small language markets, these countries lack the necessary linguistic and cultural expertise from global platforms to effectively identify and disrupt coordinated campaigns before they reach a wide audience.
6. Deficiencies in Research Access and Appeals
Across the board, transparency regarding researcher access and user appeals is weak. LinkedIn’s research program remains in beta with no usage metrics, while Meta and Google provide only global aggregates for research tool uptake. Furthermore, several platforms explicitly admitted that "no reporting is possible" for certain key metrics, such as ad appeals, representing a major failure in due process transparency.
Conclusion
Evaluating platform compliance using global aggregates rather than national data is like a global shipping company claiming its fleet is perfectly safe because 99% of its ships worldwide arrived on time, while ignoring the fact that every single ship destined for two specific small ports (Bulgaria and Romania) has consistently sunk. Without seeing the data for those specific ports, the local authorities cannot know the actual risk to their waters.
The 2025 pilot demonstrates that the primary accountability gap has shifted from an absence of policy to an absence of audit-ready, country-specific outcome data. For the co-regulatory regime of the DSA to be effective, platforms must transition from descriptive, global reporting to systematic, granular disclosures that reflect the real-world impact of their interventions in Member States like Bulgaria and Romania.
