Third evaluation report on the CoPD in Bulgaria and Romania
Published Sunday 25 May 2025 at 22:55

Executive Summary
This third report on the implementation of the Digital Services Act (DSA) and the Code of Practice on Disinformation (CoPD) in Romania and Bulgaria, prepared by the Bulgarian-Romanian Observatory of Digital Media (BROD), provides essential insights into institutional and policy developments in the two countries, and into the scope of the actions taken by online platforms to meet their obligations to address the systemic risk of disinformation among users in the region. It highlights systemic challenges and charts actionable strategies for more effective responses to information integrity threats and digital governance.
Romania has actively pursued the establishment of ANCOM as its Digital Services Coordinator (DSC), adopting key legislation and procedural mechanisms. Nonetheless, the annulment of the November 2024 presidential elections due to significant online manipulation and suspected foreign interference, primarily through TikTok, underscored critical vulnerabilities. This situation led to enhanced national legislative measures and direct scrutiny by the European Commission, emphasizing urgent needs in institutional preparedness, technical expertise, and transparency. Despite significant progress, Romania faces challenges including inadequate internal expertise within ANCOM, the absence of structured advisory bodies, limited trusted flagger networks, and restricted access for independent researchers to platform data.
In Bulgaria, progress on DSA implementation has been notably slower. Although the Communications Regulation Commission (CRC) was officially appointed as DSC, the lack of adequate empowerment through legislation and effective enforcement has hindered compliance, triggering infringement procedures by the European Commission. The Bulgarian digital information environment is marked by vulnerabilities such as extensive disinformation campaigns, often linked to external actors like Russia, limited media freedom, low media literacy among the public, and insufficient transparency and accountability from platforms. Specific gaps include delayed legislative action, fragmented institutional mandates, the lack of a national strategic framework, the absence of trusted flaggers, insufficient resources for fact-checking initiatives, and the disadvantages of a smaller market.
Evaluations of CoPD implementation indicate notable but varied engagement by Very Large Online Platforms (VLOPs). While platforms made visible efforts, implementation remains uneven and fragmented at the national level, lacking consistency, local relevance, and contextual specificity. In Bulgaria, all four monitored platforms showed efforts across the three CoPD pillars: Advertising and Political Advertising; Integrity of Services; and Empowering Users, the Research Community, and Fact-Checkers. TikTok stood out with comprehensive data on fake account removals, video labelling, and regional fact-checking initiatives. It removed over 423,000 fake accounts during the reporting period, but these had very limited reach, suggesting a shift in strategy by malign actors. Meta applied over 34,000 fact-checking labels on Facebook and 8,300 on Instagram, though ad removals for misinformation remained relatively low at just over 1,100. Google reported millions of ad removals, though most were not specifically tied to disinformation, and provided media literacy tools such as Fact Check Explorer and Google Trends, but with limited localized data. Microsoft reported strong ad enforcement statistics, including over 8,000 ad restrictions and hundreds of thousands of blocked impressions, yet provided no disaggregated country-level data on other key CoPD pillars. Overall, gaps remain, particularly regarding the depth and specificity of reported data, limited country-specific initiatives, and inconsistent application of content moderation policies.
In Romania, the efforts of Google, Meta, and TikTok revealed varying depth and focus. Google partnered with the Ministry of Digitalisation, launched voter engagement features, and protected media and civil society organisations from cyber threats. Meta focused on scale, labelling over 80,000 political ads and removing over 6,300 ads for misinformation violations, one of the highest numbers in the EU. TikTok showed decisive action, removing 10,698 political ads (the highest figure in the EU), yet only two ads were removed under its disinformation policies. TikTok also included Romania in its rapid response systems and conducted media literacy campaigns. The European Commission's formal proceedings against TikTok regarding its role in the 2024 Romanian presidential election highlighted concerns about systemic risks and recommender systems. Significant discrepancies and limitations persist in transparency reporting, granular data provision, and strategic alignment with national contexts.
Recommendations for both countries include finalizing legislative frameworks to empower DSCs effectively, establishing structured multi-stakeholder national coordination mechanisms and independent advisory bodies, boosting technical and human capabilities of national regulators, enhancing support and resources for trusted flaggers and fact-checkers, initiating nationwide media literacy programs, improving data transparency for researchers, and implementing targeted strategies to counter foreign interference in the digital space.
Overall, effective digital governance in the EU requires robust legal frameworks, independent and empowered national authorities, transparent and accountable platform practices, multi-sector collaboration, and comprehensive public engagement. The journeys of Romania and Bulgaria highlight the challenges, emphasizing the need for determined national action and shared European commitment to safeguarding the online sphere's integrity.
The Bulgarian-Romanian Observatory of Digital Media (BROD) offers substantial regional expertise in media literacy, disinformation monitoring, and cross-border narrative mapping. National authorities in Bulgaria and Romania should recognise its role as a strategic knowledge partner and integrate it as a key actor within national multi-stakeholder cooperation frameworks, drawing on its capabilities to support contextual analysis of disinformation trends, advise on implementing the new Code of Conduct on Disinformation, and participate in the design and testing of national risk mitigation protocols. Integrating BROD's expertise and tools would provide national actors with critical capacity grounded in local context.
Full report:
