First evaluation report on the Code of Practice on Disinformation (CoPD) in Bulgaria and Romania
Published Friday 19 January 2024 at 11:47
Executive Summary
The EU strategic approach to disinformation does not envisage unified EU legislation directly targeting disinformation and foreign information manipulation and interference (FIMI), but instead recommends the use of legal and financial instruments in a number of interrelated areas. The European Democracy Action Plan includes combating disinformation as one of its main objectives, together with promoting free and fair elections and strengthening media freedom. One of the key objectives of the plan is to transform the Code of Practice on Disinformation, initially adopted in 2018, into a tool for co-regulating the obligations and liability of online platforms. Recognizing the lack of real progress in the implementation of the Code, the Commission pushed for consensus on the Strengthened Code of Practice on Disinformation in 2022. The new Code is intended to operate as part of a broader regulatory framework, in combination with the recently launched legislation on the Transparency and Targeting of Political Advertising and the already enacted Digital Services Act (DSA) and Digital Markets Act (DMA). The enactment of these co-regulatory instruments should lead to a new wave of obligations and responsibilities for online platforms, creating for the first time a legally binding framework for monitoring disinformation.
In response to criticism of the original Code for its lack of measurable outcomes and clear commitments, the strengthened Code includes specific commitments, corresponding measures, service-level indicators and qualitative reporting elements to increase the accountability of very large online platforms (VLOPs) and very large online search engines (VLOSEs). The DSA and DMA reinforce the status of the Code, as compliance with the Code or a similar code is considered an appropriate risk mitigation measure for online platforms.
The external evaluations of the platforms' transparency reports published in January 2023, and of the delayed "July 2023" reports ultimately published in October 2023, have shown significant shortcomings in the self-regulatory effectiveness of the Code, particularly with regard to small language markets such as Bulgaria and Romania, which the tech platforms do not treat as a priority. The underdeveloped institutional and regulatory framework for combating disinformation and FIMI confirms the need for more effective implementation of the Code as an important measure for improving societal resilience in these countries. The case of Facebook's problematic moderation practices in Bulgaria, which led to the termination of the contract between Meta and the subcontracted company responsible for moderating Bulgarian-language content, demonstrated the shortcomings and ineffective implementation of Meta's commitments under the Code. The lack of information on Meta's response to inauthentic behaviour, even after an official request from the relevant state authority, as well as the very modest response (the removal of two accounts over a six-month period) that Meta finally disclosed, demonstrated both non-compliance with the commitments undertaken under the Code and the lack of mechanisms to hold Meta accountable to society and to state authorities alike.
The analysis of the implementation of the Code of Practice on Disinformation by the main online platforms in Bulgaria and Romania for the first six months of 2023 illustrates the multifaceted reality of the fight against disinformation and the challenges platforms face in complying with their commitments. Moreover, political discussions on the implementation of the European regulations in both countries focus on the introduction of the DSA, while the Code remains in the background. In Bulgaria in particular, national institutions and regulators have yet to formulate a comprehensive, long-term strategic vision for combating disinformation, including the implementation of the relevant European regulations. There is also a lack of consistent commitment from key political leaders to counter foreign influence operations and disinformation campaigns, whether conducted through social media or traditional media. Suspicions of political interference in the work of the national media regulators in both Bulgaria and Romania call for the creation of conditions to ensure the independence of these regulators in the future.
This is particularly true in light of the designation of the national Digital Services Coordinator under the DSA, a role expected to carry even broader powers and responsibilities. Some of the main conclusions and recommendations for the implementation of the Code in both countries are:
● The necessary national legislation and the corresponding institutional framework (e.g. the Digital Services Coordinators under the DSA) should be developed to make full use of the existing EU co-regulatory framework, i.e. the DSA, the DMA and the Code of Practice on Disinformation, in order to achieve better enforcement of the obligations and commitments of tech platforms in Bulgaria and Romania.
● Coordination between the European Commission and the relevant national authorities should be strengthened to improve the implementation of the respective Code commitments in both countries.
● More comprehensive and time-series data at the country level are needed for all Code measures. Where structured data are not available, more specific and detailed information should be provided. With respect to both structured data and information, VLOPs and VLOSEs must also provide non-anonymized information, particularly the names of removed, blocked or restricted pages and accounts. Data and information for consecutive years should be harmonized to allow for comparative analysis across years. Where necessary, newly added data or information should be complemented by historical series, where available.
● Where possible, the templates for the data provided by the platforms should be standardized across reporting rounds (i.e. years) and across platforms, and the data formats should allow for automated processing by researchers; a minimal sketch of what such a standardized record might look like follows this list. Ideally, the EC could provide a data space (or data warehouse) built specifically for this purpose, with consistent requirements and rules for data integration and with appropriate access rights and obligations.
● The use of labels and warning systems by VLOPs should be clearer and more consistent. In this context, the development of fact-checking and pre-bunking capacities in both countries should be a coordinated initiative of both public authorities and tech platforms.
● VLOPs should share compliance data that includes broad and representative examples of moderated, demonetized, and removed content. This is essential to enable researchers to verify that platforms are implementing their policies accurately and to ensure that authentic content and accounts are not unfairly suppressed due to bias or errors in algorithmic or human moderation.
● Monitoring and evaluation of the implementation of the Code by relevant government authorities is crucial, but independent assessments by researchers, academics, and NGOs are also urgently needed to provide insights into how well these platforms are complying with the Code. In addition, such cooperation could help reduce political interference in the design and work of the media regulatory bodies in both countries.
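To illustrate the kind of standardized, machine-readable reporting format recommended above, the following is a minimal sketch in Python. It assumes a hypothetical record schema (the field and metric names are illustrative assumptions, not drawn from the Code or from any platform's actual reports) in which each country-level data point for a Code measure is one JSON line, so that researchers can process and compare reports across platforms and reporting periods automatically.

```python
# Minimal sketch of a hypothetical standardized record for country-level
# CoP reporting data. All field and metric names are illustrative
# assumptions, not an official schema.
from dataclasses import dataclass
from typing import List
import json


@dataclass
class CoPMeasureRecord:
    """One country-level data point for a single Code measure."""
    platform: str      # e.g. "Meta", "Google", "TikTok"
    country: str       # ISO 3166-1 alpha-2 code, e.g. "BG", "RO"
    measure_id: str    # identifier of the Code measure being reported on
    period_start: str  # ISO 8601 date, start of the reporting window
    period_end: str    # ISO 8601 date, end of the reporting window
    metric: str        # e.g. "accounts_removed", "posts_labelled"
    value: float       # reported quantity for the metric


def load_records(path: str) -> List[CoPMeasureRecord]:
    """Parse a JSON-lines file of standardized records."""
    records = []
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            if line.strip():
                records.append(CoPMeasureRecord(**json.loads(line)))
    return records


def time_series(records: List[CoPMeasureRecord], platform: str,
                country: str, metric: str) -> List[CoPMeasureRecord]:
    """Select one platform/country/metric series, ordered by period,
    enabling the cross-year comparisons the report calls for."""
    series = [r for r in records
              if r.platform == platform
              and r.country == country
              and r.metric == metric]
    return sorted(series, key=lambda r: r.period_start)
```

Under such a format, the same query (for example, accounts removed in Bulgaria, period by period) would work identically for every platform and every reporting round, which is precisely what the current heterogeneous report formats prevent.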
Full report: