EU Code of Practice on Disinformation Becomes Code of Conduct

The voluntary Code of Practice on Disinformation is to be integrated into the framework of the EU Digital Services Act (DSA) and used as a benchmark for determining platforms’ compliance.

The Code of Practice on Disinformation was designed as a first-of-its-kind tool through which Relevant Signatories agreed, in 2018, on self-regulatory standards to fight disinformation. At the core of the EU strategy against disinformation, the Code works to limit the spread of online disinformation, including during electoral periods, and to respond quickly to crises such as the Covid-19 pandemic and the war in Ukraine.

On 13 February 2025, the European Commission and the European Board for Digital Services endorsed the integration of the 2022 Code of Practice into the framework of the DSA as a Code of Conduct on Disinformation. This integration will make the Code a benchmark for determining platforms’ compliance with the DSA.

What does this mean for the fight against disinformation?

In signing this Code, Relevant Signatories commit to carrying out a range of measures to tackle disinformation, including:

  • defunding the dissemination of disinformation and improving the policies and systems that determine the eligibility of content to be monetised
  • preventing the misuse of advertising systems to disseminate disinformation in the form of advertising messages
  • clearly labelling paid-for content in a way that allows users to understand that the content displayed contains political or issue advertising*
  • engaging in ongoing monitoring and research to understand and respond to risks related to disinformation in political or issue advertising

What does this mean for media literacy?

Part five of the Code focuses on empowering users and sets out requirements for Relevant Signatories, including the following:

  • recognise the importance of diluting the visibility and spread of disinformation by continuing to improve the findability of trustworthy content, enhancing the safe design of their services, and empowering users with dedicated tools to identify, detect, and report such content
  • recognise the potential of technology to empower users with tools to interrogate the origin and authenticity of content, helping them determine its veracity
  • recognise the importance of enhancing their efforts in the area of media literacy, including to protect and empower vulnerable groups
  • recognise the importance of intensifying their actions for a safer design and architecture of their services in order to mitigate the risks of viral propagation of disinformation
  • acknowledge the significant impact that recommender systems have on the information diet of users, and therefore recognise that recommender systems should be transparent and provide users with the possibility to modify at any time their preferred options for the way information is recommended to them
  • recognise that facilitating users’ access to tools that can support their assessment of the factual accuracy of sources – for example through fact-checks from independent fact-checking organisations or warning labels from other authoritative sources – is crucial to curbing the disinformation phenomenon
  • consider it important to research the feasibility and effectiveness of developing warnings or updates targeted at users who have interacted with content that was later removed for violating their policies
  • recognise the importance of testing and implementing technical features that help users identify and flag disinformation disseminated through private messaging applications, and of exploring with fact-checkers privacy-compliant opportunities to integrate their work into such services
  • respect the right to freedom of expression, the right to private communications, the right to protection of personal data, and the user’s right to an effective remedy, and ensure that any measures taken are not disproportionate.

Three measures are required to satisfy Commitment 17 of the Code of Conduct on Disinformation (p. 25), all of which relate directly to media literacy.

Measure 17.1 requires Relevant Signatories to design and implement, or continue to maintain, tools to improve media literacy and critical thinking, for instance by empowering users with context on the content visible on services or with guidance on how to evaluate online content.

Measure 17.2 requires Relevant Signatories to develop, promote, and/or support, or continue to run, activities to improve media literacy and critical thinking among the general public across the European Union, such as campaigns to raise awareness about disinformation and the Tactics, Techniques, and Procedures (TTPs) used by malicious actors, also considering the involvement of vulnerable communities.

Measure 17.3 calls on Relevant Signatories to partner or consult with relevant media literacy experts in the EU in the design, implementation, and impact measurement of tools, including for instance the Commission’s Media Literacy Expert Group, ERGA’s Media Literacy Action Group, EDMO and its country-specific branches, or Member State universities or organisations with relevant expertise.

The Qualitative Reporting Elements (QREs) that will demonstrate that these measures have been implemented include:

  • outlining the tools they develop or maintain to improve media literacy and critical thinking, and reporting on their deployment in each Member State
  • describing the activities they launch or support, the Member States they target and reach, and the actions taken to promote the campaigns to their user base in each Member State targeted
  • describing how they involved and partnered with media literacy experts for the purposes of all measures in this commitment.

The Service Level Indicators (SLIs) that Signatories are required to report at Member State level include:

  • metrics pertinent to assessing the effects of the tools described in the qualitative reporting element for Measure 17.1, including the total count of impressions of the tool and information on interactions/engagement with it
  • the number of media literacy and awareness-raising activities organised and/or participated in, along with quantitative information pertinent to showing the effects of the campaigns they build or support at Member State level (for instance: the list of Member States where those activities took place; the reach of campaigns; the engagement these activities have generated; the number of interactions with online assets; the number of participants).

Full adherence to the Code may be considered an appropriate risk mitigation measure for signatories designated as Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs) under the DSA. This is a positive step in the fight against disinformation and towards strengthening media literacy initiatives across a range of online platforms.

*Issue advertisements aim to persuade people to change their opinions or behaviour, rather than to sell them something.