
Commission endorses the integration of the voluntary Code of Practice on Disinformation into the Digital Services Act

The Commission and the European Board for Digital Services endorsed the integration of the voluntary Code of Practice on Disinformation into the framework of the Digital Services Act (DSA). This integration will make the Code a benchmark for determining platforms' compliance with the DSA. The new Code has 42 signatories, including very large online platforms and search engines (Google, Meta, Microsoft and TikTok).

The Code will become a significant and meaningful benchmark for determining DSA compliance. Compliance with the commitments under the Code will also be part of the annual independent audit, which these platforms are subject to under the DSA.

The Code of Conduct on Disinformation

The Code is a widely recognised, robust set of commitments that together constitute a strong set of mitigation measures for DSA compliance. The value of these commitments lies in the fact that they are the result of an agreement between a broad set of actors, building on existing best industry practices. Taking into account the complexity and challenges related to tackling the spread of disinformation, the Code contains different, but interlinked areas:

  • Demonetisation: cutting financial incentives for purveyors of disinformation;
  • Transparency of political advertising: more efficient labelling for users to recognise political advertising;
  • Ensuring the integrity of services: reducing fake accounts, bot-driven amplification, malicious deep fakes and other manipulative behaviour used to spread disinformation;
  • Empowering users, researchers and the fact-checking community: better tools for users to identify disinformation, wider access to data, fact-checking coverage across the EU.

These measures combat disinformation risks while fully upholding free speech and enhancing transparency.

As part of their respective assessments of whether the Code meets the criteria specified under Article 45 of the DSA, the Commission and the European Board for Digital Services encourage the signatory platforms to take into account several recommendations when implementing the Code of Conduct on Disinformation.

These recommendations include promptly finalising the Rapid Response System to cover all national elections and crises and implementing it effectively; a swift Taskforce discussion and concrete follow-up regarding their commitments in the key areas mentioned above; and providing all the necessary data to fill the gaps in their reporting and to allow the further development and efficient measurement of structural indicators, including new ones.

The legal framework for online platforms regarding electoral processes

ANCOM highlights the importance of two other documents that should guide online platforms with regard to electoral processes and serve as a benchmark for participants holding different roles in the electoral context:

The Authority reiterates the important role of the European Commission in relation to large platforms, especially with regard to the application of Articles 34 and 35 of the Digital Services Act. These two articles oblige VLOPs and VLOSEs (very large online platforms and very large online search engines) to identify and analyse systemic risks associated with their services, including those related to disinformation, public safety and the impact on fundamental rights, and to implement measures to mitigate them.

Background

As provided for in Article 45 of the Digital Services Act, the European Commission and the European Board for Digital Services encourage and facilitate the development of voluntary codes of conduct at Union level that contribute to the proper application of the Digital Services Act, taking into account in particular the specific challenges related to addressing different types of illegal content and systemic risks. The Code of Practice on Disinformation will take effect on 1 July 2025. The first iteration of the Code was signed in 2018 and revised in 2022.
