Very large online platforms have the obligation to mitigate systemic risks during electoral processes

17.04.2025

Providers of very large online platforms and very large online search engines are required to assess and mitigate the systemic risks identified, in accordance with Articles 34 and 35 of the Digital Services Act. This obligation derives from the scale of their services (over 45 million monthly active users in the European Union).

The systemic risks are the following: the dissemination of illegal content; negative effects on the exercise of fundamental rights; negative effects on civic discourse, electoral processes and public safety; negative effects in relation to gender-based violence and to the protection of public health and of minors; and serious negative consequences for a person's physical and mental well-being.

The measures for mitigating the identified systemic risks must be reasonable, proportionate and effective, and providers of very large online platforms or very large online search engines must pay particular attention to the impact of those measures on fundamental rights.

To support providers of very large online platforms and very large online search engines in managing the risks associated with electoral processes and in complying with their obligations under the Digital Services Act, the European Commission has published Guidelines for providers of very large online platforms and very large online search engines on mitigating systemic risks to electoral processes.

Guidelines on mitigating systemic risks to electoral processes

The guidelines recommend the adoption of risk mitigation measures and best practices by very large online platforms and very large online search engines before, during and after elections, such as:

  • strengthening internal processes, including by setting up internal teams with adequate resources that are able to analyse the available information on the risks specific to the local context and on how their services are used to search for and obtain information before, during and after elections;
  • implementing election-specific measures, adapted to each electoral period and each local context. These include promoting official information on electoral processes, media literacy initiatives, and adapting recommendation algorithms to empower users and to reduce the monetization and virality of content that threatens the integrity of electoral processes. In addition, political advertising should be clearly labelled, in line with the new Regulation on the transparency and targeting of political advertising;
  • adopting specific measures to mitigate risks related to generative artificial intelligence: very large online platforms and very large online search engines whose services could be used to create or disseminate AI-generated content should assess and mitigate the related risks by clearly labelling AI-generated content (such as deepfakes), adjusting the terms and conditions of use and enforcing them;
  • cooperating with authorities, independent experts and civil society, at national and European level, to facilitate the effective exchange of information and the application of appropriate risk mitigation measures, including in the areas of foreign information manipulation and interference, disinformation and cybersecurity;
  • adopting targeted risk mitigation measures, including an incident response mechanism, during the election period to limit the impact of events that could significantly influence the outcome of elections or voter turnout;
  • assessing the effectiveness of risk mitigation measures through post-election reviews. Very large online platforms and very large online search engines should publish a non-confidential version of these assessments to allow for public feedback on the measures taken.

Very large online platforms and very large online search engines that do not apply the recommendations set out in these guidelines must demonstrate to the European Commission that the measures taken are equally effective in mitigating risks. If the information received casts doubt on the adequacy of such measures, the European Commission may request additional information or may initiate formal procedures under the DSA.

The European Commission has exclusive competence to supervise and ensure compliance by providers of very large online platforms and very large online search engines with the obligations set out in Articles 34 and 35 (risk assessment and mitigation) of the Digital Services Act.

In developing the guidelines, the Commission also cooperated with the Digital Services Coordinators, which are members of the European Board for Digital Services.

ANCOM as Digital Services Coordinator (DSC)

ANCOM's role is to supervise intermediary service providers established in Romania and to ensure their compliance with the DSA. Where such providers fail to comply with the provisions of the DSA, sanctions may be applied, including significant fines.