
Supervision of the designated very large online platforms and search engines under the DSA

26.11.2024

According to the DSA Regulation, in any context - including that of the electoral process - the supervision of very large online platforms (VLOPs, such as TikTok) and very large online search engines (VLOSEs, such as Google or Bing) lies within the competence of the European Commission.

ANCOM is the national Digital Services Coordinator; however, in this capacity it cannot take any measures against VLOPs or VLOSEs. ANCOM's area of responsibility is limited to providers of intermediary services established on the national territory.

ANCOM has consistently ensured the framework needed for good collaboration between the parties involved: the large platforms and the relevant authority, the Permanent Electoral Authority (PEA). Furthermore, ANCOM has been in permanent contact with the European Commission since February 2024, within the limits of its responsibilities under the DSA Regulation and Law no. 50/2024.

Throughout the electoral campaigns, ANCOM worked closely with the PEA - the authority responsible for the smooth running of the electoral process - the relevant ministry, and the European Commission. According to the data available to ANCOM, the PEA, in compliance with its legal powers and attributions, sent notifications to the TikTok platform reporting various irregularities related to the distribution of illegal content and requesting the necessary measures for the lawful conduct of the electoral campaign in Romania; however, TikTok did not act promptly on the Romanian authority's requests. Similar requests were also sent to other digital platforms.

The situation was brought to the attention of the European Commission during the discussions organized by ANCOM together with the PEA in recent months, including today.

Obligations for VLOPs and VLOSEs

According to Article 34 of the DSA, VLOPs and VLOSEs shall assess all systemic risks at least once a year and, in any event, prior to deploying new functionalities that are likely to have a critical impact (e.g. new functionalities in the TikTok Shop). The risk assessments shall be submitted to the European Commission and, upon request, to the Digital Services Coordinator of establishment (in this case, Ireland). These risks include:

  • the dissemination of illegal content through their services;
  • any actual or foreseeable negative effects on the exercise of fundamental rights;
  • any actual or foreseeable negative effects on civic discourse and electoral processes, and public security;
  • any actual or foreseeable negative effects in relation to gender-based violence, the protection of public health and minors, and serious negative consequences to a person's physical and mental well-being.

Furthermore, VLOPs and VLOSEs shall put in place reasonable, proportionate and effective risk mitigation measures, tailored to the specific systemic risks identified pursuant to Article 34, paying particular attention to the impact of those measures on fundamental rights.

Because elections are being held in many countries during this period, the European Commission has adopted a set of guidelines to be considered by VLOPs and VLOSEs in mitigating the systemic risks to electoral processes. The Commission's guidelines are publicly available on EUR-Lex (document 52024XC03014).

ANCOM and the PEA have been informing the major platforms of their obligations in the electoral context since August.

VLOPs and VLOSEs also have transparency reporting obligations, which must cover their content moderation activities. The reports, submitted to the European Commission, must include:

  • the human resources dedicated by the providers of very large online platforms to content moderation in respect of the service offered in the Union, broken down by each official language of the Member States;
  • the language skills and qualifications of the content moderators;
  • accuracy indicators and related information for each official language of the Member States.

Important!

ANCOM cannot rule on the legality or illegality of published content that falls outside its competence. Public authorities or institutions with supervisory attributions over a certain sector or field of activity, referred to as relevant authorities, are competent to issue orders to act against illegal content or orders to provide information, according to Articles 9 and 10 of the Regulation.

Orders may also be issued by judicial authorities in the context of actions, activities or procedures carried out by them according to their respective legal powers.

ANCOM cannot take measures related to the organization and conduct of elections.

Authorities that hold competences in a certain area in the offline environment hold the same competences in the online environment; they can therefore take action against illegal content online in the fields they oversee.

ANCOM remains available to provide any interested entities with further information within its sphere of competence.