Important!
Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs) are those with more than 45 million monthly active users in the European Union that have been designated as such by the European Commission — as is the case for Facebook, TikTok, YouTube, Instagram, X, LinkedIn, and others. They fall under the European Commission’s direct supervision for the most stringent obligations set out in the Digital Services Act. For the remaining obligations under the DSA, which also apply to smaller intermediary service providers, supervisory responsibilities are shared with the Digital Services Coordinator in the country where the provider is established (in the case of these major platforms, this is Coimisiún na Meán (CnaM) of Ireland).
Notice and action mechanism
Hosting service providers — a category that includes online platforms — are required to implement mechanisms that allow any natural or legal person to notify them of content that the person considers illegal. This enables users to report to online platforms (and other hosting service providers) any type of online content that may violate Romanian legislation on electoral matters.
Obligation to apply Terms and Conditions diligently, objectively and proportionately
Under the Digital Services Act, all intermediary service providers must apply their Terms and Conditions in a diligent, objective, and proportionate manner. In the context of elections and pre-election periods, it should be noted that certain very large online platforms — such as Facebook, Instagram, TikTok, YouTube, X, LinkedIn — prohibit political advertising. Thus, users may report to these platforms content that could be considered political advertising.
ANCOM, designated by law as Romania’s Digital Services Coordinator, is responsible for supervising intermediary service providers established within the country and ensuring their compliance with the Digital Services Act.
The Authority oversees all aspects of supervision and enforcement of the Digital Services Act in relation to these providers. It is also responsible for developing secondary legislation and serves as the single point of contact for communication with the European Commission and with Digital Services Coordinators in other EU Member States.
In cases of non-compliance with the Digital Services Act by intermediary service providers established in Romania, ANCOM may apply sanctions, including significant fines.
Handling complaints about breaches of the Digital Services Act
Users in Romania who believe that intermediary service providers have failed to comply with their obligations under the Digital Services Act — for example, by not enabling the reporting of potentially illegal content or by failing to provide access to an internal complaint-handling system — may submit a complaint to ANCOM, either directly or through a body, organisation, or association mandated to represent them, in accordance with Article 53 of the Digital Services Act. If the complaint concerns a provider that is not established in Romania, ANCOM will forward it to the Digital Services Coordinator in the Member State where the provider is established.
Identification and mitigation of systemic risks by VLOPs and VLOSEs
VLOPs and VLOSEs are required to identify and assess systemic risks, such as the dissemination of illegal content, adverse effects on the exercise of fundamental rights, and negative impacts on civic discourse, electoral processes, and public security. In accordance with Articles 34 and 35 of the Digital Services Act, these providers must also implement reasonable, proportionate, and effective measures to mitigate the identified systemic risks.
In the context of electoral processes, two key documents are relevant and should guide the actions of VLOPs and VLOSEs in managing the risks associated with these processes:
- Guidelines for providers of Very Large Online Platforms and Very Large Online Search Engines on the mitigation of systemic risks for electoral processes.
- Code of Conduct on Countering Illegal Hate Speech Online.
Important!
The assessment of whether content related to political advertising on online platforms is legal or illegal falls under the responsibility of the competent authorities — that is, public bodies or institutions with supervisory powers in a specific sector or field. These authorities are empowered to issue orders to act against illegal content or orders to provide information.
Judicial authorities may also issue such orders in the context of legal proceedings or investigations they carry out in accordance with their statutory powers.
In short, authorities that hold offline competences in a particular domain retain those same competences in the online environment. They can therefore take appropriate action against illegal online content within the domain they oversee.
