The Guidelines on the protection of minors online, developed by the European Commission under the Digital Services Act (DSA), set out a non-exhaustive list of proportionate and appropriate measures to protect children from online risks such as grooming, harmful content, problematic and addictive behaviours, as well as cyberbullying and harmful commercial practices.
The Guidelines will apply to all online platforms accessible to minors, with the exception of micro and small enterprises. Key recommendations include the following:
- Setting minors’ accounts to private by default, so that their personal information, data, and social media content are hidden from users they are not connected with, reducing the risk of unsolicited contact by strangers (see the configuration sketch after this list).
- Modifying the platforms’ recommender systems to lower the risk of children encountering harmful content or getting stuck in rabbit holes of specific content. Platforms are also recommended to give children more control over their feeds.
- Enabling children to block and mute any user and ensuring they can’t be added to groups without their explicit consent, which could help prevent cyberbullying.
- Prohibiting accounts from downloading or taking screenshots of content posted by minors to prevent the unwanted distribution of sexualised or intimate content and sexual extortion.
- Disabling by default features that contribute to excessive use, such as communication "streaks," ephemeral content, "read receipts," autoplay, and push notifications. Moreover, platforms are recommended to remove persuasive design features aimed predominantly at engagement and to put safeguards around AI chatbots integrated into online platforms.
- Ensuring that children’s lack of commercial literacy is not exploited and that they are not exposed to commercial practices that may be manipulative, lead to unwanted spending or addictive behaviours, including certain virtual currencies or loot-boxes.
- Introducing measures to improve content moderation and reporting tools, requiring prompt feedback on users’ reports, and setting minimum requirements for parental control tools.
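Many of these safety-by-default measures ultimately translate into account provisioning logic on the platform side. As a minimal, hypothetical sketch of that idea (the type, field, and function names below are illustrative assumptions, not taken from the Guidelines), a platform might apply a protective default profile whenever an account belongs to a minor:

```typescript
// Hypothetical account-settings model; names are illustrative only.
interface AccountSettings {
  profileVisibility: "private" | "public";
  allowGroupAddWithoutConsent: boolean;
  allowScreenshotsOfPosts: boolean;
  streaksEnabled: boolean;
  readReceiptsEnabled: boolean;
  autoplayEnabled: boolean;
  pushNotificationsEnabled: boolean;
}

// Protective defaults reflecting the Guidelines' recommendations:
// private by default, no silent group additions, and engagement-driving
// features switched off unless deliberately enabled later.
const MINOR_DEFAULTS: AccountSettings = {
  profileVisibility: "private",
  allowGroupAddWithoutConsent: false,
  allowScreenshotsOfPosts: false,
  streaksEnabled: false,
  readReceiptsEnabled: false,
  autoplayEnabled: false,
  pushNotificationsEnabled: false,
};

const ADULT_DEFAULTS: AccountSettings = {
  profileVisibility: "public",
  allowGroupAddWithoutConsent: true,
  allowScreenshotsOfPosts: true,
  streaksEnabled: true,
  readReceiptsEnabled: true,
  autoplayEnabled: true,
  pushNotificationsEnabled: true,
};

// Applied once at account creation; isMinor would come from the
// platform's age-assurance step (discussed below).
function defaultSettingsFor(isMinor: boolean): AccountSettings {
  return isMinor ? { ...MINOR_DEFAULTS } : { ...ADULT_DEFAULTS };
}
```

The design choice this sketch captures is that, under a safety-by-design approach, protective settings are the starting point for minors rather than an opt-in.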
The Guidelines also recommend the use of effective age assurance methods, provided that they are accurate, reliable, robust, non-intrusive, and non-discriminatory. Age verification is recommended in particular for restricting access to adult content (such as pornography and gambling) or where national rules set a minimum age to access certain services. The blueprint for age verification, on which applications can be built, as well as the forthcoming EU Digital Identity Wallets, will provide compliance examples and reference standards for a device-based method of age verification. In other cases, such as where terms and conditions prescribe a minimum age lower than 18 due to identified risks to minors, the Guidelines recommend age estimation instead.
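The distinction between verification and estimation can be read as a simple decision rule. The sketch below is a hypothetical illustration of that rule only; the type names, fields, and structure are assumptions for illustration, not prescribed by the Guidelines:

```typescript
// Hypothetical decision logic for choosing an age-assurance method;
// all names here are illustrative, not taken from the Guidelines.
type AgeAssuranceMethod = "verification" | "estimation" | "none";

interface ServiceRiskProfile {
  adultContent: boolean;       // e.g. pornography or gambling
  nationalMinimumAge?: number; // minimum age set by national law, if any
  termsMinimumAge?: number;    // minimum age in the platform's own T&Cs
}

function requiredAgeAssurance(profile: ServiceRiskProfile): AgeAssuranceMethod {
  // Adult content, or a minimum age set by national rules: the
  // Guidelines point to robust age verification, for which a
  // device-based method (e.g. built on the age verification blueprint
  // or the EU Digital Identity Wallets) could serve as a reference.
  if (profile.adultContent || profile.nationalMinimumAge !== undefined) {
    return "verification";
  }
  // A self-imposed minimum age below 18, driven by identified risks
  // to minors: here the Guidelines recommend age estimation instead.
  if (profile.termsMinimumAge !== undefined && profile.termsMinimumAge < 18) {
    return "estimation";
  }
  return "none";
}
```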
Like the DSA, the Guidelines adopt a risk-based approach, recognising that online platforms may pose different types of risks to minors depending on their nature, size, purpose, and user base. The Guidelines enshrine a safety- and privacy-by-design approach and are grounded in children’s rights. Platforms should ensure that the measures they take do not disproportionately or unduly restrict children’s rights.
The Commission and ANCOM will use these Guidelines to assess compliance with Article 28(1) of the DSA by providers of online platforms. However, following the Guidelines is voluntary and does not automatically guarantee compliance with the DSA.