Protecting children under the DSA

The EC’s draft guidelines on protecting underage users set a clear expectation that platforms should foster safety within their services beyond simple age gates

The EC has published its draft guidelines on protecting children under the Digital Services Act

On 13 May 2025, the EC opened a public consultation on its proposed guidelines for the protection of minors under the Digital Services Act (DSA). Under Article 28 of the DSA, platforms of all sizes that are accessible to minors, with the exception of those that qualify as small or micro enterprises, are required to implement measures to “ensure a high level of privacy, safety, and security of minors”. The same article empowers the EC to issue guidelines, in consultation with the European Board for Digital Services, to assist providers in developing appropriate and proportionate measures to meet their obligations for child safety online. In announcing the consultation, the EC emphasised that its proposal adopts a risk-based approach to understanding the harms children face online and was developed in collaboration with advocates, tech firms, civil society and academia. Alongside these guidelines, the EC is developing an interim age assurance solution for verifying users over the age of 18, expected to be available by summer 2025, and is planning the development of a Digital Fairness Act. The consultation on the guidelines will remain open for feedback until 10 June 2025.

All of the recommendations are underpinned by four principles and are to be applied on the basis of risk assessments carried out by platforms

The guidelines are framed around four core principles with which any measure that platforms put into place under the DSA should comply: 

  1. Proportionality: Matching risk to mitigation measures through an assessment of each platform’s profile;

  2. Children’s rights: Balancing minors’ rights to protection, non-discrimination, inclusion, participation, privacy, information and freedom of expression;

  3. Privacy-, safety- and security-by-design: Integrating the aims of child protection into the design and development of services; and

  4. Age-appropriate design: Aligning service design with the developmental, cognitive, and emotional needs of minors.

All platforms are expected to complete a risk assessment to underpin their response to Article 28, which should address the likelihood that minors will access their services, the mitigations already in place, an assessment of any impact on children’s rights and a review of risks using the so-called ‘5Cs’ typology (content, conduct, contact, consumer and cross-cutting risks).

Age assurance isn’t necessary for every service, but self-declaration will not be considered an effective method where assurance is needed

Before offering recommendations on the functionality of services, the EC’s guidelines outline the considerations platforms are expected to make when deciding whether to implement age assurance methods. The EC notes that age assurance may not be necessary for all platforms, particularly given its potential implications for users’ rights, including privacy and freedom of expression, and adds that other mitigating measures may be effective at preventing harm to underage users in the absence of age assurance. However, for platforms that have identified a degree of risk that requires children to be prevented from accessing a service or certain features of it, the guidelines distinguish between the circumstances in which age verification and age estimation are appropriate. For platforms that offer legally restricted goods or content, such as gambling or pornography, as well as platforms that pose such a high risk that only a strict age gate could protect minors from harm, the EC recommends stricter age verification based on trusted identifiers, such as government-issued IDs, the EU’s forthcoming age assurance solution and, eventually, the EU Digital Identity Wallet. For platforms that pose medium risks and generally allow users who are under 18 but above another specified age to access their services, the EC deems age estimation to be appropriate as well. Any age assurance method should be assessed on its accuracy, reliability, robustness, non-intrusiveness and non-discrimination. The EC states that self-declaration of age cannot meet the standards of accuracy and robustness under any circumstances and is therefore not considered effective for any platform that finds it necessary to implement assurance.

Platforms should modify account defaults and recommender systems for minor users

The EC also recommends a number of changes to service functionality to be applied to minors’ accounts to help mitigate risk. The guidelines recommend that children’s accounts be set to private by default to prevent unsolicited contact from unknown users, and that, to reduce the risk of cyberbullying, underage users should be able to block and mute any other user and should not be added to groups without their explicit consent. To address the potential harms of content recommender systems, the EC suggests that platforms rely on explicit, user-provided signals rather than implicit behaviour to target content to minors. Specifically, the EC warns against ongoing surveillance of minors’ behaviour while they use a service and against the use of implicit signals, such as watch time or click-through rates, to inform recommender systems. According to the guidelines, minors should be given the option to permanently and completely reset their recommended feeds and to access a recommender system that is not based on profiling. Through these and other recommendations, the EC sets out how it expects platforms to develop age-appropriate experiences for minor users and to create safer online environments, going beyond simply age-gating content and services that may pose risks.