Internet Regulation Updater
The European Commission (EC) has published draft guidelines outlining measures it expects online platforms to implement to protect minors online. Platforms have until 10 June 2025 to comment on the proposals, and have an opportunity to start planning now for how to incorporate the expected measures before they come into effect in summer 2025.
Background
The EU Digital Services Act (DSA) regulates online intermediaries and platforms operating in the EU and aims to increase transparency, combat illegal content and protect users' rights. Article 28 of the DSA requires online platforms accessible to minors to put in place appropriate and proportionate measures to ensure a high level of privacy, safety and security of minors on their service, and empowers the EC to issue guidelines to assist platforms in complying.
On 13 May 2025, the EC published its draft guidelines on the measures it expects online platforms to put in place to comply with Article 28. Although adopting and implementing the measures will not automatically guarantee compliance with Article 28, the EC has indicated that they will be considered a significant and meaningful benchmark when determining the compliance of online platforms accessible to minors. However, the measures set out in the guidelines are non-exhaustive: the EC may deem it appropriate and proportionate for an online platform to have additional measures in place.
The consultation period on the draft guidelines runs until 10 June 2025, and the EC is seeking feedback from all relevant stakeholders, including children, parents and guardians, national authorities, platform providers and experts. Publication of the final guidelines is expected by the summer of 2025.
We provide below a high-level summary of the EC's draft guidelines and some practical considerations for compliance teams.
A key takeaway is that the EC has delivered a comprehensive set of guidelines with a clear focus on under 18s as a distinct user base requiring age-appropriate safety-by-design and specialised risk management, including a dedicated compliance officer or team responsible for children's safety, privacy and security. This mirrors the focus and approach in other global online safety regulations, such as the UK Online Safety Act.
Summary of key guidelines (non-exhaustive)
General principles
Any measure that an online platform accessible to minors puts in place to comply with Article 28(1) should adhere to the following general principles:
- Proportionality
- Children's rights, as enshrined in the EU Charter of Fundamental Rights and the UN Convention on the Rights of the Child
- Privacy-, safety- and security-by-design
- Age-appropriate design
Measures
Below is a summary of the main measures that the EC considers online platforms should put in place to ensure a high level of privacy, safety and security.
The measures fall into four broad categories: (1) governance; (2) risk review; (3) service design; and (4) reporting, user support and tools for guardians.
Governance
In terms of governance, platforms should:
- Assign responsibility for ensuring a high level of minors' privacy, safety and security to a dedicated person or team. This person or team should have sufficient resources and sufficient authority, including direct access to the senior management body of the provider of the online platform, and should also act as a central point of contact for regulators and users.
- Provide relevant training and information to those responsible for minors' privacy, safety and security, to developers, to those in charge of moderation and to those who receive reports or complaints from minors.
- Include information specifically relevant to minors in their terms and conditions and in an easily accessible interface on the platform.
- Ensure regular compliance monitoring.
Risk review
In determining which measures are appropriate and proportionate, platforms should:
- Assess the likelihood that minors use the service.
- Identify the risks to minors’ privacy, safety and security using the 5C framework (Content, Conduct, Contact, Consumer and Cross-cutting).
- Consider how platform features contribute to risks.
- Document current and planned mitigations (an illustrative record structure is sketched after this list).
- Evaluate potential negative impacts on children's rights (e.g. freedom of expression).
- Review risks regularly and after significant changes to the platform.
- Very large online platforms (VLOPs) and very large online search engines (VLOSEs) should integrate this into the systemic risk assessments required under Article 34 DSA.
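For compliance teams documenting this review, the sketch below illustrates how a single risk record might be structured around the 5C categories. It is a minimal illustration only: the field names and review fields are assumptions, not something the draft guidelines prescribe.

```python
from dataclasses import dataclass
from datetime import date
from enum import Enum


class RiskCategory(Enum):
    """The 5C risk typology referenced in the draft guidelines."""
    CONTENT = "content"
    CONDUCT = "conduct"
    CONTACT = "contact"
    CONSUMER = "consumer"
    CROSS_CUTTING = "cross-cutting"


@dataclass
class MinorRiskRecord:
    """Illustrative risk register entry; field names are assumptions for illustration."""
    category: RiskCategory
    description: str                    # e.g. "unsolicited contact from adults via direct messages"
    contributing_features: list[str]    # platform features that contribute to the risk
    current_mitigations: list[str]
    planned_mitigations: list[str]
    rights_impacts: list[str]           # e.g. potential impact on freedom of expression
    last_reviewed: date
    next_review_due: date               # reviewed regularly and after significant platform changes
```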
Service design
1. Types of age assurance:
- Self-declaration: Not considered robust or accurate (discouraged).
- Age estimation: Estimates a user's age using algorithmic analysis (moderate accuracy).
- Age verification: Based on official IDs or trusted digital credentials (high accuracy).
- The draft guidelines explain when each type of age assurance can be used in different use cases.
- Platform providers should:
- Justify their choice of age assurance type.
- Offer at least two methods if age assurance is required and assess the effectiveness of specific age verification or estimation methods.
- Ensure that, where other age verification methods are used, they provide a level of verification equivalent to that of the EU age verification app (expected in summer 2025).
- Provide a redress mechanism for users to contest wrong age assessments.
- Not process more data than necessary (data minimisation principle). A minimal sketch illustrating these expectations follows below.
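To make the expectations around method choice, fallback options, redress and data minimisation more concrete, here is a minimal sketch of how a platform might model its age assurance options. The method list, the `AgeSignal` structure and the redress hook are illustrative assumptions; the guidelines do not prescribe an implementation, and the EU age verification app referenced above is not yet available.

```python
from dataclasses import dataclass
from enum import Enum


class AgeAssuranceMethod(Enum):
    SELF_DECLARATION = "self_declaration"   # discouraged: not robust or accurate
    AGE_ESTIMATION = "age_estimation"       # algorithmic, moderate accuracy
    AGE_VERIFICATION = "age_verification"   # official ID / trusted credential, high accuracy


@dataclass
class AgeSignal:
    """Outcome of an age assurance check, storing only what is needed
    (data minimisation): no raw documents or biometric data are kept."""
    method: AgeAssuranceMethod
    is_over_minimum_age: bool
    checked_at: str  # ISO 8601 timestamp


# Illustrative policy: where age assurance is required, offer at least two methods.
OFFERED_METHODS = [
    AgeAssuranceMethod.AGE_VERIFICATION,
    AgeAssuranceMethod.AGE_ESTIMATION,
]


def request_redress(signal: AgeSignal, user_explanation: str) -> None:
    """Hypothetical hook for the redress mechanism: lets a user contest an
    incorrect age assessment and routes the case for human review."""
    ...
```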
2. Registration: Where registration is required or offered to access a platform, the provider should:
- Explain to users the rationale for registration.
- Ensure registration is easy for all minors to access and navigate.
- Include measures as part of the registration process to help users understand whether they are allowed to use the service, and measures to reduce attempts by underage users to register.
- Avoid encouraging or enticing users who are below the platform's minimum age to create accounts.
- Ensure that it is easy for minors to log out and to have their profile deleted at their request.
- Use the registration process to highlight the safety features of the platform or service, any identified risks to a minor’s privacy, safety or security and resources available to support users.
3. Account settings:
- Default settings should maximise safety, including ensuring:
- Private profiles, blocked location/mic/camera, limited interactions.
- Push notifications off during sleep hours.
- Filters that affect body image or promote compulsive use are turned off.
- Children are not nudged to lower their settings.
- Platforms may also remove settings, features and functionalities from minors' accounts altogether (an illustrative configuration sketch of the default settings follows below).
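As an illustration of what safety-maximising defaults could look like in configuration terms, the sketch below encodes the defaults listed above for a minor's account. The setting names, the assumed quiet-hours window and the example compulsive-use features are illustrative assumptions only.

```python
from dataclasses import dataclass
from datetime import time


@dataclass
class MinorAccountDefaults:
    """Illustrative default settings for a minor's account, mirroring the
    expectations in the draft guidelines. Names and values are assumptions."""
    profile_private: bool = True
    location_sharing_enabled: bool = False
    microphone_access_enabled: bool = False
    camera_access_enabled: bool = False
    interactions_limited: bool = True
    push_notifications_quiet_start: time = time(22, 0)  # assumed sleep hours
    push_notifications_quiet_end: time = time(7, 0)
    appearance_altering_filters_enabled: bool = False   # filters affecting body image off
    compulsive_use_features_enabled: bool = False       # e.g. streaks, autoplay (assumed examples)


def may_prompt_to_lower_setting(is_minor: bool) -> bool:
    """Minors should not be nudged to weaken their default protections."""
    return not is_minor
```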
4. Recommender systems should:
- Take into account specific needs, characteristics, disabilities and additional accessibility needs of minors when defining the objectives, parameters and evaluation strategies of recommender systems.
- Promote minors’ access to information that is relevant and adequate for them, with due consideration to their age group.
- Prevent a minor's repeated exposure to harmful content (e.g. content promoting unrealistic beauty standards or dieting, content that glorifies or trivialises mental health issues such as anxiety or depression, or discriminatory content).
- Use explicit user preferences (e.g. surveys) over engagement-based data (see the sketch after this list).
- Prioritise accounts whose identity has been verified.
- Not use suggested terms and key phrases which recommend content that qualifies as harmful to the privacy, safety or security of minors.
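A heavily simplified sketch of how these expectations might translate into a ranking step is below: explicit, user-stated preferences outweigh engagement signals, content flagged as harmful to minors is excluded, repeated exposure to the same themes is dampened, and verified accounts are given a modest boost. All field names, weights and thresholds are hypothetical assumptions, not values taken from the guidelines.

```python
from dataclasses import dataclass


@dataclass
class CandidateItem:
    item_id: str
    topic: str
    flagged_harmful_to_minors: bool     # e.g. extreme dieting or self-harm glorification
    engagement_score: float             # predicted engagement (watch time, clicks)
    matches_stated_preferences: bool    # from surveys / explicit user choices
    author_identity_verified: bool


def rank_for_minor(candidates: list[CandidateItem],
                   recent_topic_counts: dict[str, int],
                   repeat_threshold: int = 3) -> list[CandidateItem]:
    """Illustrative ranking for a minor's feed (assumed weights, not prescribed)."""
    eligible = [c for c in candidates if not c.flagged_harmful_to_minors]

    def score(c: CandidateItem) -> float:
        s = 0.0
        s += 2.0 if c.matches_stated_preferences else 0.0  # explicit preferences first
        s += 0.5 if c.author_identity_verified else 0.0    # prioritise verified accounts
        s += 0.2 * c.engagement_score                      # engagement weighted lightly
        if recent_topic_counts.get(c.topic, 0) >= repeat_threshold:
            s -= 1.0                                       # dampen repeated exposure
        return s

    return sorted(eligible, key=score, reverse=True)
```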
5. Commercial practices: Platforms should:
- Have a responsible marketing and advertising policy in place that does not allow harmful, unethical or unlawful advertising to, for or by minors. This includes considering the appropriateness of advertising campaigns for different age groups.
- Ensure that ads are clearly marked and age-appropriate, and do not exploit children's lack of commercial literacy.
- Ensure that minors are not exposed to hidden or disguised advertising.
- Be transparent about economic transactions in an age-appropriate way and avoid the use of intermediate virtual currencies.
- Ensure that minors are not exposed to practices that can lead to excessive or unwanted spending or addictive behaviours, for example by not using virtual items such as loot boxes or gambling-like practices.
6. Moderation tools should:
- Define what constitutes harmful content for minors (e.g. grooming, self-harm, exploitation).
- Use AI safeguards to block the generation of harmful content from prompts.
- Consider human review for content that substantially exceeds the average number of views and for any reported accounts that the provider suspects may pose a risk of harm to minors’ privacy, safety or security.
- Take into account the following factors when prioritising moderation (a simple scoring sketch follows this list):
- the likelihood of the content causing harm to a minor’s privacy, safety and/or security;
- the impact of the harm on that minor; and
- the number of minors who may be harmed.
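One way to operationalise these prioritisation factors is a composite severity score, as in the sketch below. The scoring scale, weighting and the multiplier for reports submitted by minors are illustrative assumptions; the guidelines list the factors but do not prescribe a formula.

```python
import math
from dataclasses import dataclass


@dataclass
class ModerationReport:
    content_id: str
    likelihood_of_harm: float       # 0.0-1.0: likelihood of harm to a minor's privacy, safety or security
    impact_of_harm: float           # 0.0-1.0: severity of the impact on that minor
    minors_potentially_affected: int
    reported_by_minor: bool


def priority_score(report: ModerationReport) -> float:
    """Illustrative composite score; higher scores are reviewed first."""
    base = report.likelihood_of_harm * report.impact_of_harm
    reach = math.log1p(report.minors_potentially_affected)  # diminishing weight for wider reach
    score = base * (1.0 + reach)
    if report.reported_by_minor:
        score *= 1.5  # reports submitted by minors are prioritised (multiplier is an assumption)
    return score


# Example: order a review queue so the highest-risk reports surface first.
reports = [
    ModerationReport("a1", likelihood_of_harm=0.9, impact_of_harm=0.8,
                     minors_potentially_affected=12, reported_by_minor=True),
    ModerationReport("b2", likelihood_of_harm=0.3, impact_of_harm=0.4,
                     minors_potentially_affected=1500, reported_by_minor=False),
]
queue = sorted(reports, key=priority_score, reverse=True)
```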
Reporting, user support and tools for guardians
- User reporting, feedback, complaints, and user support should provide for:
- Prioritisation of reports and complaints submitted by minors, with each minor who has submitted a report or complaint receiving a reasoned decision without undue delay, in a way that is adapted to their age.
- Easy feedback options (e.g. "I don't like this", "Show me less/more", "I don't want to see this / I am not interested in this", "I don't want to see content from this account", "This makes me feel uncomfortable", "Hide this" or "This is not for me").
- Availability of complaint mechanisms to non-registered users where they can access the online platform's content.
- Technical measures to warn minors that they are interacting with an AI system (if AI features and systems such as AI chatbots and filters are integrated into the service of an online platform).
- Tools for guardians should:
- Be treated as complementary to (and not replacements for) the platform's own tools and measures.
- Provide a clear notification to minors when the tools are activated by guardians, and put other safeguards in place given their potential misuse by guardians.
- Ensure changes can only be made with the same degree of authorisation as was required for the initial activation of the tools.
- Be compatible with interoperable one-stop-shop tools for guardians that gather all settings and tools in one place.
Practical considerations
Online platforms have an opportunity to provide feedback to the EC on the draft guidelines and are encouraged to do so.
Although the guidelines will not be in force until finalised, their publication in draft provides online platforms with advance notice of the EC's expectations of them when it comes to ensuring minors' safety, privacy and security. Online platforms have an immediate opportunity to:
- Understand how they are currently meeting the requirements of Article 28 DSA as a starting point.
- Analyse the draft guidelines and map them against existing policies and processes (which may be in place to comply with Article 28 DSA and other laws and regulations focused on online child safety, such as the US Children's Online Privacy Protection Act, the UK Online Safety Act (including Ofcom's Age Assurance Guidelines) and the ICO's Age Appropriate Design Code).
- Carry out a gap analysis to identify gaps between current policies and processes and the expectations set out in the guidelines.
- Plan how to incorporate and align the expectations set out in the guidelines with current policies and processes to maximise consistency and efficiency, particularly around training, board reporting, compliance monitoring, identifying a responsible person, and carrying out risk assessments. This will look different for different platforms. For example, VLOPs and VLOSEs are already subject to enhanced obligations under the DSA, which include requirements to carry out risk assessments. Other online platforms are not subject to that requirement but may, if in scope of the UK Online Safety Act, currently be carrying out their first Children's Risk Assessment, which must be completed by 24 July 2025.
Your contacts
Deloitte continues to support online platforms, including several providers of very large online platforms and search engines, with DSA implementation and ongoing compliance. For more information, please get in touch.
If you would like to speak to the Deloitte team please contact:
Joey Conway, Internet Regulation Partner, Legal Lead
Nick Seeber, Global Internet Regulation Lead Partner
Hilary Atherton, Internet Regulation Director, Legal
Lenka Molins, Internet Regulation Associate Director, Risk and Regulation
Brij Sharma, Internet Regulation Associate Director, Risk and Regulation