In recent years, and particularly over the last couple of months, the protection of minors online has become a priority for the EU and its Member States. On 26 November, the European Parliament adopted a non-legislative report on the protection of minors online. The report calls for a minimum EU-wide age of 16 to access social media platforms, as well as bans on the most harmful and addictive practices. According to the Parliament’s press release: “To help parents manage their children’s digital presence and ensure age-appropriate online engagement, Parliament proposes a harmonised EU digital minimum age of 16 for access to social media, video-sharing platforms and AI companions, while allowing 13- to 16-year-olds access with parental consent.”
In her State of the Union speech in September, President von der Leyen said that she is looking closely at actions taken in other regions of the world, such as Australia, and that she would appoint a panel of experts to advise her on the most appropriate approach for Europe. Since 10 December, age-restricted social media platforms in Australia have been required to take reasonable steps to prevent users under the age of 16 from creating or maintaining an account. COFACE will be monitoring the impact of this policy and is calling for an evidence-based approach in the EU which holds tech companies accountable.
On 4 December, COFACE attended the Safer Internet Forum in Brussels, a key annual international conference where stakeholders such as policymakers, young people, parents and carers, teachers, NGOs and industry representatives come together to discuss the latest trends, opportunities, risks and solutions related to child online safety. This year, the Forum discussed how to ensure age-appropriate online experiences through a proportionate and children's rights-centric approach. COFACE Director, Elizabeth Gosme, spoke at one of the panel sessions, stating that a safer internet for all is a societal responsibility: strong laws are needed to regulate the online world and ensure safety, and the tech industry must take responsibility for making its products safe. Technologies and platforms that do not respect children's rights or provide adequate safety measures should not be on the market. She also stressed that, given the often challenging and ever-changing circumstances of families, it is crucial that technological products incorporate default safeguards to protect the rights of children and young people.
The aim should be to create a digital environment in which children can participate safely and benefit from the opportunities it brings. Strong regulatory frameworks which demand safety measures from tech companies play a key role, complemented by education measures and continued dialogue between parents, carers, children and other relevant stakeholders. The European Commission has initiated the first investigative actions under the Guidelines on the Protection of Minors under the Digital Services Act: it requested Snapchat, YouTube, Apple and Google to provide information on their age verification systems, as well as on how they prevent minors from accessing illegal products, including drugs or vapes, or harmful material, such as content promoting eating disorders. Additionally, the EU is stepping up its efforts to fight child sexual abuse. On 26 November, EU member states' representatives agreed on the Council position on a regulation to prevent and combat child sexual abuse. Once adopted, the new law will oblige tech companies to prevent the dissemination of child sexual abuse material and the solicitation of children.
Further information
- COFACE policy brief – Supporting Families in the Digital Era: How to ensure safe and enriching online experiences for children and their Families? (2024).
- COFACE media release – COFACE launches its revised Digitalisation Principles: Creating a better Internet and web for children and their families (2025).
- COFACE article – State of the European Union 2025: President von der Leyen is listening to parents (2025).
- Press release, European Parliament – Children should be at least 16 to access social media, say MEPs (2025).
- Website, Australian eSafety Commissioner – Social media age restrictions (2025).
- Website, Better Internet for Kids – Safer Internet Forum (2025).
- Press release, European Commission – Commission scrutinises safeguards for minors on Snapchat, YouTube, Apple App Store and Google Play under the Digital Services Act (2025).
- Press release, Council of the EU – Child sexual abuse: Council reaches position on law protecting children from online abuse (2025).
For more information, please contact Beatrijs Gelders, Policy and Advocacy Officer on safer internet and digital citizenship: Bgelders@coface-eu.org.