Can social media become more ‘social’?
A call for social media platforms that respect human rights, are transparent and implement a safety-by-design approach
Media release
Safer Internet Day
Brussels, 10th February 2026
On this Safer Internet Day, COFACE Families Europe is calling on providers of social media platforms to ensure that their platforms are safe and secure and that they respect people’s privacy and human rights. The European Union and its Member States must play a key role in holding these companies to account if they fail to do so. This includes implementing and enforcing strong regulations, such as the Digital Services Act (DSA), but also the Artificial Intelligence Act (AI Act) and the General Data Protection Regulation (GDPR), which are currently at risk of being undermined by the proposed Digital Omnibus Regulation.
_________________________________________________________________________________________
We could question whether social media should still be called ‘social’. In fact, most major platforms could currently be categorised as ‘unsocial’ rather than ‘social’. It is clear that big social media companies are choosing profit over safety, which poses many risks to their users, especially children and young people. Nevertheless, children and young people have the right to participate in the online environment, from which they can benefit in terms of personal development, education, civic participation, access to information, and support services.
During the Safer Internet Forum in December 2025, different stakeholders came together in Brussels (Belgium) to discuss how to ensure age-appropriate online experiences through a proportionate, child-centred approach.1 COFACE Families Europe stated that a safer internet for all is a societal responsibility. Given families’ sometimes challenging and ever-changing circumstances, it is crucial for technological products and services to incorporate default safeguards that protect the rights of children and young people.2 Other stakeholders also emphasised that rather than banning children from using social media, unsafe social media platforms that do not respect children’s rights should be banned instead.
Technologies and platforms that fail to respect children’s rights or provide adequate safety measures should not be on the market. The DSA and its Guidelines on the protection of minors have great potential to hold tech companies accountable and to require that companies implement effective safety measures that go beyond teen accounts and parental control tools. Social media could actually become social if providers of online platforms respected human rights, were more transparent and implemented a safety-by-design approach, instead of amplifying hate through their recommender systems, allowing users to generate child sexual abuse material with their AI tools, and exploiting users’ data for profit.
In 2025, members of the COFACE Families Europe network from 8 countries (Austria, Bulgaria, Finland, Greece, Hungary, Poland, Slovenia, Spain) organised consultations with parents and caregivers to discuss the Better Internet for Kids Strategy (BIK+ Strategy) and its three pillars: Safe digital experiences, Digital empowerment and Active participation.3 Among other things, parents shared that they felt overwhelmed and are ‘constantly playing catch-up’ with the latest technological developments.
Antonia Torrens, COFACE President:
“Parents and caregivers can take many age-appropriate measures themselves, acting as mediators between their children and the online environment. They can accompany their children in their first experiences online, help them to make sense of both the positive and negative experiences, and model healthy online behaviour themselves. However, families are all different, and many have limited time and resources, meaning they can only do so much. They therefore need strong regulations to support them in ensuring a safe and socially beneficial digital environment for their children, so that children and their families can enjoy the opportunities that social media offers.”
The European Commission should fully safeguard children’s rights in the online environment and uphold the protections enshrined in current EU digital regulations, such as the GDPR and the AI Act. Weakening these protections in any way – for example by delaying the implementation of key components of the AI Act or introducing loopholes allowing children’s data to be exploited for training AI systems – would be an unacceptable betrayal of the EU’s commitment to children, parents, and its own values.4
COFACE Families Europe welcomes the EU’s efforts to tackle specific online harms and believes that the EU has a critical opportunity to become a world leader in the protection of children online. Nevertheless, millions of children remain unprotected from sexual exploitation and repeated victimisation on online platforms. The EU must pass robust legislation to tackle child sexual abuse and exploitation across Europe. COFACE stands ready to continue supporting the European Union institutions in better protecting children and young people in the online environment.
ENDS//
Notes to the editor:
- EU Digital Omnibus proposal (2025)
- Open consultation to give feedback on the EU Digital Omnibus proposal (deadline 13 March 2026)
- Safer Internet Forum 2025 – Why age matters: Protecting and empowering youth in the digital age (2025)
For more information, please contact Beatrijs Gelders, Senior Policy and Advocacy Officer bgelders@coface-eu.org.