On Safer Internet Day, COFACE Families Europe has signed two open letters calling for robust legislation to tackle child sexual abuse and to prohibit nudifying tools.
The first open letter, signed by 109 child rights organisations and partners, calls on the European Union (EU) to take decisive action to protect children online and to pass robust legislation to tackle child sexual abuse and exploitation across Europe.
EU negotiators are currently shaping the Child Sexual Abuse Regulation. The EU has a critical opportunity to be a world leader in protecting children online, but continued inaction and lack of ambition mean Europe persists as a safe haven for perpetrators. Without strong laws, Europe will leave millions of children unprotected on online platforms and allow imagery of their abuse to circulate indefinitely online.
For more than three years, EU leaders have debated the Child Sexual Abuse Regulation (CSAR) – a law designed to protect children from online sexual abuse. In that time, Europe’s child sexual abuse crisis has escalated dramatically: reports of grooming, sexual extortion and AI-generated child abuse material have risen exponentially across Europe.
We therefore call for a prompt, ambitious Regulation that protects children from today's and tomorrow's threats. It must mandate reporting to the EU Centre and require the prompt removal of child sexual abuse material (CSAM) once notified.
Read the full open letter here: Open Letter: Protect Children’s Online Safety in Europe.
The second open letter, endorsed by 107 organisations, institutions and individual experts, calls upon governments and legislators to urgently enact and enforce regulation, at the latest within the next two years, to prohibit nudifying tools and ensure they are universally inaccessible.
Nudifying tools use Artificial Intelligence (AI) to generate nude images from clothed photos. Though mainly marketed for adults, these tools are often misused to create non-consensual nudified images of people and are increasingly being used to produce illegal images of children. The companies, developers, and individuals who create or distribute them must be held accountable and face legal and criminal consequences.
Deepfakes are now closely linked to sexual coercion, extortion, and blackmail. The alarming ease of these AI tools means abusers no longer need to obtain photographic intimate images; they can create them artificially, at scale, with terrifying efficiency. Offenders, including young people themselves, are already monetising these images, creating new abusive economies.
Signatories call on institutions, companies, and citizens to take immediate action towards eliminating the use of nudifying tools by committing to the following:
- Recognise that nudifying tools inflict irreparable harm on individuals, enabling the indefensible abuse and exploitation of women and children in particular.
- Acknowledge the profound societal damage this functionality causes, including the normalisation of explicit imagery and gender-based violence, as well as the erosion of trust and safety.
- Demand accountability and innovation, requiring technology companies to implement safety by design, fast-track the development and delivery of effective protections, and provide clear transparency.
Read the full open letter here: Unifying Voices Worldwide: No to Nudify.