
Last Updated on 28th April 2026

AI tools that generate fake explicit images of real people are now widely accessible, and children, young people, and school staff are being targeted. Here is what you need to know and what you can do about it.

1 in 5

reports of nude imagery of young people now involve faked, edited or AI-altered content

99%

of nude deepfakes online feature women or girls, and 98% are sexual

Seconds

is all it takes to generate a convincing fake image using a photo taken from social media

Understanding nudification platforms

Nudification platforms, sometimes called deepnude tools, are websites or apps that use artificial intelligence to digitally remove clothing from photographs of real people, creating fake nude, semi-nude or sexualised images without their knowledge or consent. They require very little technical skill or editing expertise: a user simply uploads a photo or video, and the AI generates a false but convincing image at the click of a button. Alternatively, to distance themselves from accountability or to circumvent a platform’s rules, some people send nudification requests into group chats or forums for others to generate the desired image.

Crucially, these tools cannot see through clothes or remove actual clothing. Instead, they generate an entirely artificial body. Despite this, the results can appear highly realistic and believable. Innocent, everyday photos from private messages or social media have been manipulated in this way.

IMPORTANT TO UNDERSTAND

Although nudification tools violate most app store policies, they continue to appear in mainstream stores and online searches. Age checks are often limited to users’ self-declaration, and some websites require no account at all. Children can encounter these tools through ads or social media, without actively searching for them. Between November 2025 and January 2026, Meta removed over 344,000 ads across Facebook and Instagram that attempted to promote nudification apps.

Children, young people and school staff

Anyone can be affected, but the evidence shows a clear pattern: girls and young women are disproportionately targeted. Research from Internet Matters indicates that the overwhelming majority of nude deepfakes online feature women or girls, and nearly all are non-consensual.

In school communities, nudification has been linked to bullying, harassment and child-on-child abuse. Staff are also at risk; incidents have caused anxiety, reputational harm and some teachers have left their jobs after such images have circulated. Schools have reported cases where photos taken from school websites, social media or staff directories have been used to target both pupils and staff.

Beyond the classroom, fabricated images can be used to coerce, pressure or extort young people. Because they appear real, they can silence or manipulate victims, causing fear, shame and lasting emotional distress even when the content is entirely artificial.

Where the UK stands legally

The legal framework is strengthening rapidly. In December 2025, the UK Government announced plans to ban the creation and supply of nudification tools, making it illegal for companies to develop or profit from them, as part of the Violence Against Women and Girls (VAWG) strategy.

This builds on existing protections under the Online Safety Act 2023, which made sharing, or threatening to share, non-consensual intimate images a criminal offence. The Data (Use and Access) Act 2025 went further, criminalising the creation of such images, as well as requesting their creation. This offence was brought into force in January 2026 and designated a priority offence under the Online Safety Act, meaning platforms now have a legal duty to proactively prevent and remove this content – and the non-consensual use of nudification tools is itself a crime.

WHERE CHILDREN ARE INVOLVED

Any sexualised image of a person under 18 is classified as Child Sexual Abuse Material (CSAM), including AI-generated or digitally altered content. Creating, possessing or sharing such imagery is illegal.

The real and lasting harm

The impact on victims can be severe and long-lasting. Being the subject of a fake explicit image, even one that is known to be fabricated, causes significant psychological distress, including anxiety, depression, and in some cases self-harm. Victims often describe feelings of humiliation, powerlessness and profound violation.

“Nudification harms victims, their families and also the young people who create and share these images. Prevention, early intervention and clear guidance are crucial.”

Jim Gamble, CEO, INEQE Safeguarding Group

Starting the conversation at home

You do not need to be a tech expert. Calm, open conversations make the most difference, and they signal to young people that they can come to you without fear of judgement.

TRY THESE WITH YOUR CHILD OR YOUNG PERSON

  • “Have you heard of apps or websites that use AI to change photos of people? What do you know about them?”
  • “If someone showed you an image that looked real but might not be, what would you do?”
  • “If anyone ever used a photo of you in a way that made you uncomfortable, I want you to know you can come straight to me – no judgement, no blame.”
  • “Do you know what to do if you see something like that happening to someone else?”

What parents, carers and professionals can do

PARENTS AND CARERS

  • Review privacy settings on your child’s social media and restrict who can view and save their photos.
  • Talk openly about consent, respect and the impact of online actions on real people.
  • Reassure victims: the image is fabricated; it does not reflect their real body or identity.
  • If your child has been involved in creating such images, respond calmly; they need guidance alongside consequences.
  • Use SafeSearch and device-level filtering to reduce exposure to harmful tools.

TEACHING AND SAFEGUARDING PROFESSIONALS

  • Ensure your safeguarding policy explicitly addresses AI-generated imagery and image-based abuse.
  • Review the public visibility of pupil and staff images on your website and social media. Ensure consent is time-bound, requires dual consent where necessary, and clearly outlines how and where images will be used.
  • Deliver age-appropriate education on consent, digital rights, and the consequences of image-based abuse.
  • Follow your organisation’s safeguarding and child protection procedures if a report is made, treating it as seriously as any other form of image-based abuse.
  • Support both the victim and, where appropriate, the young person who created the image.
  • Make Report Remove well known in your school through posters and regular discussions about how it helps remove nude images of under-18s.

REPORT REMOVE | A KEY TOOL FOR UNDER-18S

  • If a young person discovers an image of them has been altered, Report Remove can help. Created by Childline and the Internet Watch Foundation, it allows under-18s to request removal of sexual or AI-altered images of themselves from the internet.
  • Visit childline.org.uk/report-remove or call Childline on 0800 1111

Where to go for help

Internet Watch Foundation and Childline

Report non-consensual intimate images for removal

INEQE School Guide AI Image Exploitation

Detailed guidance on preventing and responding to incidents

CEOP

Report child sexual exploitation and online grooming

999 (immediate risk) · 101 (non-emergency)

Report to police where a crime has been committed

FINAL THOUGHT

Technology moves fast, but the values we teach children about respect, empathy and consent are timeless. Talking to young people about nudification tools is not about frightening them; it is about equipping them to make better choices, to look out for one another, and to know that support is there if something goes wrong.


This article was produced by INEQE Safeguarding Group, incorporating Safer Schools. For more safeguarding resources, training, and school support, visit ineqe.com or our Online Safety Centre: oursafetycentre.co.uk

