What is deepfake pornography?

Deepfake pornography is a form of image-based sexual abuse in which artificial intelligence (AI) is used to create or alter sexual images or videos so that they appear to show a real person without their consent. This can include digitally removing clothing, animating a still image in a sexual way, or placing someone’s face onto explicit material.

While deepfakes are often discussed in relation to celebrities, growing evidence shows that children and young people are increasingly being targeted – often by their peers. Teachers, safeguarding leads and researchers warn that this abuse is becoming normalised in school settings, causing serious and lasting harm.

What is happening in schools?

Recent research and reporting show that deepfake pornography is no longer rare or isolated.

  • Teachers across England report pupils using “nudify” apps to create fake sexual images of classmates.
  • Most incidents involve girls aged 14 or under, with cases reported involving children as young as 11.
  • Similar incidents have been documented in Spain, Australia and the United States.
  • Surveys of young people show that many have already seen sexually explicit deepfake images of friends, teachers, celebrities or themselves.

In some cases, schools and parents choose not to tell the child who has been targeted because of the stigma and fear of further harm. While this may feel protective, it can also leave victims without support or understanding of what has happened to them.

Gender, power and consent

Evidence consistently shows that the vast majority of sexually explicit deepfakes online are of women and girls. This reflects wider patterns of misogyny, sexual entitlement and gender-based violence.

Researchers and educators warn that:

  • Deepfake tools are being trivialised as jokes, filters or “pranks”.
  • Social media platforms normalise harmful uses of AI.
  • These practices undermine consent and personal boundaries.

However, boys and young men can also be targeted, and their experiences must not be dismissed. Any non-consensual sexual image – regardless of gender – is abuse.

Why deepfake pornography causes such serious harm

Deepfake sexual images can be devastating. Survivors often describe feeling violated, humiliated and powerless. Unlike other forms of online abuse, deepfakes can feel intensely personal because the image looks like you.

Research and survivor testimony highlight serious and wide-ranging impacts. For children and young people, the harm can be compounded by:

  • Peer pressure and gossip within schools
  • Limited understanding of consent and the law
  • Inconsistent responses from schools and authorities
  • A lack of specialist support

Deepfake abuse also draws young people – often boys – into serious criminal behaviour, sometimes without them fully understanding the consequences of their actions.

What does the law say?

In the UK, creating or sharing sexual images of someone without their consent may be a criminal offence, especially where the person is under 18. This can include offences relating to:

  • Image-based sexual abuse
  • Harassment and stalking
  • Child sexual abuse material (if a child is depicted)

The law is still struggling to keep pace with rapidly developing AI technology, and enforcement can be inconsistent. This uncertainty can leave survivors feeling unprotected and unsure where to turn.

If you are unsure whether a crime has been committed, you can speak to Safeline confidentially for guidance.

Support for parents, carers and professionals

If you are supporting a child or young person affected by deepfake abuse:

  • Take their feelings seriously
  • Avoid minimising or dismissing what has happened
  • Reassure them that they are not at fault
  • Seek specialist support early

A trauma-informed response can make a significant difference to recovery.

Experts agree that banning technology alone will not solve the problem. Young people already have access to these tools.

What does help:

  • Clear education about consent, respect and sexual ethics.
  • Media literacy and critical understanding of AI.
  • Honest conversations about harm and accountability.
  • Adults feeling confident to address these issues directly.

Evidence shows that education works: young people who understand online abuse are more likely to recognise it and report it.

References

This resource draws on published research, reporting and expert analysis, together with Safeline’s direct work with survivors of sexual abuse and exploitation.
