Deepfake pornography is a growing form of image-based sexual abuse, increasingly affecting children and young people in schools.
Deepfake pornography is a form of image-based sexual abuse in which artificial intelligence (AI) is used to create or alter sexual images or videos so that they appear to show a real person, without that person’s consent. This can include digitally removing clothing from a photo, animating a still image in a sexual way, or placing someone’s face onto explicit material.
While deepfakes are often discussed in relation to celebrities, growing evidence shows that children and young people are increasingly being targeted – often by their peers. Teachers, safeguarding leads and researchers are warning that this abuse is becoming normalised in school settings, causing serious and lasting harm.
Recent research and reporting show that deepfake pornography is no longer rare or isolated.
In some cases, schools and parents choose not to tell the child who has been targeted because of the stigma and fear of further harm. While this may feel protective, it can also leave victims without support or understanding of what has happened to them.
Evidence consistently shows that the vast majority of sexually explicit deepfakes online are of women and girls. This reflects wider patterns of misogyny, sexual entitlement and gender-based violence.
However, boys and young men can also be targeted, and their experiences must not be dismissed. Any non-consensual sexual image – regardless of gender – is abuse.
Deepfake sexual images can be devastating. Survivors often describe feeling violated, humiliated and powerless. Unlike other forms of online abuse, deepfakes can feel intensely personal because the image looks like you.
Deepfake abuse also draws young people – often boys – into serious criminal behaviour, sometimes without them fully understanding the consequences of their actions.
In the UK, creating or sharing sexual images of someone without their consent may be a criminal offence, especially where the person is under 18. Relevant offences can include making or distributing indecent images of children and the non-consensual sharing of intimate images.
The law is still struggling to keep pace with rapidly developing AI technology, and enforcement can be inconsistent. This uncertainty can leave survivors feeling unprotected and unsure where to turn.
If you are unsure whether a crime has been committed, you can speak to Safeline confidentially for guidance.
If you are supporting a child or young person affected by deepfake abuse, a trauma-informed response can make a significant difference to their recovery.
Experts agree that banning technology alone will not solve the problem. Young people already have access to these tools.
What does help is education. Young people who understand online abuse are more likely to recognise it and report it.
If someone has created or shared a sexual image of you – or someone you know – without consent, you are not to blame.
You may be feeling confused, ashamed or frightened – especially if the person responsible is someone you know. These feelings are common, but you do not have to deal with this alone.
This resource draws on published research, reporting and expert analysis, as well as Safeline’s direct work with survivors of sexual abuse and exploitation.