New report estimates half a million UK teenagers have encountered AI-generated nude deepfakes

New report by Internet Matters urges the new Government to crack down on AI-generated sexual imagery ‘epidemic’ after survey reveals 13% of teenagers have had an experience with nude deepfakes.

Summary

  • The possibility of uncontrolled nude deepfake abuse has sown fear into many children’s lives; over half of teenagers (55%) believe that it would be worse to have a deepfake nude created and shared of them than a real image.
  • Internet Matters calls for a ban on ‘nudifying’ tools, as the report warns it is simply too easy to create nude deepfakes online.
  • Recommended reforms to the school curriculum should include teaching children how to identify deepfakes and to use AI tech responsibly.

Children's experiences of nude deep fakes

Internet Matters, Britain’s leading not-for-profit supporting children’s online safety, is today publishing a new report, “The new face of digital abuse: Children’s experiences of nude deep fakes”.

Tackling generative AI fake nudes

The report estimates that four children in every class of thirty have had some experience of ‘nudified deepfakes’ and calls on the Government to introduce new legislation to tackle the growth of generative AI (GenAI) fake nudes of young people, particularly girls, by banning so-called ‘nudifying’ apps.

The report explains the ways in which GenAI has made it quick, simple and cheap to produce non-consensual explicit imagery of ordinary people – including children. It argues that current legislation is not keeping pace; the AI models used to generate sexual imagery of children are currently not illegal in the UK, despite possession of a deepfake sexual image of a child being a criminal offence.

While deepfake technology can be used for positive purposes, such as education or training, evidence suggests that the majority of deepfakes are created to harm, including through sexual abuse, financially motivated sexual exploitation, mis- and disinformation, scams or fraud. Once a phenomenon largely aimed at female celebrities, the creation of fake nude images is something that is now more widely accessible and impacting children, with girls most likely to be the target.

Nudifying tools often used to create deepfakes of women and girls

Today’s Internet Matters report highlights how widely available and easy to use ‘nudifying’ tools have become online. An estimated 99% of nude deepfakes feature girls and women, and ‘nudifying’ models often don’t work on images of boys and men. AI-generated sexual images featuring children have been used to facilitate child-on-child sexual abuse, adult-perpetrated sexual abuse and sextortion, and can impact victims profoundly, leading to the onset of anxiety, depression and suicidal thoughts.

Teenage boys have used ‘nudification’ apps to generate sexually explicit images of their female classmates, often sharing these images on group chats and social media. Girls told Internet Matters they would feel horrified and ashamed and fear that a teacher or parent could think they were genuine.

Today’s report details an Internet Matters nationally representative survey of 2,000 parents of children aged 3-17 and 1,000 children aged 9-17 in the UK, conducted by Opinium in June 2024. The survey shows:

  • A significant number of children have some kind of experience with a nude deepfake. Overall, 13% of children have had an experience with a nude deepfake (including sending or receiving one, encountering a nude deepfake online, using a nudifying app or someone they know having used a nudifying app). This means around half a million (529,632) teenagers in the UK, or 4 teenagers in a class of 30, have had an experience with a nude deepfake.
  • Teenage boys (18%) are twice as likely as teenage girls (9%) to report an experience with a nude deepfake. However, boys are more likely to be creators of deepfake nudes, and girls are more likely to be the victims.
  • Boys and vulnerable children are more likely to have engaged with a nude deepfake. 10% of boys aged 13-17 have come across a nude deepfake online, compared to 2% of girls the same age. A quarter of vulnerable children have had an experience with a nude deepfake, compared to 11% of non-vulnerable children.
  • Teenagers see nude deepfake abuse as worse than image abuse featuring real pictures. Over half of teenagers (55%) believe that it would be worse to have a deepfake nude created and shared of them than a real image.
  • Most families have little to no understanding of deepfakes, with almost two thirds of children (61%) and almost half of parents (45%) saying that they don’t know or understand the term ‘deepfake’.
  • Families want Government and tech companies to do more to tackle nude deepfakes. 84% of teenagers and 80% of parents believe nudifying tools should be banned for everyone in the UK, including adults.
  • Families also agree that more education is needed on the topic of deepfakes. Only 11% of teenagers have been taught about deepfakes in school, and just 6% about nude deepfakes. The overwhelming majority of teenagers (92%) and parents (88%) feel that children should be taught about the risks of deepfakes in school.

The report argues for both new legislation and industry action to protect children from deepfake sexual abuse and argues that parents and schools cannot be expected to protect children alone. It calls for action, including:

  • The Government to ban nudifying tools as a priority in this Parliament and strengthen the Online Safety Act with codes of practice on gendered harm and child-on-child abuse.
  • Tech firms to take firmer action by removing access to nudifying tools on search engines and app stores.
  • The national curriculum to be updated to integrate teaching on critical media literacy.

To coincide with and support today’s report, Internet Matters is launching new resources including expert advice for parents on protecting children from deepfakes.

Carolyn Bunting MBE, co-CEO of Internet Matters, said:

“AI has made it possible to produce highly realistic deepfakes of children with the click of a few buttons. Nude deepfakes are a profound invasion of bodily autonomy and dignity, and their impact can be life shattering. With nudifying tools largely focussed on females, they are having a disproportionate impact on girls.

“Children have told us about the fear they have that this could happen to them without their knowledge and by people they don’t know. They see deepfake image abuse as a potentially greater violation because it is beyond their control.

“Deepfake image abuse can happen to anybody, at any time. Parents should not be left alone to deal with this concerning issue. It is time for Government and industry to take action to prevent it by cracking down on the companies that produce and promote these tools that are used to abuse children.”

Minister for Safeguarding and Violence Against Women and Girls, Jess Phillips MP, said:

“This government welcomes the work of Internet Matters, which has provided an important insight into how emerging technologies are being misused.

“The misuse of AI technologies to create child sexual abuse material is an increasingly concerning trend. Online deepfake abuse disproportionately impacts women and girls, which is why we will work with Internet Matters and other partners to address this as part of our mission to halve violence against women and girls over the next decade.

“Technology companies, including those developing nudifying apps, have a responsibility to ensure their products cannot be misused to create child sexual abuse content or non-consensual deepfake content.”
