Carolyn Bunting MBE, co-CEO of Internet Matters, said:
“AI has made it possible to produce highly realistic deepfakes of children with the click of a few buttons. Nude deepfakes are a profound invasion of bodily autonomy and dignity, and their impact can be life shattering. With nudifying tools largely focussed on females, they are having a disproportionate impact on girls.
“Children have told us about the fear they have that this could happen to them without their knowledge, and by people they don’t know. They see deepfake image abuse as a potentially greater violation because it is beyond their control.
“Deepfake image abuse can happen to anybody, at any time. Parents should not be left alone to deal with this concerning issue. It is time for Government and industry to take action to prevent it by cracking down on the companies that produce and promote these tools that are used to abuse children.”
Minister for Safeguarding and Violence Against Women and Girls, Jess Phillips MP, said:
“This government welcomes the work of Internet Matters, which has provided an important insight into how emerging technologies are being misused.
“The misuse of AI technologies to create child sexual abuse material is an increasingly concerning trend. Online deepfake abuse disproportionately impacts women and girls, which is why we will work with Internet Matters and other partners to address this as part of our mission to halve violence against women and girls over the next decade.
“Technology companies, including those developing nudifying apps, have a responsibility to ensure their products cannot be misused to create child sexual abuse content or nonconsensual deepfake content.”