Global collaboration needed as thousands of AI-generated child sexual abuse images emerge depicting the worst kinds of abuse
The Internet Watch Foundation (IWF), a partner in the UK Safer Internet Centre, has published its latest report on the proliferation of AI-generated child sexual abuse imagery, following a month-long investigation of a dark web forum. The data shows that most of the confirmed AI images are realistic enough to be treated as real imagery under UK law.
The investigation found that:
- Of the 11,108 AI images shared on a dark web child abuse forum and assessed by IWF analysts, 2,978 were confirmed as depicting child sexual abuse and breaching UK law.
- Of these images, 2,562 were realistic enough to be treated as real abuse images.
The IWF has confirmed that the technology is being abused to create new imagery of real victims of these hideous crimes: their faces and bodies have been fed to AI models trained to produce further abusive imagery of them.
IWF-trained analysts have also seen examples of images of celebrities who have been “de-aged” using AI technology and depicted as children in various scenarios of sexual abuse.
Images of fully clothed children and young people, uploaded legally online, have also been manipulated to “nudify” the children depicted. Some of these images have also been commercialised.
Worryingly, the most realistic of these images are almost indistinguishable from real imagery of children, even to the expert eyes of IWF-trained analysts. The organisation warns that these challenges will only grow as text-to-image technology continues to improve.
IWF CEO Susie Hargreaves OBE, said: “Our worst nightmares have come true. Earlier this year, we warned AI imagery could soon become indistinguishable from real pictures of children suffering sexual abuse, and that we could start to see this imagery proliferating in much greater numbers. We have now passed that point.
“International collaboration is vital. It is an urgent problem which needs action now. If we don’t get a grip on this threat, this material threatens to overwhelm the internet.”