Children are using AI to create nudes, warns new report from the Department for Science, Innovation and Technology
A report published by the Department for Science, Innovation and Technology (DSIT) and authored by the UK Council for Internet Safety warns that children are using AI apps designed to produce nudes to create and share such content, The Telegraph reports.
Susie Hargreaves, chief executive of the Internet Watch Foundation, a partner in the UK Safer Internet Centre, said: “The sophistication of AI and the ease with which it can now be used pose very real threats for children. If an image realistically depicts child sexual abuse, whether it is AI or not, it makes the internet a less safe place.
“That real children’s imagery is also being manipulated to create sexual abuse imagery of recognisable children is an appalling threat everyone should be aware of.
“Criminals only need a handful of non-sexual images to create lifelike sexual abuse imagery. The potential for criminals to use this imagery as a sextortion tool to blackmail children into a spiral of further abuse is a terrifying and heart-breaking prospect, and one we must all be taking seriously.”
The public is reminded that the consensual sharing of nudes among under-18s, as well as the making or sharing of AI-generated nudes of fellow classmates, is illegal. Education workers are asked to follow the same procedures for AI-generated material as for real nudes: not viewing the content, not deleting it, contacting the police, and waiting for further advice before informing parents.
Read the full article here.