Safety groups say images are so lifelike that it can be hard to tell whether real children were harmed in their production
The volume of sexually explicit images of children being generated by predators using artificial intelligence is overwhelming law enforcement’s capacity to identify and rescue real-life victims, child safety experts warn.
Prosecutors and child safety groups working to combat crimes against children say AI-generated images have become so lifelike that in some cases it is difficult to determine whether real children were subjected to harm in their production. A single AI model can generate tens of thousands of new images in a short amount of time, and this content has begun to flood the dark web and seep into the mainstream internet.