A recent investigation by the Internet Watch Foundation has found that paedophiles are using AI to create images of celebrities as children, as well as manipulate photographs of child actors and generate scores of images of real victims of child sex abuse. The report details how 11,000 AI images were shared on a single darknet child abuse website, 3,000 of which would be illegal under UK law.
More than one in five of these images were classified as Category A, the most serious kind of imagery, and more than half depicted primary school-aged children. In another survey, 80% of respondents on a dark web paedophile forum admitted they had used, or intended to use, AI tools to create child sexual abuse images.
Some have tried to argue that simulated images could offer a “less harmful” alternative because children are not being assaulted or exploited “in reality”. Yet these images normalise predatory behaviour, create demand for a fundamentally exploitative industry, complicate police investigations and waste precious resources, all while using the faces and likenesses of real children.
Predators use real videos and images of child sex abuse to “train” AI programmes to create more content, or use social media images of children to create “deepfake” lookalikes. In one small town in Spain, more than 20 girls aged between 11 and 17 had AI-generated naked photographs of themselves circulated; the images weren’t “real”, but that hardly absolves those responsible.
Without regulation, where does this go next? Amazon, Instagram and Etsy have all been criticised for allowing sellers to advertise sex dolls that approximate the size and features of children. If AI can create child pornography simulations, then it could also use recordings to animate sex dolls or robots with children’s voices or vocabulary.
Yet despite these obvious dangers, “sharenting” (the practice of parents publicising content about their children on social media) is more popular than ever. Most babies now make their digital debut within an hour of birth; parents then go on to post an average of 1,500 images of their children on social media before they even go to primary school.
I can’t believe that YouTube still has no solution for content that is inappropriate for children. It has one unprotected switch in the menu and a very narrow definition of adult material, which is apparently screened and removed. Otherwise, a huge variety of adult material is readily available. Protection would be easy to implement, both technically and legally. I’m not into conspiracies, but it’s hard to believe it isn’t intentional at some level.
I wonder why the UK is not doing more to stop child porn on the internet. What’s that all about?
All pornography must tend, must point, at all times *towards* the forbidden: towards the ever-elusive locus, in other words, of what is ultimately most desirable. Otherwise there would be no point in pornography at all, now would there?
It seems to me that if you cannot grasp at least that much, then why even try to exercise thought on such things?
Clare, I believe this conversation is about child porn fictional literature. That seems very similar to murder-story fiction. One is criminal to read, and one is not, and we are trying to understand why.
I think you’ve misunderstood what the article is about. It’s not about ‘child porn fictional literature’ (as in words and writing); it’s about actual images (as in pictures) generated by artificial intelligence that have the potential to look nearly indistinguishable from photos taken with a camera. The title is: Predators are exploiting AI for child abuse images. This has nothing to do with reading, unless you’re going to reach new levels of pedantry and argue that we read images.
The reasons given in the article why drawings should be criminal seem equally applicable to much fictional literature: they “normalise predatory behaviour, create demand for a fundamentally exploitative industry, complicate police investigations and waste precious resources”.
Except fictional literature isn’t being used as an actual substitute for child abuse content… what books are you reading??? Unless I’m interpreting you wrong.
A who-dunnit could be considered to normalise murder etc.
No, not the same. Looking lasciviously at pictures of naked children is not the same as reading an Agatha Christie novel.
I’d say reading about murder is about as serious as any ‘reading crime’ can be.
Except ‘reading crimes’ don’t exist. Crimes related to images of the abuse of children do.
The reading of child porn seems to me to be a crime in which the only act required for conviction is reading. So I think ‘reading crime’ is an accurate description.
Exactly!!
Oh please… your aunt reading a murder mystery novel is not the same as a paedophile looking at AI-generated child abuse images. These are totally false equivalents. People don’t read murder mysteries as a stand-in for actually killing someone, nor is reading about real murders a criminal offence. I don’t see how it ‘normalises’ murder either, considering that people who read these sorts of things usually agree that murder is wrong. The whole point of these AI child abuse images is so that paedos can use them as a stand-in for looking at the real thing, and looking at the real thing is a criminal offence.
I do know the facts of the law; I am asking why. Why is reading about one fictional atrocity (e.g. child porn) illegal, while reading about another fictional atrocity (e.g. murder, which is even worse, in my opinion) is not? So far, I have seen no logical explanation.
Indeed. For there is no logical explanation. There are only fellows like Locke seizing the opportunity to use trusty old expressions like “false equivalents”, feeling the need, moreover, to boost them with a “totally”, just to, you know, make sure…
There are logical explanations aplenty; you are just purposefully contrarian and self-satisfied and won’t listen to them. Offer me an argument that it is not a false equivalent. Enlighten me, in good faith, as to your theory.
Methinks you’re being pedantic for sport.
Clare, I know I’m speaking against conventional opinion, but I’m completely serious. I’m still waiting for a logical refutation of my argument. I’m beginning to doubt that there is one.
Aren’t crime novels about a fascination with the patho-psychology or moral failings of the murderer, and the thrill of the murderer being caught, rather than about promoting murder itself? I can’t see how that is equivalent to a child-porn romance novel, whose purpose, I assume, is to titillate. Since child abuse is illegal and immoral, producing stories for that purpose surely carries similar moral and legal implications? It’s tough, though: what to do with people with such predilections? Can they be given “safe” outlets?
Men don’t read Agatha Christie to “get off”!
I’d guess that if men read Agatha Christie’s fiction, it’s because they enjoy doing so. And I’d guess that if men read fictional child porn, it’s also because they enjoy doing so. But one reading is legal and one is illegal. Why is that?
Because men who read Christie aren’t sexually ‘enjoying’ the simulated abuse of children.
My comment appears to have been denied by the UnHerd censors, presumably because I chose to use the word “chump”.
As an American might say – bah humbug!