
Predators are exploiting AI for child abuse images

Parents post an average of 1,500 images of their children on social media before primary school. Credit: Getty

October 26, 2023 - 4:00pm

A recent investigation by the Internet Watch Foundation has found that paedophiles are using AI to create images of celebrities as children, as well as to manipulate photographs of child actors and to generate scores of images of real victims of child sex abuse. The report details how 11,000 AI images were shared on a single darknet child abuse website, 3,000 of which would be illegal under UK law.

More than one in five of these images were classified as Category A, the most serious kind of imagery, and more than half depicted primary school-aged children. In another survey, 80% of respondents on a dark web paedophile forum admitted that they had used, or intended to use, AI tools to create child sexual abuse images.

Some have tried to argue that simulated images could offer a “less harmful” alternative because children are not being assaulted or exploited “in reality”. Yet these images normalise predatory behaviour, create demand for a fundamentally exploitative industry, complicate police investigations and waste precious resources, all while using the faces and likenesses of real children.

Predators use real videos and images of child sex abuse to “train” AI programs to create more content, or use social media images of children to create “deepfake” lookalikes. In one small town in Spain, more than 20 girls aged between 11 and 17 had AI-generated naked photographs of themselves circulated; the images weren’t “real”, but that hardly absolves those responsible.

Without regulation, where does this go next? Amazon, Instagram and Etsy have all been criticised for allowing sellers to advertise sex dolls that approximate the size and features of children. If AI can create child pornography simulations, then it could also use recordings to animate sex dolls or robots with children’s voices or vocabulary.

Yet despite these obvious dangers, “sharenting” (the practice of parents publicising content about their children on social media) is more popular than ever. Most babies now make their digital debut within an hour of birth; parents then go on to post an average of 1,500 images of their children on social media before they even go to primary school.

Around one in four parents have a public profile, meaning anyone can see their posts, while 80% of parents admit to not knowing all their social media friends, or to having followers they have never met face to face. For years celebrities have blurred out their children’s faces in paparazzi shots because of safety fears, and yet now everyone from “mummy influencers” with millions of followers to regular people with a few dozen online friends shares the intimate ins and outs of their children’s lives with impunity.

Once again, the UK’s passivity on this issue is notable. Many US states have made huge strides in cracking down on minors accessing pornography; France has introduced age verification for social media sites and stricter parental controls; and the EU has forced TikTok to make its “For You” algorithm optional and banned adverts targeted at 11- to 18-year-olds. The UK has done none of these things. France has even introduced a bill banning parents from sharing children’s photographs on social media, citing the fact that half of all pictures exchanged on paedophile forums originate from photographs posted by families on these platforms. The UK is unlikely to follow suit anytime soon.

France’s bill is a bold move, but we cannot rely on Big Tech to moderate its own content (we already know, for example, that Instagram fails to remove accounts that have been flagged for posting sexualised content of children). Until AI is regulated or real legislation is brought in, the only way to protect children from facial recognition, profiling, data mining, the loss of their anonymity, and the possibility of being turned into a pornographic avatar is simply to stop posting about them.


Kristina Murkett is a freelance writer and English teacher.


26 Comments

G K
1 year ago

I can’t believe that, even today, YouTube has no solution for children accessing inappropriate content. It has one unprotected switch in the menu and a very narrow definition of adult material, which is apparently screened and removed. Otherwise the huge variety of adult stuff is readily available. The protection would be easy to implement, both technically and legally. I’m not into conspiracies, but it’s hard to believe it’s not intentional at some level.

Clare Knight
1 year ago

I wonder why the UK is not doing more to stop child porn on the internet. What’s that all about?

Don Lightband
1 year ago
Reply to  Clare Knight

All pornography must tend, must point at all times *towards* the forbidden. Towards the ever-elusive locus, in other words, of what is ultimately most desirable. Otherwise there would be no point in pornography at all, now would there?

It seems to me if you cannot grasp that much at least, then why even try to exercise thought on such things?

Will K
1 year ago
Reply to  Clare Knight

Clare, I believe this conversation is about fictional child-porn literature. That seems very similar to murder-story fiction: one is criminal to read and one is not, and we are trying to understand why.

George Locke
1 year ago
Reply to  Will K

I think you’ve misunderstood what the article is about. It’s not about ‘child porn fictional literature’ (as in words and writing); it’s about actual images (as in pictures) generated by artificial intelligence that have the potential to look nearly indistinguishable from photos taken on a camera. The title is: “Predators are exploiting AI for child abuse images”. This has nothing to do with reading, unless you’re going to reach new levels of pedantry and argue that we read images.

Will K
1 year ago

The reasons given in the article why drawings should be criminal seem equally applicable to much fictional literature: they “normalise predatory behaviour, create demand for a fundamentally exploitative industry, complicate police investigations and waste precious resources”.

George Locke
1 year ago
Reply to  Will K

Except fictional literature isn’t being used as an actual substitute for child abuse content… what books are you reading??? Unless I’m interpreting you wrong.

Will K
1 year ago
Reply to  George Locke

A whodunnit could be considered to normalise murder, etc.

Julian Farrows
1 year ago
Reply to  Will K

No, not the same. Looking lasciviously at pictures of naked children is not the same as reading an Agatha Christie novel.

Will K
1 year ago
Reply to  Julian Farrows

I’d say reading about murder is about as serious as any ‘reading crime’ can be.

George Locke
1 year ago
Reply to  Will K

Except ‘reading crimes’ don’t exist. Crimes related to images of the abuse of children do.

Will K
1 year ago
Reply to  George Locke

The reading of child porn seems to me to be a crime in which the only act required for conviction is reading. So I think ‘reading crime’ is an accurate description.

Clare Knight
1 year ago
Reply to  Julian Farrows

Exactly!!

George Locke
1 year ago
Reply to  Will K

Oh please… your aunt reading a murder mystery novel is not the same as a paedophile looking at AI-generated child abuse images. These are totally false equivalents. People don’t read murder mysteries as a stand-in for actually killing someone, nor is reading about real murders a criminal offence. I don’t see how it ‘normalises’ murder either, considering that people who read these sorts of things usually agree that murder is wrong. The whole point of these AI child abuse images is so that paedos can use them as a stand-in for looking at the real thing, and looking at the real thing is a criminal offence.

Will K
1 year ago
Reply to  George Locke

I do know the facts of the law; I am asking why. Why is reading about one fictional atrocity (e.g. child porn) illegal, while reading about another fictional atrocity (e.g. murder, which is even worse, in my opinion) is not? So far, I have seen no logical explanation.

Don Lightband
1 year ago
Reply to  Will K

Indeed. For there is no logical explanation. There are only fellows like Locke seizing the opportunity to use trusty old expressions like “false equivalents”, feeling the need, moreover, to boost them with a “totally”, just to, you know, make sure…

George Locke
1 year ago
Reply to  Don Lightband

There are logical explanations aplenty; you are just being purposefully contrarian and self-satisfied and won’t listen to them. Offer me an argument that it is not a false equivalence. Enlighten me, in good faith, as to your theory.

Clare Knight
1 year ago
Reply to  Will K

Methinks you’re being pedantic for sport.

Will K
1 year ago
Reply to  Clare Knight

Clare, I know I’m speaking against conventional opinion, but I’m completely serious. I’m still waiting for a logical refutation of my argument. I’m beginning to doubt that there is one.

UnHerd Reader
1 year ago
Reply to  Will K

Aren’t crime novels about the fascination with the patho-psychology or moral failings of the murderer, and the thrill of the murderer’s capture, rather than about promoting murder itself? I can’t see how that is equivalent to a child-porn romance novel, whose purpose, I assume, is to titillate. Since child abuse is illegal and immoral, producing stories for that purpose surely carries similar moral and legal implications? It’s tough, though: what to do with people with such predilections? Can they be given “safe” outlets?

Clare Knight
1 year ago
Reply to  George Locke

Men don’t read Agatha Christie to “get off”!

Will K
1 year ago
Reply to  Clare Knight

I’d guess that if men read Agatha Christie’s fiction, it’s because they enjoy doing so. And I’d guess that if men read fictional child porn, it’s also because they enjoy doing so. But one reading is legal and one is illegal. Why is that?

George Locke
1 year ago
Reply to  Will K

Because men who read Christie aren’t sexually ‘enjoying’ the simulated abuse of children.

Don Lightband
1 year ago

My comment appears to have been denied by the UnHerd censors, presumably because I chose to use the word “chump”.

As an American might say – bah humbug!
