Martin Scorsese's most recent film, The Irishman, told a story that spanned seven decades. Robert De Niro and Joe Pesci starred, and in order to "de-age" them, Scorsese used a special three-rig camera and employed dedicated special-effects artists for post-production work. The costs ran into the millions, and the results were patchy. Earlier this year, a YouTuber decided to see if he could do any better: using free artificial intelligence software, he bettered Scorsese's attempt in a week.
It is no exaggeration to say that soon almost everything we see or hear online will be synthetic, that is, generated or manipulated by AI. Machines can "learn" to do almost anything when "trained" on the right data, and they have never had access to more data, nor so much power to churn through it all. Some experts estimate that within five to seven years, 90% of all video content online will be synthetic. Before long, anyone with a smartphone will be able to make Hollywood-level AI-generated content.
One synthetic-text generating model can already produce articles that appear to have been written by a human. AI can be trained to clone someone's voice even if they are dead: an old recording of JFK's voice has been used to make a clip of the former president reading the Book of Genesis. AI trained on a dataset of human faces can generate convincing fake images of people who do not exist, and it can be taught to insert people into photographs and videos they were not originally in. One YouTuber is working on a project to insert actor Nicolas Cage into every movie ever made.
All this sounds hilarious at best and weird at worst, but this technology has a dark side. It will, inevitably, be misused, and for that most obvious of male-driven reasons. It was reported last week that the messaging app Telegram is hosting a "deepfake pornography bot," which allows users to generate images of naked women. According to the report, there are already over 100,000 such images being circulated in public Telegram channels; considering that they are almost certainly being shared privately too, the actual number in existence is likely much higher. The women who appear to feature in this cache of publicly-shared fake porn are mostly private individuals rather than celebrities. More disturbingly, the images also include deepfake nudes of underage girls.
Henry Ajder, the lead author of the report, told me that "the discovery of the bot and its surrounding ecosystem was a disturbing, yet sadly unsurprising, confirmation that the creation and sharing of malicious deepfakes is growing rapidly." When he wrote to Telegram to request a takedown, he "received no response". The bot and the fake pornography, including that of minors, are still live at the time of writing.
Deepfake pornography first emerged less than three years ago, when an anonymous user started posting it on Reddit. He later revealed his methodology: he was using open-source AI software to insert celebrity faces into pornographic films, by training AI on images and videos of his intended target. The end result was surprisingly convincing. When other Redditors started making their own deepfake pornography and news of this AI-assisted community broke to the world, there was a furore: Reddit shut down the community and banned deepfake pornography. But the genie was out of the bottle.
Since its early days on Reddit, an entire deepfake pornography ecosystem has developed online. With a few clicks, it is possible to access deepfake pornography of every (female) celebrity imaginable, from Ivanka Trump and Michelle Obama to Ann Coulter. And celebrities are not the only targets. AI can clone any woman: all that is needed is some training data, and the rapid acceleration of the technology means that less and less training data is required. These days a single picture, a few seconds of a voice recording, or a single video would be enough.
On the flip-side, anyone who's had a real nude get out into the wild will now be heaving a sigh of relief. It can all be explained away now, see.
Sounds like you might have a (di)vested personal interest in this!
They must have trained their AI model on pictures of button mushrooms. Cold button mushrooms.
It's the equivalent of the Twitter gaffe response, "My account was hacked."
Excellent point!
Another alarming report on the rapidly increasing power of computers. Doubtless, the cyber-advocates will describe this as the democratisation of skills. Now anyone can utilise low-cost high-tech to produce convincing fake imagery and sound on their own pet gadgetry.
This disturbing information should be broadcast loud and long in the MSM, if they ever get tired of telling us conscience-pricking stories about the loss of free-school meals in the school holidays.
Since before celluloid was invented, men and women have fantasised about having sex with their objects of desire. This technology turns that fantasy into film. The next step would be for an app to also take a photo of the app user and to create a pornographic film of the user and his or her object of desire having sex. It is easy to see why the unwilling party would find this alarming, especially the possibility that the app user will want to move on from a virtual act to actual rape. However, it would probably be easy to prove that the body in the film is not you due to the non-appearance of birthmarks, moles etc. More concerning to me is the ability of the state and other organisations to create synthetic film of ‘crimes’ being committed by fully dressed actors and to convict us of crimes that we didn’t commit.
this is kind of terrifying!
This is also my concern. Chinese-style social credit systems, determining whether you are “good” or “bad” (see Gavin Haynes piece on 29 Oct for evidence that Banks are already moving in this direction) and deepfakes used to condemn and convict anyone deemed to be the latter. And why not heinous crimes, in order to warrant extreme punishments?
Devil’s advocate time. If you can create completely convincing — but entirely fake — pornographic images, without using real models or exploiting anyone, particularly children, then, BY LIBERAL LIGHTS, what’s the problem? Again, specifically, I’m asking what the problem is BY LIBERAL LIGHTS. As a Christian, I can see a major problem with it, but I’m asking people who consider themselves enlightened and modern, people who think with the “it’s the current year” mentality to explain what problem THEY have with others panting over images of people who don’t exist. After all, it’s all about freedom of the individual, isn’t it?
if we use the traditional definition of liberal, I don’t see much support for kiddie pics or for exploitation. Perhaps that is the problem. Today’s left is decidedly illiberal.
If completely fake images could keep the pedos out of trouble, I wouldn’t see an issue with it. Have at it, creeps. However as I understand it, such things whet rather than sate their appetites. So that is the issue.
Spread this simple credo:
The internet is full of lies
The internet is full of misinformation
The internet is not to be trusted at face value
SO USE YOUR GODDAMN BRAIN!
A sobering article! But people should also know that there are companies (now in start-up) that are providing software to ascertain whether an image or sound has been tampered with or is an outright fake. Naturally with all these things, the criminals get first-mover advantage. But industry is fighting back. I predict one of the first industries to adopt a new standard will be insurance, where they currently often rely on a photo to ascertain damage, especially at low level where it is not worth sending out a loss adjuster.
The ‘Ancients’ would die laughing at this piece, and the neurotic nonsense about nudity it so clearly represents.
The Ancient Olympic Games were held entirely gymnos/naked, bar for one event.
What are we so afraid of?
You missed the point. This article about AI is not about nudity at all. It is about the potential for the de-humanizing of individual people by artificially manipulating their personal attributes and information.
I beg to disagree, most have already been de-humanised and nothing can be done to save them.
I disagree with your comment because there are some areas where we should never give up.
Is it really dehumanizing if it is all artificial? I think one of the issues is where someone's 'brand' or image is mistaken for their real self. People seem to really struggle to understand this. I can see why, because those who make brands of themselves present their image as the authentic self (except when things go wrong, when they try to get as far away from it as possible), but we know this is never true. The answer is simple (perhaps): don't make yourself into a brand and don't ask others to buy into your image. Try to be honest and truthful in all your representations. Don't be a narcissist, I guess.
Yup, what a bummer when the ancient Greek Olympic lottery allocates you tickets for the ancient beach volleyball, and it's the only event with clothes on.
I think the Winter Olympics should be held entirely naked. Now that would be Spartan!
What an excellent idea, well said!
“It is no exaggeration to say that soon almost everything we see or hear online will be synthetic.”
Um, yes it is.
But then you would say that.
Rob…… Ot?
Interesting. I remember a flurry of articles about this two or three years ago, and then the subject fell off the map. Apparently it is because our new wokehadi media masters prefer to go dark on topics they don’t think the rabble is mature enough to be apprised of. In any case, another reason to be grateful Unherd is around.
The camera has always lied. Finally now we have to face up to that fact.
I went to the UK once on vacation, but I forgot to bring my camera. I went to a camera store in London and picked one out, but they wanted to charge me ten pounds more than the tag said!
And to think that my bank was only recently asking me to sign up to their voice recognition system……..
Yup. Everyone with a passing knowledge of recent advances in AI could tell that wasn't gonna stop the hordes at the gates for more than a week.
Identity is everything. How can it be secured? The Mark of the Beast. Without which, no one will be able to buy or sell. Get ready…
The solution is to blockchain everything. Even if a fake were put on the blockchain as if it were an original, the true original photos/videos will precede it in time. You could also blockchain your calendar/phone/satnav coordinates and thus prove where you really were. This does not stop someone from creating a fake, but it provides the incontrovertible evidence to send the faker to jail, plus all who knowingly handle the fake (anyone can confirm the truth by referencing the blockchain).
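The timestamping idea in this comment can be sketched in a few lines of Python. This is only an illustration under stated assumptions: the `ledger` list and the `notarise`/`predates` functions are hypothetical stand-ins for a real append-only blockchain, and sequence numbers stand in for block timestamps. The point is just that a hash of the original, recorded earlier, proves precedence over any later fake.

```python
import hashlib

ledger = []  # stand-in for a public, append-only blockchain


def notarise(data: bytes) -> dict:
    """Record the SHA-256 hash of some bytes, in append order."""
    entry = {"sha256": hashlib.sha256(data).hexdigest(), "seq": len(ledger)}
    ledger.append(entry)
    return entry


def predates(original: bytes, fake: bytes) -> bool:
    """True if the original was notarised before the fake (or the fake never was)."""
    seq = {e["sha256"]: e["seq"] for e in ledger}
    h_orig = hashlib.sha256(original).hexdigest()
    h_fake = hashlib.sha256(fake).hexdigest()
    return h_orig in seq and (h_fake not in seq or seq[h_orig] < seq[h_fake])


photo = b"original photo bytes"
notarise(photo)
deepfake = b"manipulated photo bytes"
notarise(deepfake)
```

Here `predates(photo, deepfake)` is true and `predates(deepfake, photo)` is false: even though both ended up on the ledger, the record shows which came first.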
No, it isn’t. Let’s not melt the polar ice cap any faster just because you have only a single solution in mind.
Losing your child or partner is horror. Losing a body part at a work accident is horror. I can imagine being ‘complimented’ with a deep fake video is rather unpleasant but can you seriously call it ‘horror’?
Surely we will just have to live with this. If the tech is going to allow creation of a perfect image of whatever turns you on, to be deleted and created at will, then thought police would be the only route; but why bother, when no one other than the imaginer was involved, so where is the crime? Interesting if the tech allowed direct imaging into the brain without even a screen. There may be virtue-signal opposition to male satisfaction with the purely visual. Mind you, that would be solved with androids or hard-light holograms with mobile emitters. Of course this leads to no babies and therefore no people; or can androids have babies as well as dreams?
Fascinating article. I guess a form of copyright on individuals could be a way to protect our identity from being used/misused, and allow people to give specific permissions to profit from it if they wished. Anything without an agreed copyright would be taken down and prosecuted? A digital genetic fingerprint? By the way, I am no expert.
I don’t understand why you were downvoted. I’m thinking along similar lines that tort law could be the solution. It’s a form of libel, isn’t it? Broadly speaking? Eliminate the protections that platforms hosting this stuff enjoy and see how fast they clean it up or get sued out of existence.
An engrossing, and important, article. In a complementary article for the new webzine The Brazen Head, Robert Henderson writes about AI’s implications for employment – https://brazen-head.org/202…
Require everyone to have an online passport.
Want to stop this? Start cutting off the hands of the few perps you can find. Publicly. One thing, the only thing, the Mohammedans got right…cutting off the hands of thieves. Deep fake extortionists are at least thieves.
And use a scimitar. If the ax man isn’t very good, you could get lucky and just lose fingers…or maybe the whole forearm. Luck of the draw! The chance you take. After 10 such exhibitions, I’m guessing the incidents will drop a fair bit. If not? Keep cutting!
The same principle, with a minor but distinct modification, could be applied to deter rapists.