Adrian
3 years ago

On the flip-side, anyone who’s had a real nude get out into the wild will now be heaving a sigh of relief. It can all be explained away now, you see.

kingdomoflindsey
3 years ago
Reply to  Adrian

Sounds like you might have a (di)vested personal interest in this!

Adrian
3 years ago

They must have trained their AI model on pictures of button mushrooms. Cold button mushrooms.

Swag Valance
3 years ago
Reply to  Adrian

It’s the equivalent of the classic Twitter gaffe response: “My account was hacked.”

malcolm.rose
3 years ago
Reply to  Adrian

Excellent point!

Kiran Grimm
3 years ago

Another alarming report on the rapidly increasing power of computers. Doubtless, the cyber-advocates will describe this as the democratisation of skills. Now anyone can utilise low-cost high-tech to produce convincing fake imagery and sound on their own pet gadgetry.

This disturbing information should be broadcast loud and long in the MSM – if they ever get tired of telling us conscience-pricking stories about the loss of free-school meals in the school holidays.

Christopher Barclay
3 years ago

Since before celluloid was invented, men and women have fantasised about having sex with their objects of desire. This technology turns that fantasy into film. The next step would be for an app to also take a photo of the app user and to create a pornographic film of the user and his or her object of desire having sex. It is easy to see why the unwilling party would find this alarming, especially the possibility that the app user will want to move on from a virtual act to actual rape. However, it would probably be easy to prove that the body in the film is not you due to the non-appearance of birthmarks, moles etc. More concerning to me is the ability of the state and other organisations to create synthetic film of ‘crimes’ being committed by fully dressed actors and to convict us of crimes that we didn’t commit.

Juilan Bonmottier
3 years ago

This is kind of terrifying!

Nunya Bizniss
3 years ago

This is also my concern. Chinese-style social credit systems determining whether you are “good” or “bad” (see Gavin Haynes’s piece on 29 Oct for evidence that banks are already moving in this direction), and deepfakes used to condemn and convict anyone deemed to be the latter. And why not heinous crimes, in order to warrant extreme punishments?

ard10027
3 years ago

Devil’s advocate time. If you can create completely convincing — but entirely fake — pornographic images, without using real models or exploiting anyone, particularly children, then, BY LIBERAL LIGHTS, what’s the problem? Again, specifically, I’m asking what the problem is BY LIBERAL LIGHTS. As a Christian, I can see a major problem with it, but I’m asking people who consider themselves enlightened and modern, people who think with the “it’s the current year” mentality to explain what problem THEY have with others panting over images of people who don’t exist. After all, it’s all about freedom of the individual, isn’t it?

Alex Lekas
3 years ago
Reply to  ard10027

If we use the traditional definition of liberal, I don’t see much support for kiddie pics or for exploitation. Perhaps that is the problem. Today’s left is decidedly illiberal.

M Spahn
3 years ago
Reply to  ard10027

If completely fake images could keep the pedos out of trouble, I wouldn’t see an issue with it. Have at it, creeps. However as I understand it, such things whet rather than sate their appetites. So that is the issue.

Charles Rense
3 years ago

Spread this simple credo:

The internet is full of lies
The internet is full of misinformation
The internet is not to be trusted at face value
SO USE YOUR GODDAMN BRAIN!

Geoff Cox
3 years ago

A sobering article! But people should also know that there are companies (now in start-up) providing software to ascertain whether an image or sound has been tampered with or is an outright fake. Naturally, with all these things, the criminals get first-mover advantage. But industry is fighting back. I predict one of the first industries to adopt a new standard will be insurance, where companies currently often rely on a photo to ascertain damage – especially at the low end, where it is not worth sending out a loss adjuster.

Mark Corby
3 years ago

The ‘Ancients’ would die laughing at this piece, and the neurotic nonsense about nudity it so clearly represents.

The Ancient Olympic Games were held entirely gymnos/naked, bar one event.

What are we so afraid of?

Gerald gwarcuri
3 years ago
Reply to  Mark Corby

You missed the point. This article about AI is not about nudity at all. It is about the potential for the de-humanizing of individual people by artificially manipulating their personal attributes and information.

Mark Corby
3 years ago

I beg to disagree; most have already been de-humanised and nothing can be done to save them.

Judy Johnson
3 years ago
Reply to  Mark Corby

I disagree with your comment because there are some areas where we should never give up.

Juilan Bonmottier
3 years ago

Is it really dehumanizing if it is all artificial? I think one of the issues is where someone’s ‘brand’ or image is mistaken for their real self. People seem to really struggle to understand this. I can see why, because those who make brands of themselves present their image as the authentic self (except when things go wrong, when they try to get as far away from it as possible), but we know this is never, ever true. The answer is simple (perhaps): don’t make yourself into a brand and don’t ask others to buy into your image. Try to be honest and truthful in all your representations. Don’t be a narcissist, I guess.

Adrian
3 years ago
Reply to  Mark Corby

Yup, what a bummer when the ancient Greek Olympic lottery allocates you tickets for the ancient beach volleyball, and it’s the only event with clothes on.

Andrew D
3 years ago
Reply to  Mark Corby

I think the Winter Olympics should be held entirely naked. Now that would be Spartan!

Mark Corby
3 years ago
Reply to  Andrew D

What an excellent idea, well said!

Rob Grayson
3 years ago

“It is no exaggeration to say that soon almost everything we see or hear online will be synthetic.”

Um, yes it is.

Adrian
3 years ago
Reply to  Rob Grayson

But then you would say that.

Rob…… Ot?

M Spahn
3 years ago

Interesting. I remember a flurry of articles about this two or three years ago, and then the subject fell off the map. Apparently it is because our new wokehadi media masters prefer to go dark on topics they don’t think the rabble is mature enough to be apprised of. In any case, another reason to be grateful Unherd is around.

Swag Valance
3 years ago

The camera has always lied. Finally now we have to face up to that fact.

Charles Rense
3 years ago
Reply to  Swag Valance

I went to the UK once on vacation, but I forgot to bring my camera. I went to a camera store in London and picked one out, but they wanted to charge me ten pounds more than the tag said!

Stuart McCullough
3 years ago

And to think that my bank was only recently asking me to sign up to their voice recognition system…

Adrian
3 years ago

Yup. Everyone with a passing knowledge of recent advances in AI could tell that wasn’t gonna stop the hordes at the gates for more than a week.

Gerald gwarcuri
3 years ago

Identity is everything. How can it be secured? The Mark of the Beast. Without which, no one will be able to buy or sell. Get ready…

Malcolm Ripley
3 years ago

The solution is to blockchain everything. Even if a fake were put on the blockchain as if it were an original, the true original photos/videos would precede it in time. You could also blockchain your calendar/phone/satnav coordinates and thus prove where you really were. This does not stop someone from creating a fake, but it provides the incontrovertible truth needed to send the faker to jail, and anyone who handles the fake can confirm the truth by referencing the blockchain.
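
A minimal sketch of the idea, assuming a plain local append-only list stands in for a real blockchain, and with illustrative names (register, earliest_entry) rather than any actual product’s API: hash the content, log the hash with a timestamp, and in a later dispute the earliest matching entry wins.

# Sketch of provenance-by-timestamp: hash content when it is created and
# record the hash in an append-only ledger. A fake of the same material can
# only be registered later, so the earliest entry wins. Illustration only:
# a real system would use a public blockchain with trusted timestamps,
# not a local Python list.
import hashlib
import time

ledger = []  # append-only list of (sha256 hex digest, unix timestamp)

def register(content: bytes) -> str:
    """Hash the content and append it to the ledger with the current time."""
    digest = hashlib.sha256(content).hexdigest()
    ledger.append((digest, time.time()))
    return digest

def earliest_entry(content: bytes):
    """Return the oldest ledger entry matching this content, or None."""
    digest = hashlib.sha256(content).hexdigest()
    matches = [entry for entry in ledger if entry[0] == digest]
    return min(matches, key=lambda entry: entry[1]) if matches else None

# Register the original as soon as it is taken; in a dispute, whoever holds
# the earliest registration can show their copy predates the alleged fake.
original = b"raw bytes of the original photo"
register(original)
print(earliest_entry(original))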

Swag Valance
3 years ago
Reply to  Malcolm Ripley

No, it isn’t. Let’s not melt the polar ice cap any faster just because you have only a single solution in mind.

Peter Kriens
3 years ago

Losing your child or partner is horror. Losing a body part in a work accident is horror. I can imagine that being ‘complimented’ with a deep fake video is rather unpleasant, but can you seriously call it ‘horror’?

Lindsay Gatward
3 years ago

Surely we will have to just live with this. If the tech is going to allow creation of a perfect image of whatever turns you on, and to delete and create it at will, then thought police will be the only route. But why bother, when no one other than the imaginer was involved? Where is the crime? Interesting if the tech allowed direct imaging into the brain without even a screen. There may be virtue-signal opposition to male satisfaction with the purely visual. Mind you, that would be solved with androids or hard-light holograms with mobile emitters. Of course this leads to no babies and therefore no people, or can androids have babies as well as dreams?

ralph bell
3 years ago

Fascinating article. Could a form of copyright on individuals be a way to protect our identity from being used/misused, and allow people to give specific permissions to profit from it if they wished? Anything used without an agreed copyright would be taken down and prosecuted? A digital genetic fingerprint? By the way, I am no expert.

Jeff Cunningham
1 year ago
Reply to  ralph bell

I don’t understand why you were downvoted. I’m thinking along similar lines that tort law could be the solution. It’s a form of libel, isn’t it? Broadly speaking? Eliminate the protections that platforms hosting this stuff enjoy and see how fast they clean it up or get sued out of existence.

kingdomoflindsey
3 years ago

An engrossing, and important, article. In a complementary article for the new webzine The Brazen Head, Robert Henderson writes about AI’s implications for employment – https://brazen-head.org/202

brianlyn
3 years ago

Require everyone to have an online passport.

Greg Eiden
3 years ago

Want to stop this? Start cutting off the hands of the few perps you can find. Publicly. One thing, the only thing, the Mohammedans got right…cutting off the hands of thieves. Deep fake extortionists are at least thieves.

And use a scimitar. If the ax man isn’t very good, you could get lucky and just lose fingers…or maybe the whole forearm. Luck of the draw! The chance you take. After 10 such exhibitions, I’m guessing the incidents will drop a fair bit. If not? Keep cutting!

Bengt Dhover
3 years ago
Reply to  Greg Eiden

The same principle, with a minor but distinct modification, could be applied to deter rapists.