
Deepfakes have disturbing implications for porn

September 16, 2021 - 7:00am

Artificial Intelligence has been heralded as ‘the future’ for as long as I can remember. The excitement is understandable: scientists may one day be able to create an AI so efficient it can substitute for the manual work of humans, replacing doctors, engineers and even journalists. But professional occupations are not the only area where AI can be a powerful force.

Just a few weeks ago I speculated that if apps like Replika, which is supposed to simulate both friendships and romantic relationships, became too effective at their job, they might replace certain aspects of human interpersonal relationships. AI could provide the same sort of parasocial digital relationship that one might gain through purchasing an OnlyFans membership, while cutting out the human on the other side of the screen altogether.

What else happens when you can credibly replicate somebody’s actions, voice, body, and image? The arrival of a new AI-driven ‘deepfake’ app gives us reason to be apprehensive — in fact, tech publications have gone as far as vowing not to name the application, for fear that it might cause an unintended spike in popularity.

The reason for the concern comes down to the central feature of the app: it can create deepfake pornography with just the click of a button. Deepfaking is the process of superimposing the image of one person on top of a video of another, to make it appear as if the target is performing an action they never did. This tech has been around for a while (you might be familiar with the ReFace App, where you can face-swap yourself into gifs), and some attempts at deepfake pornography have been made before. But this latest app is by far the easiest to use.

The existence of deepfake pornography is disturbing, and while for the moment it is still mostly possible to differentiate between what is fake and what is real, researchers fear that we will one day reach a point where the AI is so sophisticated it will be functionally impossible to tell whether digital content is legitimate.

The venture capitalist Anirudh Pai gives his vision of the post-deepfake takeover internet, stating:

All online content will eventually be fake because of deepfakes. The technology cabal isn’t strong enough to monitor the content on their own platforms, much less that of the entire internet. Perhaps the only medium to fix this is a truth-first internet, underpinned by the blockchain, where people will discern what is real through proof of work. Secondly, the public internet will die and be split up into discrete parts of exclusive information silos where people can police the quality of the information they receive.

But these private servers are walled gardens, not everyone gets invited. So what happens? As the internet contorts under its own weight, we become desensitised, disinterested. Maybe we’ll decide just to log off, and watch as a new era of Luddism begins.


Katherine Dee is a writer. To read more of her work, visit defaultfriend.substack.com.


6 Comments
Sharon Overy
2 years ago

A new app digitally superimposes women into videos…

No, it digitally superimposes people into videos.

Whilst being falsely inserted into porn is no doubt horribly embarrassing and possibly career-threatening, the bigger and far more dangerous issue is the possibility of being inserted into CCTV footage of crime.

“Liberty and justice threatened, women most affected” makes us look like self-absorbed fools.

LCarey Rowland
2 years ago

Maybe this is why ancient Law forbade idolatry, which is undue reverence to imagery.
However this tech metastasizes, its manipulative sorcery indicates a definite need for returning to text, or even script, for any humans who value truth and justice.

Andrew McDonald
2 years ago

Not sure that you can legitimately describe that choice as ‘Luddism’, with the pejorative overtones that word brings to the debate. If the tool/mechanism isn’t actually working (and the article is surely making that forecast for the Internet), then you throw it away and find another tool that will. That’s a progressive response, not Luddism. But in any case, what a mess this all is.

Jon Hawksley
2 years ago

Deepfaking cannot be stopped, but at least the better it gets, the less anyone will believe any video is “real”, and the easier it will be to brush it off as fake. Invasions of privacy and revenge porn will be less revelatory and therefore should be less painful, because others will not react in a way that makes you feel ashamed. It might even lead to people realising that how they feel about themselves is more important to their well-being than how they think other people regard them.

Galeti Tavas
2 years ago

No point in commenting, Unherd just deletes my posts…

Martin Smith
2 years ago

All digital imagery will be regarded as fake unless confirmed by other physical evidence and/or witness statements. T’internet will die, hurrah.