by Katherine Dee
Thursday, 16 September 2021
Dark Web
07:00

Deepfakes have disturbing implications for porn

A new app digitally superimposes women into videos — this is only the start

Artificial Intelligence has been heralded as ‘the future’ for as long as I can remember. The excitement is understandable: scientists may one day be able to create an AI so capable that it can substitute for the work of humans, replacing doctors, engineers and even journalists. But professional occupations are not the only area where AI can be a powerful force.

Just a few weeks ago I speculated that if an app like Replika, which is supposed to simulate both friendships and romantic relationships, became too effective at its job, it might replace certain aspects of human interpersonal relationships. AI could provide the same sort of parasocial digital relationship that one might gain through purchasing an OnlyFans membership, while cutting out the human on the other side of the screen altogether.

What else happens when you can credibly replicate somebody’s actions, voice, body, and image? The arrival of a new AI-driven ‘deepfake’ app gives us reason to be apprehensive — in fact, tech publications have gone so far as to vow not to name the application, for fear of causing an unintended spike in its popularity.

The reason for the concern comes down to the app’s central feature: it can create deepfake pornography at the click of a button. Deepfaking is the process of superimposing the image of one person onto a video of another, making it appear as if the target is performing an action they never performed. This tech has been around for a while (you might be familiar with the ReFace App, which lets you faceswap yourself into gifs), and attempts at deepfake pornography have been made before. But this latest app is by far the easiest to use.
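To make the superimposition idea concrete, here is a toy sketch of the compositing step — blending a small patch onto a frame. Real deepfakes use neural networks to synthesise the swapped face; nothing here reflects how the unnamed app works, and all names are illustrative.

```python
# Toy illustration of "superimposing" one image onto another.
# Images are nested lists of greyscale pixel values (0-255).

def superimpose(target, patch, row, col, alpha=0.8):
    """Blend `patch` onto `target` at (row, col) with opacity `alpha`."""
    out = [r[:] for r in target]  # copy so the original frame is untouched
    for i, patch_row in enumerate(patch):
        for j, p in enumerate(patch_row):
            t = out[row + i][col + j]
            out[row + i][col + j] = round(alpha * p + (1 - alpha) * t)
    return out

frame = [[0] * 4 for _ in range(4)]  # a blank 4x4 "video frame"
face = [[200, 200], [200, 200]]      # a 2x2 "face" patch
blended = superimpose(frame, face, 1, 1)
print(blended[1][1])  # 160 = 0.8*200 + 0.2*0
```

A genuine deepfake pipeline replaces this naive blend with a learned generator that matches pose, lighting and expression, which is what makes the results so hard to spot.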

The existence of deepfake pornography is disturbing, and while for the moment it is still mostly possible to differentiate between what is fake and what is real, researchers fear that we will one day reach a point where the AI is so sophisticated it will be functionally impossible to tell whether digital content is legitimate.

The venture capitalist Anirudh Pai gives his vision of the internet after the deepfake takeover:

All online content will eventually be fake because of deepfakes. The technology cabal isn’t strong enough to monitor the content on their own platforms, much less that of the entire internet. Perhaps the only medium to fix this is a truth-first internet, underpinned by the blockchain, where people will discern what is real through proof of work. Secondly, the public internet will die and be split up into discrete parts of exclusive information silos where people can police the quality of the information they receive.
- Anirudh Pai
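The ‘proof of work’ Pai invokes is the puzzle underpinning blockchains like Bitcoin: brute-forcing a nonce until a hash has a required prefix, so that a record is expensive to produce but cheap to verify. A minimal sketch of that idea (not of any specific content-provenance protocol):

```python
# Minimal proof-of-work sketch: find a nonce so that
# sha256(content + nonce) starts with `difficulty` zero hex digits.
import hashlib

def proof_of_work(content: bytes, difficulty: int = 3) -> int:
    prefix = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(content + str(nonce).encode()).hexdigest()
        if digest.startswith(prefix):
            return nonce
        nonce += 1

nonce = proof_of_work(b"original video hash", difficulty=3)
digest = hashlib.sha256(b"original video hash" + str(nonce).encode()).hexdigest()
print(digest[:3])  # "000" -- costly to produce, trivial to verify
```

How such a puzzle would actually anchor ‘truth’ for online content is left unspecified in the quote; the sketch only shows the asymmetry between producing and checking the work.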

But these private servers are walled gardens; not everyone gets invited. So what happens? As the internet contorts under its own weight, we become desensitised, disinterested. Maybe we’ll decide just to log off, and watch as a new era of Luddism begins.

Join the discussion


  • Not sure that you can legitimately describe that choice as ‘Luddism’, with the pejorative overtones that word brings to the debate. If the tool/mechanism isn’t actually working (and the article is surely making that forecast for the Internet), then you throw it away and find another tool that will. That’s a progressive response, not Luddism. But in any case, what a mess this all is.

  • Deepfaking cannot be stopped, but at least the better it gets, the less anyone will believe any video is “real” and the easier it will be to brush off as fake. Invasions of privacy and revenge porn will be less revelatory and therefore should be less painful, because others will not react in a way that makes you feel ashamed. It might even lead to people realising that how they feel about themselves is more important to their well-being than how they think other people regard them.

  • Maybe this is why ancient Law forbade idolatry, which is undue reverence to imagery.
    However this tech metastasizes, its manipulative sorcery indicates a definite need for returning to text, or even script, for any humans who value truth and justice.
