Artificial Intelligence has been heralded as ‘the future’ for as long as I can remember. The excitement is understandable: scientists may one day be able to create an AI so capable that it can substitute for the work of humans, replacing doctors, engineers and even journalists. But professional occupations are not the only area in which AI could be a powerful force.
Just a few weeks ago I speculated that if apps like Replika, which are designed to simulate both friendships and romantic relationships, became too effective at their job, they might replace certain aspects of human interpersonal relationships. AI could provide the same sort of parasocial digital relationship that one might gain through purchasing an OnlyFans membership, while cutting out the human on the other side of the screen altogether.
What else happens when you can credibly replicate somebody’s actions, voice, body, and image? The arrival of a new AI-driven ‘deepfake’ app gives us reason to be apprehensive. In fact, tech publications have gone as far as vowing not to name the application, for fear of causing an unintended spike in its popularity.
The reason for the concern comes down to the central feature of the app: it can create deepfake pornography with just the click of a button. Deepfaking is the process of superimposing the image of one person on top of a video of another, to make it appear as if the target is performing an action they never performed. The tech has been around for a while (you might be familiar with the ReFace App, where you can faceswap yourself into gifs), and some attempts at deepfake pornography have been made. But this latest app is by far the easiest to use.
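To make the idea of ‘superimposing’ concrete, here is a rough sketch of the crude, pre-deep-learning version of a face swap: detect a face in each frame of a target video and paste a source image over it. It uses OpenCV’s stock Haar-cascade face detector purely for illustration, and the file names are placeholders; the apps in question rely on learned generative models that blend pose, lighting and expression far more convincingly than this copy-paste ever could.

```python
# Naive illustration of "superimposing" one person onto a video of another.
# Real deepfake systems use trained generative models, not this crude paste.
import cv2

SOURCE_FACE = "source_face.png"   # placeholder: image of the face to insert
TARGET_VIDEO = "target.mp4"       # placeholder: video to insert it into
OUTPUT_VIDEO = "swapped.mp4"

# Stock frontal-face detector that ships with OpenCV.
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)
source = cv2.imread(SOURCE_FACE)

cap = cv2.VideoCapture(TARGET_VIDEO)
fps = cap.get(cv2.CAP_PROP_FPS)
width = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
height = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
writer = cv2.VideoWriter(
    OUTPUT_VIDEO, cv2.VideoWriter_fourcc(*"mp4v"), fps, (width, height)
)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # For every detected face, resize the source face to the detected
    # region and paste it straight over the original pixels.
    for (x, y, w, h) in detector.detectMultiScale(gray, 1.3, 5):
        frame[y:y + h, x:x + w] = cv2.resize(source, (w, h))
    writer.write(frame)

cap.release()
writer.release()
```

The point of the toy version is only to show how little input is required: one photograph and one video.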
The existence of deepfake pornography is disturbing, and while for the moment it is still mostly possible to differentiate between what is fake and what is real, researchers fear that we will one day reach a point where the AI is so sophisticated it will be functionally impossible to tell whether digital content is legitimate.
The venture capitalist Anirudh Pai gives his vision of the internet after the deepfake takeover, stating:
But these private servers are walled gardens; not everyone gets invited. So what happens? As the internet contorts under its own weight, we become desensitised, disinterested. Maybe we’ll decide just to log off, and watch as a new era of Luddism begins.
No, it digitally superimposes people into videos.
Whilst being falsely inserted into porn is no doubt horribly embarrassing and possibly career-threatening, the bigger and far more dangerous issue is the possibility of being inserted into CCTV footage of a crime.
“Liberty and justice threatened, women most affected” makes us look like self-absorbed fools.
Maybe this is why ancient Law forbade idolatry, which is undue reverence to imagery.
However this tech metastasizes, its manipulative sorcery indicates a definite need for returning to text, or even script, for any humans who value truth and justice.
Not sure that you can legitimately describe that choice as ‘Luddism’, with the pejorative overtones that word brings to the debate. If the tool/mechanism isn’t actually working (and the article is surely making that forecast for the Internet), then you throw it away and find another tool that will. That’s a progressive response, not Luddism. But in any case, what a mess this all is.
Deepfaking cannot be stopped, but at least the better it gets, the less anyone will believe any video is “real” and the easier it will be to brush it off as fake. Invasions of privacy and revenge porn will be less revelatory and therefore should be less painful, because others will not react in a way that makes you feel ashamed. It might even lead to people realising that how they feel about themselves is more important to their well-being than how they think other people regard them.
no point in commenting, Unherd just deletes my posts…
All digital imagery will be regarded as fake unless confirmed by other physical evidence and/or witness statements. T’internet will die, hurrah.