Scarlett Johansson is one of hundreds of celebrities who've been undressed by deepfake technology. (Credit: Laurent KOFFEL/Gamma-Rapho via Getty Images)

It would be naive to assume that, because you've never performed in a porn video, you will never appear in one. In fact, these days, anyone with access to an image of your face can, in a matter of seconds, produce an extremely convincing video in which you appear as a porn star.
One man who has done this countless times, to countless women, without their consent, is one Mr Deepfakes. As the founder of the most prominent deepfake porn website in existence, he has chosen to remain anonymous. He built the burgeoning community "from scratch" as a side hustle, after deepfake porn was banned from Reddit in 2018. It is, he claims, a place where "users affected by the ban could go", to ensure the technology wasn't unfairly "squashed". MDF, as he styles himself, cites a commitment to free speech, and a desire to advance machine learning, as his sole motivations.
But in Deepfake Porn: Could you be next? MDF comes across as a man struggling with "the more moral aspect" of his work. On the one hand, he ardently claims to respect women ("100%"!) and insists this principle is reconcilable with his passion project. On the other, he has no plans to tell his wife, who would "probably be against" deepfake porn, "to put it bluntly", about his work. "I'm afraid of how it would affect her, knowing I work on something like this." He admits that "the content is actually in a grey area, and I think we're on a fine line".
Despite this ethical "indecision", MDF's website is thriving. It has amassed more than 20,000 deepfake videos of women loosely defined as "celebrities", who are divided into 23 categories that include "Cosplay", "Threesome" and "Asian Celeb". Each day, an average of 25 new videos are added by a team of deepfake porn producers. Thirteen million unique visitors view this content every month, 10,000 of whom are online at any given moment.
In these videos, the facial expressions, mannerisms, and idiosyncrasies of the deepfaked subject do not belong to the victim whose face we see. They are the creation of a male fantasy. It not only looks as though the victim is performing porn, when they never have, but also as though they are engaging in the producer's favourite sexual acts. Their identity is hijacked. When a victim sees herself embodied in the form of a porn performer, she describes it to me as a feeling of extreme dissociation. That is her face, but that is not her body.
Mr Deepfakes rakes in a high four-figure profit per month, mostly from ad revenue. He's probably earning upwards of $100,000 a year from the venture. This money goes mostly to "maintaining the servers".
Most of his effort, meanwhile, goes to imposing strict ethics on his website. Or so he claims. MDF recites his boundaries to minimise harm like a script: no porn niches that are "defamatory"; only "celebrities" allowed; the age limit is absolute; and producers must "make sure people know that these are fake videos" by ensuring "every video is watermarked". But ultimately, MDF's defence always comes down to separating the videos' potential consequences from their creators. "We're not all bad people!"
MDF claims he is simply an advocate of technological progress. He is, he says, facilitating the improvement of AI, while "other communities like Reddit wouldn't allow it". It just so happens that the best way to do this is through the "porn niche": a convenient assumption that MDF treats as an awkward but inevitable fact. The "community" forum, the section of the website where content creators discuss how to improve results, is therefore what he cares about most. It fails to generate any revenue, at least for now. But this is hardly a charitable venture. This "community" is working to develop a tool that's being used as a weapon against women.
And the ethical boundaries imposed by Mr Deepfakes fall apart on close inspection. There is no clear definition, for instance, of a "celebrity". It includes women in the "mainstream media", like those who appear in "Hollywood films", but also "social media influencers". And of course, "politicians are in the public domain", so they can be targeted too. It is, in short, any woman with a public life. Many of these female "celebrities" already receive so much sexualised publicity in the media that MDF's community think they are "fair game". Or in the words of one user on the forum: "If you plaster your face everywhere and thirst trap me into seeking out your visage, then I'm just gonna deepfake you into porn."
The majority of Mr Deepfakes' community are "probably men", he admits. These are male users being radicalised by getting to "pick and choose aspects of different girls" and digitally manipulate them into performing sexual acts.
Disturbingly, one regular poster on the site admits to deepfaking his co-workers. "Walking into work after having deepfaked these women, it did feel odd, but I just controlled my nerves. I can act like nothing is wrong, no one would suspect a thing." The video is for his eyes only, he insists; it now exists on his hard drive, which he sees as merely an extension of his imagination. Asked if he would create a custom deepfake of an ordinary woman for someone else, he replied: "From a moral standpoint, yeah, I don't think there's anything that would stop me." He went on to make a deepfake of someone after a Zoom call.
The forums on MDF's site seethe with misogyny, illustrating how utterly ineffective his ethical code is. "She didn't let me smash her during highschool so cool. ima just deepfake u on porn and masturbate to it? CHECKMATE HOE," one user writes. "It is never better when deepfake kicks in, and you get to make your dream celebrity be a mindless robot, and obey masters orders," posted another.
Even those who have built a living on the porn industry are alarmed by the havoc deepfake technology could wreak on women's lives. "The principal concern I have is how quickly the technology is evolving. It's like a runaway freight train," warns Viktor Zafirovski, a reviewer at the world's largest porn directory website. Not only is the technology increasingly accessible, but deepfakes are getting easier to create. We will reach a point when "everyone will have to be paranoid about sharing their image," says Viktor. And having made a small fortune developing this latest tool in the misogynist's arsenal, even the man at the helm of the deepfake porn industry agrees. "It will be so convincing, that eventually you can't identify what's real and what's fake," MDF says. "You know, it's scary to me as well."
Underneath this cloak of concern for women, Mr Deepfakes must know that he empowers misogynists every day. "Porn runs the world right now. And it's not something that I agree with," he says. Regardless, it's something he leverages, even as he strives to distance himself from the culture he enables.
MDF doesn't like to imagine how his wife would feel "if she found a video of herself on the internet, or maybe even a deepfake porn video of someone she knows". But there are hundreds of women who, if they visited his site, might have the kind of horrifying realisation he wouldn't want his loved ones to experience. Most frustrating of all, he admits to feeling some discomfort. "I think I need to, you know, look deep down and see what I'm okay with," he reflects. But his site remains live, and he keeps making money from it.
Mr Deepfakes' concern to make it clear that the porn videos are fakes is driven by the obvious desire not to face a libel action.
Deepfakes are perhaps even more concerning in the context of political manipulation, where an opponent can be seen to be uttering some unpopular view that he does not actually hold. Just as I automatically put the phone down when some commercial institution is allegedly calling to warn me that my security has been compromised, so we will have to get used to being sceptical about scam political videos; and, of course, some of those videos will be genuine. We are entering a world where we can no longer trust what we see and hear wherever technology is involved, as many men and women have discovered when it comes to internet-enabled romance scams that strip them of their funds.
It is a confidence trickster's paradise. The world of Big Brother that even Orwell did not envisage.
We are already at the place where we can't believe what we see and hear. A world where violent riots are reported as "mostly peaceful protests". Where a terrorist leader is described as an "austere religious scholar". Where a strange, malevolent injection is purported to be a life-saving vaccine. Where a decrepit corruptocrat who falls asleep mid-sentence is touted as the popular and effective leader of the free world.
That we live in a free world.
Agreed. And the quote, "It will be so convincing, that eventually you can't identify what's real and what's fake," is the most concerning part of it all. We already have this situation in the mainstream media. And look at the results.
The guy – or group of guys – whom the author describes as wrestling with indecision about the morality of their choices must be in the grip of a horrible cognitive dissonance. Nobody sane thinks this is right, and nobody sane thinks that the consequences of deepfakes can be right even if somehow their current usage is justified… especially considering your example of malicious political usage.
Women seem not to be too concerned, given that they have no problem being nearly naked in public. And I'm not talking about just the celebrities who flaunt themselves sexually on magazine covers and rags like The Daily Mail, or showing up at gala events in see-through fabrics; there's no end of bare buttocks at the local beach. Women of all ages wear second-skin leggings and sports bras to the supermarket. It seems every selfie-taking teenager is posting pouty pics in sexy poses. Ads for condoms, underwear, and other items formerly reserved for private life are on full display absolutely everywhere. I'm not at all surprised by this latest method of desensitizing us to outrage. It's been in our faces for years.
Sadly true.
I hate to sound like a supporter of Sharia Law, but what you are saying is true. But then, consider this: if the Taliban is going to one extreme, then we are only going to the other extreme. Why can't we be just like the people in the 80s or 90s? Sorry, but I still believe they were the best people in human history! I may be wrong, because I was only a child then!
The 80s were awesome, fun, optimistic, creative, happy and prosperous because of three very important world leaders: Ronald Reagan, Margaret Thatcher, and Pope John Paul. Look who's run the globe ever since. The Clintons' nefarious doings were largely ignored because he was forced by the Republican majority to moderate. But make no mistake, the decline began with them.
That’s absolute rubbish.
Struggling to see the correlation between individuals using their free will to dress as they please, and a complete stranger taking an individual's pictures to create explicit media in order to satisfy their sexual gratification. Sounds like you have a 'She was asking for it!' mentality.
No it’s not the moderates who are saying ‘she was asking for it’ it’s those who see women as movable sex toys who have simply found another way to assert authority over what we should be allowed to do, in the world according to such men. However, what’s empowering about dressing in a way that anyone would even think ‘she’s asking for it’ or is self respect also a thing to be thrown away?
As a young man, even if this technology had been available, I cannot imagine doing such a thing. So much of what we see today as extreme behaviour is technology driven. By that I mean the technology creates the behaviour. Except for this: someone has to create the technology first before it's sent out there into the world. That person is someone who spends a lot of hours in their room, in front of a computer, interacting with something that has no human aspects to it at all. Who and what are these people? We can see the distortion in human behaviour as a result of this technology, as a result of this person who does not relate to people in a way we regard as healthy. And yet, once it's out there it's picked up by others. But who exactly are they? Are they members of the same group as the originator of the technology? Or does the technology create them? Whoever they are, they're walking around out there pretending to be like us, but they're not. They're more like technology imitating humans.
It’s not the technology itself that is bad, as someone also spent years in front of a computer to create Excel and Photoshop, but the perverted use of it. As a society, we have collectively chosen to be unencumbered by a common moral compass (moral relativity), so anything goes now. It only goes downhill from here until the revival takes place.
Technology is not causing the behaviour; technology is only "enabling" the behaviour. Dark people have existed since the coming of Homo sapiens, because this simply is an untreatable disease, a pathology, a part of biology.
We should be concerned about these developments and stay on top of things constantly, but we should not try to control these things using public policy, because that is going to create another layer and another problem.
Like I said before, we should monitor and stay on top, this means that law enforcement authorities assisted by social behaviour experts, engineers and social scientists should continue to watch these developments and should never be complacent!
I fear “social behavior experts”.
Actually I think you are wrong.
People who would never have given a second thought to this kind of thing get tempted to take a look and then get hooked. Much like drugs.
I was listening to a barrister talking on Radio 5. When asked what had been the biggest change he had witnessed in his 25 years at the bar, he said the number of sexual offences before the court. He said that 25 years ago you would get the occasional sexual offence, but now it seems to be the most common prosecution.
But isn't it the same with violence? Numbers are right up from even the recent past. Are people hooked on violence? Once seeing it, do they then want to take part?
Maybe the rise in violence and sexual assaults is down to two things:
1. Greater willingness to report.
2. A greater number of people in our country from cultures where violence and sexual assault are seen as normal.
If they think it’s normal they wouldn’t report it so that contradicts “more willingness to report”. I think the latter is true.
Because sexual offences are now being prosecuted more, but still not enough. It's taken time for people to speak up; look at the Boy Scouts and priests, for starters. It's been going on for centuries.
"We should monitor and stay on top, this means that law enforcement authorities assisted by social behaviour experts, engineers and social scientists should continue to watch these developments and should never be complacent!"
That sounds like an objective without a strategy. "Stay on top" sounds like a good idea. But how? "Social behaviour experts" means what? "Watch these developments" and do what?
Well said.
Interesting idea. People like to say we're nowhere close to AI which could be called intelligent by human standards. But who says humans will ultimately own these standards? Looking at this whole situation differently, suppose what we are developing now are the AI tools and weapons which will be used to control, if not annihilate, us at some point in the future by an AI we don't recognize as human but which disdains to care.
“An AI we don’t recognize as human” or an AI that we do recognize as human but disdains to care.
Even if you are not religious, I am sure you would agree, in this case, with:
Matthew 5:27 "You have heard that it was said, 'You shall not commit adultery.' 28 But I tell you that anyone who looks at a woman lustfully has already committed adultery with her in his heart."
Can you imagine how these men will treat the women they meet that they watch in the deep fakes?
It is sick and soul-destroying.
Probably no different than they do now. Jimmy Carter said he’d committed adultery in his heart.
This is impossible to prevent, no matter how much moral outrage is expressed, and it will only become more common. Pretty soon every teenager will have access. New laws won't, can't, prevent it.
When I hear about these sorts of behaviours the first two questions that come to my mind are:
what sort of mind thinks this up?
why do they think that it’s alright to do this?
Sick minds. There is no point in asking such questions. Humans have always been sick since the time they spawned on this earth!
I'm afraid I'm objecting more and more to "humans" or "people" when men are by far the majority of porn consumers and commit 98% of sexual crimes (of which the victims are 80% female)!
Absolutely, Alison. Saying “people” deflects from the real culprits.
Human minds … for example, the same minds that think up really, really big bombs and then think it’s alright to give them to politicians to drop on big Japanese cities. By comparison, this stuff is just puerile harmless fun.
They don’t think it’s alright, but I’m sure they rationalize it because they make a hundred thousand a year, apparently.
This is not very easy to stop under existing rules, at least when it comes to public domain images — that is, images that are already freely available on the public internet for free use. I mean, you can ban posting of it on your platforms (which is already done), but people are generally allowed to use public domain images (including things you post on sites like Instagram and Twitter), and photos they take themselves in public places, however they wish as long as they are not defamatory or libellous.
What that generally means in the context of deepfakes is that (1) images you take in private places without permission are very problematic and likely illegal under existing laws (which is probably why he has the "celebrity" rule, meaning people whose images are already "out there" in the public domain; the talk about people using images of coworkers taken at work and so on describes things likely already illegal under existing laws in many places), and (2) fake images and videos created from these public domain images (or personally taken photographs in public places) are problematic if they are not clearly identified as being fake, because otherwise they can potentially defame or libel the person portrayed by "stating" something untrue about them (since they never actually did what the video portrays). That's why he has the rules he does.
Legislation could be passed specifically banning deepfakes in order to quash the kind of niche site that this article is talking about, and that may be a good thing, but we should be clear that the “mainstream” platforms that have explicit images — like reddit and Twitter — already ban them, as well as the large porn “tube” sites (although, as we know, the porn tube sites have so much posting volume that their bans are often less effective than they should be). So while incremental legislation may be helpful, it isn’t likely to have a massive impact on something that is already a niche phenomenon.
A bigger issue is that even with a ban, it's very hard to imagine that being enforced in any way that has a significant impact. Much of this is done privately and shared on private websites already because the mainstream doesn't permit the images. So greater enforcement will likely only send the creators to the dark web or other places, or leave them simply trading their creations among each other privately. Something similar has happened with child pornography, unfortunately, which is why it continues to proliferate despite being very illegal, and almost universally loathed. So, even with a deep loathing and an enthusiastic enforcement regime, as we see for child porn, enforcement is difficult. For something like deepfakes involving images someone is placing in the public domain on Instagram? I doubt we would have very substantial enforcement of that, again, provided that the images are being circulated privately and are not being monetized.
So, yes, a troubling development, as are many that are coming with the increased capabilities of machine learning and AI. In the case of this one, I don’t think that new laws (which may make sense to pass regardless) will have much impact at the end of the day given the way this works currently.
Seems like removing the protections from lawsuit that these platforms enjoy would go a long way towards sorting this out.
But, but… this is hardly a novel issue. People and businesses have been selling tweaked images of (mostly) beautiful people for years.
I wonder how much of the furore is generated because it is now somebody else cashing in on those images?
Your response is more disgusting than the issue itself!
I think the furore has been generated because of the intentions behind the falsified images. Simply tweaking a picture to make someone appear ‘more beautiful’ cannot be compared to editing a complete stranger onto a falsified video for sexual gratification or political aims.
PS, lesson 1: stay away from the interweb, all you sad creatures.
Politicians could be brought down by a faked paedophilia video. Who's to know it's not real? And the scene is set for blackmailing of all kinds. The sky's the limit.
So…? Is this a major issue worthy of concern and discussion on this medium? Does anyone seriously believe, given the serious threats facing our daily lives, that this warrants a scintilla of thought? Eco-zealot tedious sandaloids are bad enough…
It is important, because although at the moment it is pathetic perverts giving themselves and their friends something to w*nk over, it could just as easily be someone producing deep fake images of another person, for example, engaged in child abuse, selling drugs or any other crime that one person wanted to frame another for.
And just as easy to deny, too, because it's a "deepfake" image.
Indeed, genuine video evidence can be denounced in the same way, so that we are thrown back on our preconceptions. Awash in a world of irrefutable lies, we can believe nothing.
I hadn’t thought of that, but yes, this could be another consequence.
Indeed, and it potentially provides a blackmailer's charter. Don't bother to find out a discreditable secret about a celebrity; just create your own video of the celebrity or non-celebrity engaged in some nefarious act that will be hard to refute but will get him or her sacked, or their career and life ruined. From time to time you hear of someone committing suicide because some lowlife has threatened on the internet to expose them for watching porn unless they send them bitcoin.
Deepfake simply ups the pressure, if the technology has developed to the point where reality and fakery can't be distinguished.
Or it may have the opposite effect; once everyone becomes accustomed to fakery indistinguishable from reality, it will no longer destroy lives in the same way.
Equally concerning, as video evidence of crimes will not destroy lives that should be destroyed. Will a jury believe the evidence of their own eyes?
True. But then it won't destroy lives because all acts once ruled as criminal will no longer carry the same moral opprobrium, or shock, as they did. Some rabbit hole.
Exactly, this is just the tip of the iceberg.
One issue doesn’t preclude another.