
Mr Deepfakes can make you a porn star

His website will hijack your identity

Scarlett Johansson is one of hundreds of celebrities who've been undressed by deepfake technology. Credit: Laurent KOFFEL/Gamma-Rapho via Getty Images


October 24, 2022   5 mins

It would be naive to assume that, because you’ve never performed in a porn video, you will never appear in one. In fact, these days, anyone with access to an image of your face can, in a matter of seconds, produce an extremely convincing video in which you appear as a porn star.

One man who has done this countless times, to countless women, without their consent, is one Mr Deepfakes. As the founder of the most prominent deepfake porn website in existence, he has chosen to remain anonymous. He built the burgeoning community “from scratch” as a side hustle, after deepfake porn was banned from Reddit in 2018. It is, he claims, a place where “users affected by the ban could go”, to ensure the technology wasn’t unfairly “squashed”. MDF, as he styles himself, cites a commitment to free speech, and a desire to advance machine learning, as his sole motivations.

But in Deepfake Porn: Could you be next? MDF comes across as a man struggling with “the more moral aspect” of his work. On the one hand, he ardently claims to respect women (“100%”!) and insists this principle is reconcilable with his passion project. On the other, he has no plans to tell his wife — who would “probably be against” deepfake porn, “to put it bluntly” — about his work. “I’m afraid of how it would affect her, knowing I work on something like this.” He admits that “the content is actually in a grey area, and I think we’re on a fine line”.

Despite this ethical “indecision”, MDF’s website is thriving. It has amassed more than 20,000 deepfake videos of women loosely defined as “celebrities”, who are divided into 23 categories that include “Cosplay”, “Threesome” and “Asian Celeb”. Each day, an average of 25 new videos are added by a team of deepfake porn producers. There are 13 million original visitors who view this content every month, 10,000 of whom are online at any given moment.

In these videos, the facial expressions, mannerisms, and idiosyncrasies of the deepfaked subject do not belong to the victim whose face we see. They are the creation of a male fantasy. It not only looks like the victim is performing porn, when they never have, but also that they are engaging in the producer’s favourite sexual acts. Their identity is hijacked. When a victim sees herself embodied in the form of a porn performer, she describes it to me as a feeling of extreme disassociation. That is her face, but that is not her body.

Mr Deepfakes rakes in a high four-figure profit per month, mostly from ad revenue. He’s probably earning upwards of $100,000 a year from the venture. This money goes mostly to “maintaining the servers”.

Most of his effort, meanwhile, goes to imposing strict ethics on his website. Or so he claims. MDF recites his boundaries to minimise harm like a script: no porn niches that are “defamatory”; only “celebrities” allowed; the age limit is absolute; and producers must “make sure people know that these are fake videos” by ensuring “every video is watermarked”. But ultimately, MDF’s defence always comes down to separating the videos’ potential consequences from their creators. “We’re not all bad people!”

MDF claims he is simply an advocate of technological progress. He is, he says, facilitating the improvement of AI, while “other communities like Reddit wouldn’t allow it”. It just so happens that the best way to do this is through the “porn niche”: a convenient assumption that MDF treats as an awkward but inevitable fact. The “community” forum, the section of the website where content creators discuss how to improve results, is therefore what he cares about most. It fails to generate any revenue, at least for now. But this is hardly a charitable venture. This “community” is working to develop a tool that’s being used as a weapon against women.

And the ethical boundaries imposed by Mr Deepfakes fall apart on close inspection. There is no clear definition, for instance, of a “celebrity”. It includes women in the “mainstream media”, like those who appear in “Hollywood films”, but also “social media influencers”. And of course, “politicians are in the public domain”, so they can be targeted too. It is, in short, any woman with a public life. Many of these female “celebrities” already receive so much sexualised publicity in the media that MDF’s community think they are “fair game”. Or in the words of one user on the forum: “If you plaster your face everywhere and thirst trap me into seeking out your visage, then I’m just gonna deepfake you into porn.”

The majority of Mr Deepfakes’ community are “probably men”, he admits: male users who are becoming radicalised by getting to “pick and choose aspects of different girls”, and digitally manipulating them into performing sexual acts.

Disturbingly, one regular poster on the site admits to deepfaking his co-workers. “Walking into work after having deepfaked these women, it did feel odd, but I just controlled my nerves. I can act like nothing is wrong, no one would suspect a thing.” The video is for his eyes only, he insists; it now exists on his hard drive, which he sees as merely an extension of his imagination. Asked if he would create a custom deepfake of an ordinary woman for someone else, he replied: “From a moral standpoint, yeah, I don’t think there’s anything that would stop me.” He went on to make a deepfake of someone after a Zoom call.

The forums on MDF’s site seethe with misogyny, illustrating how utterly ineffective his ethical code is. “She didn’t let me smash her during highschool so cool. ima just deepfake u on porn and masturbate to it? CHECKMATE HOE,” one user writes. “It is never better when deepfake kicks in, and you get to make your dream celebrity be a mindless robot, and obey masters orders,” posted another.

Even those who have built a living on the porn industry are alarmed by the havoc deepfake technology could wreak on women’s lives. “The principal concern I have is how quickly the technology is evolving. It’s like a runaway freight train,” warns Viktor Zafirovski, a reviewer at the world’s largest porn directory website. Not only is the technology increasingly accessible, but deepfakes are getting easier to create. We will reach a point when “everyone will have to be paranoid about sharing their image,” says Viktor. And having made a small fortune developing this latest tool in the misogynist’s arsenal, even the man at the helm of the deepfake porn industry agrees. “It will be so convincing, that eventually you can’t identify what’s real and what’s fake,” MDF says. “You know, it’s scary to me as well.”

Underneath this cloak of concern for women, Mr Deepfakes must know that he empowers misogynists every day. “Porn runs the world right now. And it’s not something that I agree with,” he says. Regardless, it’s something he leverages, even as he strives to distance himself from the culture he enables.

MDF doesn’t like to imagine how his wife would feel “if she found a video of herself on the internet — or maybe even a deepfake porn video of someone she knows”. But there are hundreds of women who, if they visited his site, might have the kind of horrifying realisation he wouldn’t want his loved ones to experience. The most frustrating thing is, he admits to feeling some discomfort. “I think I need to, you know, look deep down and see what I’m okay with,” he reflects. But his site remains live, and he keeps making money from it.


Imogen Serwotka is a documentary producer currently with Swan Films. She previously worked on investigations for BBC Current Affairs.


51 Comments
Jeremy Bray
1 year ago

Mr Deepfakes’ concern to make it clear that the porn videos are fakes is driven by the obvious desire not to face a libel action.

Deepfakes are perhaps even more concerning in the context of political manipulation, where an opponent can be seen to be uttering some unpopular view that he does not actually hold. Just as I automatically put the phone down when some commercial institution is allegedly calling to warn me that my security has been compromised, so we will have to get used to being sceptical about scam political videos; and, of course, some of those videos will be genuine. We are entering a world where we can no longer trust what we see and hear where technology is involved, as many men and women have discovered when it comes to internet-enabled romance that strips them of their funds.

It is a confidence trickster’s paradise. The world of Big Brother that even Orwell did not envisage.

Allison Barrows
1 year ago
Reply to  Jeremy Bray

We are already at the place where we can’t believe what we see and hear. A world where violent riots are reported as “mostly peaceful protests”. Where a terrorist leader is described as an “austere religious scholar”. Where a strange, malevolent injection is purported to be a life-saving vaccine. Where a decrepit corruptocrat who falls asleep mid-sentence is touted as the popular and effective leader of the free world.
That we live in a free world.

Warren Trees
1 year ago
Reply to  Jeremy Bray

Agreed. And the quote, “It will be so convincing, that eventually you can’t identify what’s real and what’s fake,” is the most concerning part of it all. We already have this situation in the mainstream media. And look at the results.

Sam Wilson
1 year ago
Reply to  Jeremy Bray

The guy – or group of guys – whom the author describes as wrestling with indecision about the morality of their choices must be in the grip of a horrible cognitive dissonance. Nobody sane thinks this is right, and nobody sane thinks that the consequences of deepfakes can be right even if somehow their current usage is justified… especially on considering your example of malicious political usage.

Allison Barrows
1 year ago

Women seem not to be too concerned, given that they have no problem being nearly naked in public. And I’m not talking about just the celebrities who flaunt themselves sexually on magazine covers and rags like The Daily Mail, or showing up at gala events in see-through fabrics; there’s no end of bare buttocks at the local beach. Women of all ages wear second-skin leggings and sports bras to the supermarket. It seems every selfie-taking teenager is posting pouty pics in sexy poses. Ads for condoms, underwear, and other items formerly reserved for private life are on full display absolutely everywhere. I’m not at all surprised by this latest method of desensitizing us to outrage. It’s been in our faces for years.

Warren Trees
1 year ago

Sadly true.

Justice Report
1 year ago

I hate to sound like a supporter of Sharia Law, but what you are saying is true. But then, consider this: if the Taliban is going to one extreme, then we are only going to the other extreme. Why can’t we be just like the people in the 80s or 90s? Sorry, but I still believe that they are the best people in human history! I may be wrong because I was only a child then!

Allison Barrows
1 year ago
Reply to  Justice Report

The 80s were awesome, fun, optimistic, creative, happy and prosperous because of three very important world leaders: Ronald Reagan, Margaret Thatcher, and Pope John Paul. Look who’s run the globe ever since. The Clintons’ nefarious doings were largely ignored because he was forced by the Republican majority to moderate. But make no mistake, the decline began with them.

Clare Knight
1 year ago

That’s absolute rubbish.

ellie o
1 year ago

Struggling to see the correlation between individuals using their free will to dress as they please, and a complete stranger taking an individual’s pictures to create explicit media in order to satisfy their sexual gratification. Sounds like you have a ‘She was asking for it!’ mentality.

Jane Tomlinson
1 year ago
Reply to  ellie o

No, it’s not the moderates who are saying ‘she was asking for it’; it’s those who see women as movable sex toys who have simply found another way to assert authority over what we should be allowed to do, in the world according to such men. However, what’s empowering about dressing in a way that anyone would even think ‘she’s asking for it’? Or is self-respect also a thing to be thrown away?

Brett H
1 year ago

As a young man, even if this technology was available, I could not imagine doing such a thing. So much of what we see today as extreme behaviour is technology driven. By that I mean the technology creates the behaviour. Except for this: someone has to create the technology first before it’s sent out there into the world. That person is someone who spends a lot of hours in their room, in front of a computer, interacting with something that has no human aspects to it at all. Who and what are these people? We can see the distortion in human behaviour as a result of this technology, as a result of this person who does not relate to people in a way we regard as healthy. And yet, once it’s out there it’s picked up by others. But who exactly are they? Are they members of the same group as the originator of the technology? Or does the technology create them? Whoever they are, they’re walking around out there pretending to be like us, but they’re not. They’re more like technology imitating humans.

Warren Trees
1 year ago
Reply to  Brett H

It’s not the technology itself that is bad – someone also spent years in front of a computer to create Excel and Photoshop – but the perverted use of it. As a society, we have collectively chosen to be unencumbered by a common moral compass (moral relativism), so anything goes now. It only goes downhill from here until the revival takes place.

Justice Report
1 year ago
Reply to  Brett H

Technology is not causing the behaviour; technology is only “enabling” the behaviour. Dark people have existed since the coming of Homo sapiens, because theirs is simply an untreatable disease, a pathology, a biology.
We should be concerned about these developments and stay on top of things constantly, but we should not try to control these things using public policy, because that is going to create another layer and another problem.
Like I said before, we should monitor and stay on top; this means that law enforcement authorities, assisted by social behaviour experts, engineers and social scientists, should continue to watch these developments and should never be complacent!

Jeff Cunningham
1 year ago
Reply to  Justice Report

I fear “social behavior experts”.

Ethniciodo Rodenydo
1 year ago
Reply to  Justice Report

Actually I think you are wrong.
People who would never have given a second thought to this kind of thing get tempted to take a look and then get hooked. Much like drugs.
I was listening to a barrister talking on Radio 5. When asked what had been the biggest change he had witnessed in his 25 years at the bar, he said the number of sexual offences before the court. He said that 25 years ago you would get the occasional sexual offence, but now it seems to be the most common prosecution.

Brett H
1 year ago

But isn’t it the same with violence? Numbers are right up from even the recent past. Are people hooked on violence? Once seeing it do they then want to take part?

Rob N
1 year ago

Maybe the rise in violence and sexual assaults is down to two things:
1. a greater willingness to report;
2. a greater number of people in our country from a culture where violence and sexual assault are seen as normal.

Clare Knight
1 year ago
Reply to  Rob N

If they think it’s normal, they wouldn’t report it, so that contradicts “more willingness to report”. I think the latter is true.

Clare Knight
1 year ago

Because sexual offences are now being prosecuted more, but still not enough. It’s taken time for people to speak up; look at the Boy Scouts and priests for starters. It’s been going on for centuries.

Brett H
1 year ago
Reply to  Justice Report

“we should monitor and stay on top, this means that law enforcement authorities assisted by social behaviour experts, engineers and social scientists should continue to watch these developments and should never be complacent!”
That sounds like an objective without a strategy. “Stay on top” sounds like a good idea. But how? “Social behaviour experts” means what? “Watch these developments” and do what?

Clare Knight
1 year ago
Reply to  Justice Report

Well said.

Jeff Cunningham
1 year ago
Reply to  Brett H

Interesting idea. People like to say we’re nowhere close to AI which could be called intelligent by human standards. But who says humans will ultimately own these standards? Looking at this whole situation differently, suppose what we are developing now are the AI tools and weapons which will be used to control if not annihilate us at some point in the future by an AI we don’t recognize as human but which disdains to care.

Clare Knight
1 year ago

“An AI we don’t recognize as human” or an AI that we do recognize as human but disdains to care.

Dave Corby
1 year ago

Even if you are not religious I am sure you would agree, in this case, with:
Matthew 5:27-28: “You have heard that it was said, ‘You shall not commit adultery.’ But I tell you that anyone who looks at a woman lustfully has already committed adultery with her in his heart.”
Can you imagine how these men will treat the women they meet, whom they watch in the deepfakes?
It is sick and soul-destroying.

Clare Knight
1 year ago
Reply to  Dave Corby

Probably no different than they do now. Jimmy Carter said he’d committed adultery in his heart.

William Shaw
1 year ago

This is impossible to prevent, no matter how much moral outrage is expressed, and it will only become more common. Pretty soon every teenager will have access. New laws won’t, can’t, prevent it.

Linda Hutchinson
1 year ago

When I hear about these sorts of behaviours the first two questions that come to my mind are:
what sort of mind thinks this up?
why do they think that it’s alright to do this?

Justice Report
1 year ago

Sick minds. There is no point in asking such questions. Humans have always been sick since the time they spawned on this earth!

Alison Wren
1 year ago
Reply to  Justice Report

I’m afraid I’m objecting more and more to “humans” or “people” when men are by far the majority of porn consumers, and commit 98% of sexual crimes (of which the victims are 80% female)!

Clare Knight
1 year ago
Reply to  Alison Wren

Absolutely, Alison. Saying “people” deflects from the real culprits.

Gordon Black
1 year ago

Human minds … for example, the same minds that think up really, really big bombs and then think it’s alright to give them to politicians to drop on big Japanese cities. By comparison, this stuff is just puerile harmless fun.

Clare Knight
1 year ago

They don’t think it’s alright, but I’m sure they rationalize it because they make a hundred thousand a year, apparently.

Brendan Ross
1 year ago

This is not very easy to stop under existing rules, at least when it comes to public domain images — that is, images that are already freely available on the public internet for free use. I mean, you can ban posting of it on your platforms (which is already done), but people are generally allowed to use public domain images (including things you post on sites like Instagram and Twitter), and photos they take themselves in public places, however they wish as long as they are not defamatory or libellous.
What that generally means in the context of deepfakes is that (1) images you take in private places without permission are very problematic and likely illegal under existing laws (which is probably why he has the “celebrity” rule meaning people whose images are already “out there” in the public domain, and the talk about people using images of coworkers taken at work and so on are likely already illegal under existing laws in many places) and (2) fake images and videos created from these public domain images (or personally taken photographs in public places) are problematic if they are not clearly identified as being fake, because otherwise they can potentially defame or libel the person portrayed by “stating” something untrue about them (since they never actually did what the video portrays). That’s why he has the rules he does.
Legislation could be passed specifically banning deepfakes in order to quash the kind of niche site that this article is talking about, and that may be a good thing, but we should be clear that the “mainstream” platforms that have explicit images — like reddit and Twitter — already ban them, as well as the large porn “tube” sites (although, as we know, the porn tube sites have so much posting volume that their bans are often less effective than they should be). So while incremental legislation may be helpful, it isn’t likely to have a massive impact on something that is already a niche phenomenon.
A bigger issue is that even with a ban, it’s very hard to imagine that being enforced in any way that has a significant impact. Much of this is done privately and shared on private websites already because the mainstream doesn’t permit the images. So, likely some greater enforcement will only send the creators to the darkweb or other places, or simply trading their creations among each other privately. Something similar has happened with child pornography, unfortunately, which is why it continues to proliferate regardless of being very illegal, and almost universally loathed. So, even with a deep loathing and an enthusiastic enforcement regime, as we see for child porn, enforcement is difficult. For something like deepfakes involving images someone is placing in the public domain on Instagram? I doubt we would have very substantial enforcement of that, again, provided that the images are being circulated privately and are not being monetized.
So, yes, a troubling development, as are many that are coming with the increased capabilities of machine learning and AI. In the case of this one, I don’t think that new laws (which may make sense to pass regardless) will have much impact at the end of the day given the way this works currently.

Jeff Cunningham
1 year ago
Reply to  Brendan Ross

Seems like removing the protections from lawsuit that these platforms enjoy would go a long way towards sorting this out.

AC Harper
1 year ago

But, but… this is hardly a novel issue. People and businesses have been selling tweaked images of (mostly) beautiful people for years.
I wonder how much of the furore is generated because it is now somebody else cashing in on those images?

Justice Report
1 year ago
Reply to  AC Harper

Your response is more disgusting than the issue itself!

ellie o
1 year ago
Reply to  AC Harper

I think the furore has been generated because of the intentions behind the falsified images. Simply tweaking a picture to make someone appear ‘more beautiful’ cannot be compared to editing a complete stranger onto a falsified video for sexual gratification or political aims.

Nicky Samengo-Turner
1 year ago

ps- lesson1: stay away from the interweb all you sad creatures.

Clare Knight
1 year ago

Politicians could be brought down by a faked paedophilia video. Who’s to know it’s not real? And the scene is set for blackmail of all kinds. The sky’s the limit.

Nicky Samengo-Turner
1 year ago

So..? This is a major issue worthy of concern and discussion on this medium? Does anyone seriously believe, given the serious threats facing our daily lives, that this warrants a scintilla of thought? eco zealot tedio sandaloids are bad enough…

Linda Hutchinson
1 year ago

It is important, because although at the moment it is pathetic perverts giving themselves and their friends something to w*nk over, it could just as easily be someone producing deep fake images of another person, for example, engaged in child abuse, selling drugs or any other crime that one person wanted to frame another for.

Brett H
1 year ago

And just as easy to deny, too, because it’s a “deepfake” image.

Jeremy Bray
1 year ago
Reply to  Brett H

Indeed, genuine video evidence can be denounced in the same way, so that we are thrown back on our preconceptions. Awash in a world of irrefutable lies, we can believe nothing.

Linda Hutchinson
1 year ago
Reply to  Brett H

I hadn’t thought of that, but yes, this could be another consequence.

Jeremy Bray
1 year ago

Indeed, and it potentially provides a blackmailer’s charter. Don’t bother to find out a discreditable secret of a celebrity; just create your own video of the celebrity or non-celebrity engaged in some nefarious act that will be hard to refute but will get him or her sacked, or their careers and lives ruined. From time to time you hear of someone committing suicide because some lowlife has threatened on the internet to expose them for watching porn unless they send them bitcoin.
Deepfake simply ups the pressure if it has developed to a state where reality and fake can’t be distinguished.

Jeff Cunningham
1 year ago
Reply to  Jeremy Bray

Or it may have the opposite effect; once everyone becomes accustomed to fakery indistinguishable from reality, it will no longer destroy lives in the same way.

Jeremy Bray
1 year ago

Equally concerning, as video evidence of crimes will not destroy lives that should be destroyed. Will a jury believe the evidence of their own eyes?

Brett H
1 year ago

True. But then it won’t destroy lives because all acts once ruled as criminal will no longer carry the same moral opprobrium, or shock, as they did. Some rabbit hole.

Clare Knight
1 year ago

Exactly, this is just the tip of the iceberg.

Clare Knight
1 year ago

One issue doesn’t preclude another.