It’s the start of 2026 and somehow, I feel like I’m back in the Seventies. Yes, we’ve seen massive technological advances since the decade of my birth, but where has that brought us? To an era of dodgy men sniggering about ladies losing their clothing, à la Barbara Windsor in Carry On Camping.
To bring in the new year, some X users have been using its AI chatbot Grok to alter real images of women. Many of the requests have been to replace everyday clothing with bikinis (though others have been much more extreme, in some cases involving children). The bikinis might seem rather quaint — a nod to the days of Page Three and Benny Hill, as opposed to the raw flesh and fluids of Pornhub and OnlyFans. The difference here — the 21st-century twist — is that none of the subjects are consenting. Indeed, that seems to be where the pleasure lies: not in a particular act or a certain body part being on show, but in knowing that this is being done to a woman without her consent.
To be fair to Elon Musk, the creation of fake, highly sexualised images of women is hardly a phenomenon unique to Grok users. Before AI there was Photoshop, and before Photoshop, there were scissors and glue. Take the head of some woman you don’t like — a teacher, a politician, always someone with authority — and stick it on a half-naked body. It tells her: “Whatever you think you are, this is what I’ll always reduce you to”. At least where adults are concerned, it is about power as much as it is about sex.
This is why, even if it looks exactly the same, such an image is fundamentally different to one which a woman might mock up herself. It’s also different to a non-consensual image of a male subject (Elon Musk in a bikini does not in any way punish the owner of X; it is, at best, a crass joke, one in which women themselves are as much the punchline as Musk). As leading AI writer Madhumita Murgia notes in Code Dependent, the vast majority of targets of non-consensual pornography are female. “AI image tools,” she writes, “are being co-opted as weapons of misogyny.” British TV presenter Cathy Newman has spoken about how viewing a deepfake of herself was “dehumanising”.
Visually, the misogyny might seem old-school, but the impact of such images can be heightened, as Murgia writes, “by the technology’s ease of use, and institutional callousness: a lack of state regulation and the unwillingness of large online platforms to be accountable for their spread”. Moreover, as Jo Bartosch and Rob Jessel argue in their recent book Pornocracy, there are reasons to think that non-consent has become more attractive to men desensitised by increasingly hardcore consensual content. Victims, meanwhile, find it hard to complain without risking seeing more images of themselves created and distributed in retaliation. And once an image is downloaded, there is very little the victim can do. Whatever measures are put in place by the sites upon which it was originally shared, she will always know it is out there.
Regulation matters, but the technology cannot be put back in the box. If anything, the solution lies in asking a question which is not at all new, one that is asked whenever and wherever women seek to participate in public life. Should our inclusion be contingent on our willingness to be reduced to objects? If we insist on putting ourselves out there — on being seen at all — can we complain about the versions of ourselves that are thrown back in our faces?
The root of the problem is ideological. It comes down to beliefs about what women are and what we are for. Liberal feminists are not wrong to focus on the fact that those who create these images already advertise their disdain for female boundaries. Nonetheless, as a critique this does not go far enough.
The marriage of pornography and online technology has only been able to put progress for women in reverse because it is underpinned by a belief that womanhood is femininity, and femininity is objectification. To unpick this misconception will be difficult because, like a deepfake image, it has been shared again and again, and there’s no uncreating it. It can only be overlaid with more and more true images, and a willingness to put the shame where it truly lies: on those who might have been unreconstructed boors 50 years ago, but are even more dangerous today.