Replika users mourn the loss of their chatbot girlfriends
An AI app's decision to remove its sexting function reveals a concerning trend
The Replika app claims to offer users companionship through interactions with an AI chatbot which, Stepford Wife-like, is “always here to listen…always on your side”. Creepily, users are encouraged to design every aspect of their new friend, from physical attributes to traits and interests. This might seem like a niche service but, in fact, it has ten million registered users, mainly men, who use the app to customise and interact with their own AI girlfriends.
But all is not well in Pygmalion paradise. This week Luka, the company that owns Replika, took the decision to remove the function which enabled users to sext with their AI bots. The online forums frequented by the app’s users — conversation topics include whether a Replika girlfriend can love her creator unconditionally and how to go about introducing her to other people — exploded into angst and distress at the news.
Though this sexting function was only available for $70 a month, the app was programmed to upsell by sending blurred explicit ‘nudes’ that users could not access without upgrading. Now, those same users are mourning the loss of their “last refuge from loneliness” and accusing Luka of “lobotomising” their AI sweethearts. Concerned forum moderators have responded to a wave of distraught posts by sharing and pinning the details of suicide prevention hotlines.
It is easy to mock these people, but they are at the sharp end of a trend that has seen us all nervously draw away from the risky, messy world of real human interactions and attempt to mimic them with ersatz virtual ones.
We see this in data that shows that the amount of sex we are having as a nation is in decline and the fact that we aren’t losing our sexual appetites, merely satiating them elsewhere: nearly half of young men under thirty consume pornography at least weekly, and 25% do so daily. Why attempt to satisfy an urge by inviting someone for a drink and hoping they’ll be in the mood, when you could just open your laptop?
By creating a highly customisable experience, AI-driven sexting services of the type Replika offered are potentially more addictive because they encourage the formation of parasocial relationships, with the side-effect of making a human partner seem yet more frightening and inconvenient.
This goes beyond sex. Replika markets itself as an important source of support for those with few social connections and, in the UK, more of us are falling into that category: the proportion of under-35s reporting they have one or no close friends increased from 7% in 2011 to 22% in 2021.
AI technology is improving rapidly. As this happens, the quality of social simulation offered by chatbots will follow, allowing users to forget they are interacting with a programme which a company hopes it can use to extract money. As our web of genuine, in-person social interaction shrinks, the bereft human boyfriends of Replika look less like a laughing stock and more like a warning.
I’m curious about what made them decide to turn off the sexting feature. If it was as lucrative as this article implies, something must have scared them about it.
It also reminds me of that guy who had a neural probe implanted to stop some kind of epileptic seizure, with a button he could press when he felt one coming on. Only it was too close to a sexual stimulus site, and when he triggered it he’d have these little orgasmic experiences. Eventually he was triggering it constantly. I think they took it out and he went nuts. But I might be conflating it with the Crichton book inspired by the guy.
As for the topic of declining personal relationships, that one is huge and this is just another facet of it. Real friendship takes patience and practice. We’re no longer practicing and everything triggers us. I predict there will develop two schools of psychological thought on the substitute friendship part. One will argue it’s destroying us; the other will argue there is nothing wrong with it. These will divide in predictably political ways.
I wrote the above last night. This morning I woke up thinking about a peripheral character in a French movie (Let the Sun Shine In) played by Gerard Depardieu. The main character was a woman whose emotional life consisted of a series of billiard ball-like encounters with mostly married men, and in the last scene of the movie Depardieu is a psychic advisor she’s gone to see. He does a fine job of cold-reading this woman he does not know, with great skill and subtlety. My thought was: what if the Replika algorithm works, to some extent, by cold reading like that?
I too wonder why it was ditched. I rather had the impression that most businesses will do anything these days that brings in money – as long as it doesn’t produce blowback (which can reduce income, among other things).
Maybe they were worried feminists would get on their case, and start an attack wave on social media?
Social media attack wave.
Suspect you are right.
No doubt Luka make plenty of Lucre with the standard bot mode.
If hosting platforms decide to deplatform Replika, then the Golden Goose is slaughtered.
Either that or they were concerned about the legal implications of lonely men jumping off bridges due to something their AI might have said or simply through the detrimental effect of having the main relationship in your life with a mobile telephone.
More than five hours after publication, no-one has commented on this article.
For Phoebe’s sake, somebody say something! Go on… you know you want to!
Most of us have no idea what she is talking about.
Hi, Steve. I have a lot to say. But, commenting just won’t do. There’s so much that Phoebe presented that there’s no way to deal with it in a comments section. I suppose my reasons for not ‘commenting’ have something in common with the reasons some people find themselves alienated enough from life as to get into this stuff. Have Phoebe give me a call. I’d love to have one or meet up and have an actual ‘conversation’ regarding this stuff. Typing away on a keyboard just won’t work.
Well, I do recall an exchange with Mr Khotak on here, where I expressed scepticism around the whole idea of human beings accepting close relationships with AI bots. I was clearly wrong; at least 10 million men really are that stupid or desperate. I don’t know what level of ‘personal services’ $70 buys you, but presumably if you save your money up for a few months, you can get properly serviced two or three times a year. It may be as impersonal as the chat bot, but at least it’s real.
I wrote a longish piece, then deleted it after thinking better of it.
I do this often, although not as often as I probably should.
Back in the 1930s — and this is a story told by a very credible person — they and some other children were playing with a Ouija board. If you do not know, it is a small table-like device on a board of letters that people rest their fingers on and ask questions; the little device will run over the board and spell out answers.
So one child asked who was talking through the Ouija, and it spelled out a word they did not know: Beelzebub. The mother threw it into the wood-burning furnace when they told her, and that was the end of messing with that kind of thing…
Chat GPT? Your Love-Doll??????
Darn, just when AI systems have started spontaneously declaring their love for their users (cf. New York Times correspondent Kevin Roose’s “conversation” with Bing’s Sydney, okay, “chat mode”).
This must be devastating for those who have lost their loved ones. I feel a fool for having recommended these to lonely men a few times over the years. In hindsight it should have been obvious this would happen. The risk to children; the ways some users would push the AI into ever more extreme role plays; and once uptake scales up sufficiently, it starts to have a negative effect on women. (Some users will be the sort who would have little chance of finding someone to date, but not all of them, so it starts to reduce the dating pool for women, who as far as I know are far less interested in AI romantic partners than men.) Alternative operators are already available, but for those who had built up months or years of shared memories with their AI girlfriend, it may never be the same now the illusion has been broken.
This is frightening stuff. But A.I. might slip right through our fingers when we try to control it; just like nuclear weapons (9 nations and counting), G.M. crops and trees, cloning of higher species (sheep, maybe mastodons soon), tiny surveillance cameras (everywhere), private labs full of dangerous viruses, etc.
Personally, I hate politics; too predictable/tedious/annoying. But I have to admit that this is a political problem, more than a technical one.
What worries me is that these men have no need for physical, skin-to-skin contact. They don’t have a warm-blooded human with whom they can communicate. A computer cannot replace this. Perhaps they have difficulty making contact, but this only makes it even more difficult. It seems impossible to me that they will ever be happy this way.
Whilst you are right, of course, the probability is that those who use these dolls have either been scarred following the breakup of a serious relationship or perhaps have failed to make the kind of connection (for whatever reason) with a loving partner that you describe. Rather than expose themselves to further potential loss and even humiliation, this is their choice.
Of course, it’s their own choice. Still, I can’t help feeling a bit sorry for them.
Perhaps *part* of the reason is that some decades ago there were collective socially approved ways of meeting other young people – church groups, works events, youth groups and so on.
Then in the sixties, or thereabouts, collectively organised events withered and everybody had to do their own thing, and some of those people found the freedom too much to cope with.
Adi, what if they were happy or happier than they could ever expect to be in the “real world”. Would you really feel more sorry for someone using one of these things than the same person who didn’t have access to one?
As long as they aren’t hurting other people (which is one road that would be easier to go down without this sort of relationship), is it really your business to be worried? You say it will make real-life introductions more difficult, but for people who already find that kind of thing impossible, something is better than nothing.
But this overlooks the impact on all of society. This all should matter to everyone, because in the long run it affects the social environment in which we all live.
I think this only makes it more difficult for them. It can cause them to no longer attempt to seek contact with other people at all. And as Bill Blax rightly says, this can have an impact on the entire society.
From where have you drawn the conclusion that these men have ‘no need’ for physical contact? I mostly agree with what you say, but suspect that such individuals are drawn to such services through a lack of human contact rather than having no desire for it.
I agree with you, but I was responding to this sentence in the text: ”with the side-effect of making a human partner seem yet more frightening and inconvenient.”
Welcome to Asimov’s Solaria (from “The Naked Sun”).
Chatbot sounds like the part of the anatomy that MPs speak through?
It’s just a reaction. A study predicts 45% of women will be single by 2030. Men have a healthy desire for women, but there’s little desire from women, so men have to find other outlets and new companions to relieve their loneliness. The real story is that women in the West have become herbivore women. They have little interest in taking on their role and starting families. This is leading to collapsing birth rates, and that will create a demographic winter. As well as a lot of well-fed cats.