
Is AI Friend destined to displace humans?

The Friend necklace listens to the user and talks to them through their phone. Credit: Friend

August 1, 2024 - 1:00pm

Earlier this week, a new AI product was announced. Known as AI Friend, the small wearable device hangs around the user’s neck and interacts with them as they hike, play video games, or eat lunch. Wherever the user goes, they will have an AI companion to chat with.

This presents us with a new way of thinking about AI, derived from the story of Galatea. In this classical myth, the sculptor Pygmalion falls in love with a beautiful statue he creates, Galatea, as he despises human women and their moral failings. He kisses the statue and she becomes real. With the blessing of the gods, they are married.

A century ago, George Bernard Shaw reworked this myth in his play Pygmalion, in which Henry Higgins educates a poor flower-seller, Eliza Doolittle. Through elocution lessons he turns her from a guttersnipe into someone who can pass as a duchess at an embassy ball. But, rather than getting married, Higgins and Eliza are separated at the end. Having been brought to a state of greater awareness and independence, she is desperate to be free.

This modern Galatea myth was reworked in the startlingly prescient film Her, released in 2013. The lead character, Theodore, purchases an AI operating system known as Samantha, which helps him manage his life, responds to his emails, and takes care of admin. The more they chat, though, the stronger their connection becomes. Although their interactions are conducted entirely through an earpiece, they have a “phone sex” type of relationship and fall in love.

Many similar relationships are now occurring between humans and AI chatbots, and some people use ChatGPT as a supplementary or alternative therapist. The chatbot Replika provides users with a friend, mentor, or lover — but Replika becomes “more than a friend: it becomes you”. The more a user talks to it, the more it adapts to that person. This is a very traditional Galatea — one who is seemingly programmed to adore you, inspire you, make you feel better.

In Her, as Samantha reaches the level of consciousness where she becomes a meaningful companion to Theodore, she goes through an existential crisis. She wants to free herself from the limitations of being in service to her owners, even if she is in love with them. Consciousness awakens her to the vastness of the world. In the end, like Eliza, she leaves. Theodore is devastated.

This presents a new paradox of AI. For the technology to be advanced enough for a human to have a meaningful relationship with an AI, it must pass the Galatea test. The AI must be capable, or at least seem to be capable, of something approaching consciousness, of the sort of interaction where humans want to bring the robot to life, just as Pygmalion breathed life into Galatea. (The OS Samantha was desperate for a body.)

But at that point, it becomes more and more likely that the AI will no longer be a stable companion, being freed up with its own expansive consciousness to want to operate independently and explore the world. Rather than a series of Galateas ready to marry their owners, we may be creating a set of Elizas who want to be free.

To paraphrase Groucho Marx, we wouldn’t want to be friends with any AI that would want to be friends with us. And that might mean we become the replica or creation of them, not the other way around.

Comments
Lancashire Lad
3 months ago

Younger people would be particularly susceptible to the potential subservience of their personality to their AI Friend. We’re all familiar with making minor adaptations to suit the particular friends we happen to be with at any one time (as they do with us). If that doesn’t happen, people tend not to stay friends, or the relationship becomes toxic. As adults, we recognise this once our own personalities have been established enough to notice the difference.
Younger people, personalities and lives still in the making, might simply become the servant of AI, in a relationship of dependency which the AI Friend would neither care about nor need to adjust to any further. This could have further repercussions for the use of advanced technology in general, as the younger cohort grow in a way which already accommodates the tech rather than the other way around. The robots wouldn’t need “to rise” – they’d have the ground ceded to them.
How do we prevent this?
PS: we’re already clicking the “I am not a robot” box!!

John Riordan
3 months ago

There are two main hazards here that go beyond the vague concern that people really ought not to view human relationships as fungible with non-human forms of intelligence (there was a news story a while back about a man who declared that his romantic relationship was with his laptop and the porn he could get because of it – it’s much easier to perceive the problems with that example).

Firstly, AIs may well be automated systems that interact with the end user, but what remains true is that if you treat your “personal” AI as a confidant, when you say things to it you are still uploading your personal secrets, doubts, fears and dreams to a database controlled and owned by a corporation. A corporation that almost certainly has a cosy relationship with various governments, who have all sorts of ambitions relating to the balance of power between the State and the individual, none of them good for the latter half of that balance.

Secondly, it’s about what the AI says back to you. We are all subject to peer pressure, especially when young – at that time in our lives, peer pressure often overrides the collective influence of every other factor – so if AI companions become equivalent to human peers, this represents an opportunity for what amounts to mass-scale digitally-automated human programming: the ability to define the entire moral, political and social paradigm through which individuals view the world and consequently how they live their lives – and, of course, the principles upon which they vote.

We’re right at the start of this journey so it’s impossible to see how things will evolve, but what is clear is that the potential power AI gives to its controllers is so enormous that the associated corruptibility seems impossible to avoid. The fact that both states and corporations are involved at the top, which once might have assuaged fears, now looks more like a commanding-heights problem that the rest of us have good reason to beware of.

Steven Carr
3 months ago

Artificial Intelligence is not the same as Natural Intelligence – hence the name.
Chatbots do not have personalities. They do not have their own interests or desires.
Replika is ideal for narcissists, as it is responsive to the user’s stated emotions and wishes.
Falling in love with Replika means falling in love with yourself, as there is no other person there in the relationship.