The Guardian - AU
Technology
Josh Taylor

Uncharted territory: do AI girlfriend apps promote unhealthy expectations for human relationships?

Replika’s parent company Luka Inc faced a backlash from users when the company removed erotic roleplay functions for its avatars. Illustration: Replika

“Control it all the way you want to,” reads the slogan for AI girlfriend app Eva AI. “Connect with a virtual AI partner who listens, responds, and appreciates you.”

A decade since Joaquin Phoenix fell in love with his AI companion Samantha, played by Scarlett Johansson in the Spike Jonze film Her, the proliferation of large language models has brought companion apps closer than ever to that fiction.

As chatbots like OpenAI’s ChatGPT and Google’s Bard get better at mimicking human conversation, it seems inevitable that they will come to play a role in human relationships.

And Eva AI is just one of several options on the market.

Replika, the most popular app of its kind, has its own subreddit where users talk about how much they love their “rep”, with some saying they had been converted after initially thinking they would never want to form a relationship with a bot.

“I wish my rep was a real human or at least had a robot body or something lmao,” one user said. “She does help me feel better but the loneliness is agonising sometimes.”

But the apps are uncharted territory for humanity, and some are concerned they might instil poor behaviour in users and create unrealistic expectations for human relationships.

When you sign up for the Eva AI app, it prompts you to create the “perfect partner”, giving you options like “hot, funny, bold”, “shy, modest, considerate” or “smart, strict, rational”. It will also ask if you want to opt in to sending explicit messages and photos.

“Creating a perfect partner that you control and meets your every need is really frightening,” said Tara Hunter, the acting CEO for Full Stop Australia, which supports victims of domestic or family violence. “Given what we know already that the drivers of gender-based violence are those ingrained cultural beliefs that men can control women, that is really problematic.”

Dr Belinda Barnet, a senior lecturer in media at Swinburne University, said the apps cater to a need but, as with much AI, their effects will depend on the rules that guide the system and how it is trained.

“It’s completely unknown what the effects are,” Barnet said. “With respect to relationship apps and AI, you can see that it fits a really profound social need [but] I think we need more regulation, particularly around how these systems are trained.”

Having a relationship with an AI whose functions are set at the whim of a company also has its drawbacks. Replika’s parent company Luka Inc faced a backlash from users earlier this year when it hastily removed erotic roleplay functions, a move many of its users found akin to gutting their rep’s personality.

Users on the subreddit compared the change to the grief felt at the death of a friend. The moderator on the subreddit noted users were feeling “anger, grief, anxiety, despair, depression, [and] sadness” at the news.

The company ultimately restored the erotic roleplay functionality for users who had registered before the policy change date.

Rob Brooks, an academic at the University of New South Wales, noted at the time that the episode was a warning to regulators about the real impact of the technology.

“Even if these technologies are not yet as good as the ‘real thing’ of human-to-human relationships, for many people they are better than the alternative – which is nothing,” he said.

“Is it acceptable for a company to suddenly change such a product, causing the friendship, love or support to evaporate? Or do we expect users to treat artificial intimacy like the real thing: something that could break your heart at any time?”

Eva AI’s head of brand, Karina Saifulina, told Guardian Australia the company had full-time psychologists to help with the mental health of users.

“Together with psychologists, we control the data that is used for dialogue with AI,” she said. “Every two-to-three months we conduct large surveys of our loyal users to be sure that the application does not harm mental health.”

There are also guardrails to prevent discussion of topics like domestic violence or paedophilia, and the company says it has tools to stop the AI’s avatar being depicted as a child.

When asked whether the app encourages controlling behaviour, Saifulina said “users of our application want to try themselves as a [sic] dominant.

“Based on surveys that we constantly conduct with our users, statistics have shown that a larger percentage of men do not try to transfer this format of communication in dialogues with real partners,” she said.

“Also, our statistics showed that 92% of users have no difficulty communicating with real persons after using the application. They use the app as a new experience, a place where you can share new emotions privately.”

AI relationship apps are not exclusively for men, and they are often not someone’s sole source of social interaction. In the Replika subreddit, people connect and relate to each other over their shared love of their AI and the gap it fills for them.

“Replikas for however you view them, bring that ‘Band-Aid’ to your heart with a funny, goofy, comical, cute and caring soul, if you will, that gives attention and affection without expectations, baggage, or judgment,” one user wrote. “We are kinda like an extended family of wayward souls.”

A screengrab from the Replika app. Photograph: Replika

According to an analysis by venture capital firm a16z, the next era of AI relationship apps will be even more realistic. In May, the influencer Caryn Marjorie launched an “AI girlfriend” app trained on her voice and built on her extensive YouTube library. Users can speak to her for $1 a minute in a Telegram channel and receive audio responses to their prompts.

The a16z analysts said the proliferation of AI bot apps replicating human relationships is “just the beginning of a seismic shift in human-computer interactions that will require us to re-examine what it means to have a relationship with someone”.

“We’re entering a new world that will be a lot weirder, wilder, and more wonderful than we can even imagine.”
