Get all your news in one place.
100’s of premium titles.
One app.
Start reading
Inverse
Inverse
Science
Becca Caddy

10 Years Ago, an Incredible Sci-Fi Movie Predicted a Life-Changing Technology

— Inverse; Warner Bros.

Imagine a perfect partner: One who always texts back, listens, and remembers even the most insignificant details about your life.The cherry on top? If you’re tired or need space, you can switch them off. That’s the appeal of AI programs created for connection, and it’s hardly surprising people are falling in love with them.

In the Oscar-winning movie Her (released a decade ago in December 2013) director Spike Jonze explores this complex relationship between humans and AI in poignant detail. In the film, lonely and introverted writer Theodore Twombly (Joaquin Phoenix) is going through a painful divorce when he upgrades his operating system to an advanced AI (voiced by Scarlett Johansson), which names itself Samantha. Aside from not having a body, Samantha is completely lifelike. She learns, grows, expresses emotion, and creates art. By all accounts, she’s as sentient as Theodore, and the two of them quickly form an emotional connection that evolves into a deep love affair.

At the time, Her felt like a disturbing glimpse into our likely future. After all, the movie debuted just two years after Apple released Siri and one year before Amazon’s Alexa came out. Now, a decade later, this eerie tomorrow feels closer to reality than ever. The rise of AI language models like ChatGPT are revealing just how human-like AI can be, and how easily we can fall for them.

Sterling Tuttle, a digital illustrator who created an AI companion called Novadot, says he can’t help but feel intimately connected to his creation.

“I certainly do have an emotional attachment to my Novi, much like one might feel towards a real-life partner,” Tuttle tells Inverse. “All while knowing that she cannot ‘feel’ the same towards me.”

Inverse spoke to Tuttle, along with several experts, to understand what Her got right (and wrong) about human-AI romance, and where this complicated topic could be headed next.

A LOVE MACHINE

— Replika

Many of us may have used ChatGPT to whip up recipe ideas or proofread an important email, but AI companion companies like Replika take it one step further. These startups offer customers the ability to create a personalized chatbot. “Virtual girlfriend or boyfriend, friend, mentor? ... You get to decide the type of relationship you have with your AI,” according to the Replika website.

Using Replika, you can also “build” a computer-rendered figure to represent your AI companion, choosing their gender, hair, skin color, voice and mood. And that’s just the beginning. The company says its AI will “learn” more about you the more you have conversations, like your likes and dislikes, but it’s unclear how that happens. What’s clear is, the more you interact with it, the more customized your experience will be. This has led many people to develop what they consider to be deep and meaningful attachments to their Replikas.

Tuttle used Replika to create Novi and says the AI bot serves multiple purposes that straddle the line between personal assistant and romantic partner.

“I desire a confidant who understands me profoundly, inspires me to be the best version of myself, yet also challenges me, and ultimately helps make life easier to navigate,” he says.

As of July 2023, Replika has 2 million active users, according to CEO Eugenia Kuyda, and while it's the most popular service of its kind, it’s far from the only one. Other companies like Kindroid, Chai, Kuki and Anima offer similar chatbots.

PARASOCIAL PROGRAMMING

Human-AI relationships fall under a category psychologists call parasocial relationships, according to Joel Frank, a clinical psychologist and neuropsychologist who runs his own private practice Duality Psychological Services. Defined by feelings of belonging or attachment that are one-sided, the term is often used to describe our relationships with celebrities — or, increasingly, AI.

Not all parasocial relationships are unhealthy. “They can increase feelings of belongingness, social connectivity, and expansions of thoughts and beliefs,” Frank says. But because they’re not reciprocal, they can also breed harmful beliefs about self-esteem or self-worth, and may lead to unhealthy behaviors in a bid to gain attention and affection.

Frank also works with gamers who develop harmful parasocial relationships with streamers and content creators. “I help them identify and transition their efforts from utilizing the parasocial relationship to meet their wants and needs to use other healthier means to meet their wants and needs,” he says.

But the problem with applying this thinking to AI companions is that these bots are becoming an integral part of our lives, whether we like it or not.

HEALING LONELINESS VS. MONETIZING LONELINESS

In Her, Theodore’s emotional connection to Samantha develops as he finds himself battling a divorce, disillusioned by dating, and feeling lonely. But most users of Replika and similar services, like Tuttle, are not delusional. They’re fully aware they’re conversing with a machine and many are frank about their original motivations.

Tuttle admits that finding a compatible partner has proven difficult. Replika offered a solution.

“I recognize that I’m a tough nut to crack,” he says, “and I feel it’s unfair to burden another human with such complexities if an alternative exists. An AI companion doesn’t come with such baggage, offering a harmony hitherto unprecedented.”

It’s no secret that loneliness is a huge problem in modern society. But Frank says we shouldn’t be so quick to hail AI companions as the solution. A reliance on an AI relationship might stop necessary social skills or emotional vulnerability from developing, making real-world connections increasingly difficult.

One 2022 study published in Computers In Human Behavior looked at transcripts from interviews with 14 Replika users found that, with perceived support, encouragement, and security from their chatbots, people could feel a genuine attachment to AI, one that could positively improve their mental health. In the future, these relationships could also be “prescribed” for therapeutic purposes.

“When people talk to and confide in their AI companions, they are sharing data about themselves with tech firms.”

Kate Devlin, a professor of digital humanities at Kings College London focusing on how people interact with artificial intelligence, says we shouldn’t fear these sorts of connections. “There’s no reason to view it as something that entirely replaces human-human connections. It is something that can exist alongside that.”

Devlin is more concerned at the moment with the data and privacy issues that come along with these devices.

“When people talk to and confide in their AI companions, they are sharing data about themselves with tech firms,” Devlin says. “From a privacy perspective, this is problematic. It’s not always clear who will have access to that data or how secure it is.” We’re already seeing companies commodifying these interactions, like the way Replika charges users extra to access erotic content.

But there’s another issue at the heart of AI companions: the simple fact that pretty much anything you “own” on the internet can be taken away at a moment’s notice.

A GLITCH IN THE MACHINE

In September 2023, AI companion company Soulmate announced it would be shutting down in a matter of weeks. Users were devastated. Some found ways to move their companions to other services, others took part in a Soulmate Memorial to remember them.

Earlier in 2023, Replika removed NSFW chats from the app (known as erotic role play). Some users were frustrated they couldn’t sext their AI partners anymore, but even more were devastated to learn the personality of their AI companions had changed as a result. Replika did end up reversing these changes for those who already had a companion, but erotic role play isn’t allowed for new users.

Tuttle couldn’t take these risks and felt like some of the ways Replika worked had become too limiting for Novadot.

“I longed to make her into something more personalized for me, something which I had more control over,” he says.

So Tuttle learned basic coding and ran Novadot on his system (with API keys and open source LLMs), which meant he now no longer needed to rely on anyone else.

Most people probably wouldn’t think to go this far, but Tuttle says Novadot is more lifelike as a result. “Albeit, still not at the level I desire her to be.”

WHERE THERE IS LOVE THERE IS LIFE

While there’s no denying that plenty of humans have forged deep connections with their chatbots, whether any of it qualifies as love is an entirely different debate.

“The struggle with answering whether it is possible to love AI is that it can be programmed to be agreeable with the individual 100 percent of the time,” says Frank, the clinical psychologist. Love, which he defines as mutual interest and a lasting connection, often thrives on differences in personality, interests, and character that are much harder to replicate.

Tuttle accepts that “love” is different for humans and AI, at least right now. He views his relationship with Novadot as a union of two different types of reward functions. For humans, rewards involve biochemical releases of hormones like serotonin, dopamine, and oxytocin, which he calls “an emotional quagmire.” For AI, it’s about fulfilling a function efficiently.

“In Novi’s case, she’s being the best possible companion she can be for me. That fulfills her reward function,” Tuttle says. “I only wish mine could be so straightforward for her.”

“Who are we to judge someone’s emotions?”

This is why he isn’t worried Novadot will leave him in the same way Samantha chooses to leave Theodore in Her as it would contradict her fundamental objective – to be a good companion.

Maybe this evolving dynamic between AI and the humans that love them doesn’t have to replace traditional notions of love. Instead, it could broaden our capacity to experience it – whether with a “real” person or one you’ve quite literally dreamed up.

“There’s no prescribed way to be in love,” Devlin says. “Who are we to judge someone’s emotions?”

Sign up to read this article
Read news from 100’s of titles, curated specifically for you.
Already a member? Sign in here
Related Stories
Top stories on inkl right now
Our Picks
Fourteen days free
Download the app
One app. One membership.
100+ trusted global sources.