The Guardian - UK
Technology
Rachel Hall

Heavy ChatGPT users tend to be more lonely, suggests research

A woman using ChatGPT. Users who engaged in the most emotionally expressive personal conversations with the chatbot tended to experience higher levels of loneliness. Photograph: REX/Shutterstock

Heavy users of ChatGPT tend to be lonelier, more emotionally dependent on the AI tool and have fewer offline social relationships, new research suggests.

Only a small number of users engage emotionally with ChatGPT, but those who do are among the heaviest users, according to a pair of studies from OpenAI and the MIT Media Lab.

The researchers wrote that the users who engaged in the most emotionally expressive personal conversations with the chatbot tended to experience higher loneliness – though it is not clear whether this is caused by the chatbot or because lonely people are seeking emotional bonds.

While the researchers have stressed that the studies are preliminary, they raise pressing questions about how AI chatbot tools, which OpenAI says are used by more than 400 million people a week, are influencing people’s offline lives.

The researchers, who plan to submit both studies to peer-reviewed journals, found that participants who “bonded” with ChatGPT – typically in the top 10% for time spent with the tool – were more likely than others to be lonely, and to rely on it more.

The researchers found a complex picture of the chatbot’s impact. Voice-based chatbots initially appeared to help mitigate loneliness compared with text-based chatbots, but this advantage began to slip the more someone used them.

After using the chatbot for four weeks, female study participants were slightly less likely to socialise with people than their male counterparts. Participants who interacted with ChatGPT’s voice mode set to a gender that was not their own reported significantly higher levels of loneliness and greater emotional dependence on the chatbot at the end of the experiment.

In the first study, the researchers analysed real-world data from close to 40m interactions with ChatGPT, and then asked the 4,076 users who had those interactions how they felt.

For the second study, the Media Lab recruited almost 1,000 people to take part in a four-week trial examining how participants interacted with ChatGPT for a minimum of five minutes each day. Participants then completed a questionnaire to measure their feelings of loneliness, levels of social engagement, and emotional dependence on the bot.

The findings echo earlier research: in 2023, MIT Media Lab researchers found that chatbots tended to mirror the emotional sentiment of a user’s messages, with happier messages prompting happier responses.

Dr Andrew Rogoyski, a director at the Surrey Institute for People-Centred Artificial Intelligence, said that because people were hard-wired to think of a machine behaving in human-like ways as a human, AI chatbots could be “dangerous”, and far more research was needed to understand their social and emotional impacts.

“In my opinion, we are doing open-brain surgery on humans, poking around with our basic emotional wiring with no idea of the long-term consequences. We’ve seen some of the downsides of social media – this is potentially much more far-reaching,” he said.

Dr Theodore Cosco, a researcher at the University of Oxford, said the research raised “valid concerns about heavy chatbot usage”, though he noted it “opens the door to exciting and encouraging possibilities”.

“The idea that AI systems can offer meaningful support — particularly for those who may otherwise feel isolated — is worth exploring. However, we must be thoughtful and intentional in how we integrate these tools into everyday life.”

Dr Doris Dippold, who researches intercultural communication at the University of Surrey, said it would be important to establish what caused emotional dependence on chatbots. “Are they caused by the fact that chatting to a bot ties users to a laptop or a phone and therefore removes them from authentic social interaction? Or is it the social interaction, courtesy of ChatGPT or another digital companion, which makes people crave more?”
