Fortune
Beth Greenfield

In psychotherapists vs. ChatGPT showdown, the latter wins, new study finds

A couple sits on a couch in the background while a woman, presumably the therapist, faces them in the foreground. (Credit: Getty Images)

Can couples in distress find as much support from artificial intelligence as they would from a therapist? Yes, a new study suggests. 

For the experiment, researchers prepared couples-therapy scenarios—one in which a partner was dismissing the other’s depression, for example—and then turned to two support systems for responses: a group of experienced therapists and the ChatGPT AI chatbot. 

They then had 830 study participants—about half men and half women, averaging 45 years old, the majority of whom were in relationships—randomly receive either a therapist’s response or ChatGPT’s to see how easily they could discern the source of each.

As the researchers—including lead author Gabe Hatch and colleagues from psychology and psychiatry programs at universities including the Ohio State University—suspected from the outset, participants had difficulty identifying whether responses were written by ChatGPT or by therapists: they correctly guessed that the therapist was the author 56.1% of the time, and ChatGPT 51.2% of the time.

Further, participants in most cases preferred ChatGPT’s take on the matter at hand. Ratings were based on five factors: whether the response conveyed understanding of the speaker, showed empathy, was appropriate for the therapy setting, was relevant across cultural backgrounds, and was something a good therapist would say. ChatGPT outscored the human therapists particularly on understanding the speaker, showing empathy, and demonstrating cultural competence.

“This may be an early indication that ChatGPT has the potential to improve psychotherapeutic processes,” the authors of the study, published in PLOS Mental Health, wrote. 

Specifically, it could lead to new ways of testing and creating psychotherapeutic interventions—something the authors urge mental health experts to pay attention to, given mounting evidence that generative AI could be integrated into therapeutic settings sooner rather than later.

For years, experts have noted that psychology practice could benefit from AI innovations—including therapeutic chatbots, tools that automate note-taking and other administrative tasks, and more intelligent training—provided clinicians have tools they can understand and trust, the American Psychological Association wrote back in 2023.

“The bottom line is we don’t have enough providers,” Jessica Jackson, a licensed psychologist and equitable technology advocate based in Texas, told the APA for that story. “While therapy should be for everyone, not everyone needs it. The chatbots can fill a need.” Support from a chatbot, the piece noted, could fill gaps for some mental health concerns (such as sleep problems); make mental health support more affordable and accessible; and be a useful tool for those with social anxiety who may find human therapists off-putting.

Ever since the 1966 invention of ELIZA—a chatbot programmed to respond as a Rogerian psychotherapist—researchers have debated whether AI could play the role of a therapist. 

Regarding that possibility, the new study authors write, “Although there are still many important lingering questions, our findings indicate the answer may be yes. We hope our work galvanizes both the public and mental [health] practitioners to ask important questions about the ethics, feasibility, and utility of integrating AI and mental health treatment, before the AI train leaves the station.”
