The Independent UK
Maggie O'Neill

ChatGPT diagnoses cause of child’s chronic pain after 17 doctors failed

ChatGPT helped a mother determine what was causing her son’s debilitating pain that had been ongoing for three years.

Courtney, who chose not to reveal her last name, told Today her son Alex started experiencing difficult symptoms when he was four years old during the Covid-19 lockdown. The family’s nanny “started telling me, ‘I have to give him Motrin every day, or he has these gigantic meltdowns,’” Courtney said. The painkillers helped to subdue her child’s pain, but other worrisome symptoms started popping up.

Alex began chewing on objects, which caused his family to wonder if he had a cavity. A dentist examined him and didn’t find anything wrong; Alex was then referred to an orthodontist who found his palate was too small. This can cause trouble sleeping, and Alex’s family thought this might be part of why he hadn’t been feeling well.

The orthodontist treated Alex by placing an expander in his palate, which temporarily put his family at ease. “Everything was better for a little bit. We thought we were in the home stretch,” Courtney said.

But Alex continued to suffer: Courtney soon noticed her son had stopped growing, and that he wasn’t walking the way he should have been. “He would lead with his right foot and just bring his left foot along for the ride,” she said. He was also experiencing severe headaches and exhaustion.

Courtney’s family saw a shocking number of experts to try to figure out what was wrong with Alex, including a paediatrician, a neurologist, an ear, nose and throat (ENT) specialist, and more. All told, Courtney said they consulted 17 doctors, but they were left frustrated and without answers. None of the recommended treatments solved the problem.

After three years of doctors’ appointments, Courtney turned to ChatGPT for answers. The chatbot, created by OpenAI and released in 2022, was designed to interact with people in a conversational way.

“I went line by line of everything that was in his [MRI notes] and plugged it into ChatGPT,” Courtney said. “I put the note in there about how he wouldn’t sit crisscross[ed]...To me, that was a huge trigger [that] a structural thing could be wrong.”

Trying ChatGPT led Courtney to discover tethered cord syndrome, which is a complication of spina bifida. She made an appointment with a new neurosurgeon who confirmed that Alex did have a tethered spinal cord as a result of spina bifida occulta, a birth defect that causes issues with spinal cord development. This is the mildest form of spina bifida, per the Centers for Disease Control and Prevention (CDC), which states the condition is sometimes called “hidden” spina bifida and that it’s often not discovered until later in a child’s life.

The new doctors “said point blank, ‘Here’s [occulta] spina bifida, and here’s where the spine is tethered,’” Courtney recalled. She said she felt “every emotion in the book, relief, validated, excitement for his future”.

Researchers are already looking into the effects of ChatGPT on medicine. A paper published in May in Frontiers in Artificial Intelligence stated that the technology does have both benefits and drawbacks for the field: “The potential applications of ChatGPT in the medical field range from identifying potential research topics to assisting professionals in clinical and laboratory diagnosis,” the paper said. However, it continued, “despite its potential applications, the use of ChatGPT and other AI tools in medical writing also poses ethical and legal concerns”.

Experts agree that ChatGPT could potentially help certain people navigate the healthcare system, but say it is not there yet. As Jesse M. Ehrenfeld, MD, MPH, president of the American Medical Association (AMA), said in a statement to Today, “While AI products show tremendous promise in helping alleviate physician administrative burdens and may ultimately be successfully utilized in direct patient care, OpenAI’s ChatGPT and other generative AI products currently have known issues and are not error free”.

He explained that AI can produce fabrications, inaccuracies, or errors that “can harm patients”.

For her part, Courtney said the technology helped her family provide the best possible care for her son. She told Today: “There’s nobody that connects the dots for you. You have to be your kid’s advocate.”
