The Independent UK
Amy Beth Hanson

Reporter caught using artificial intelligence to create fake quotes and stories

ASSOCIATED PRESS

A reporter at a small local newspaper has resigned after a competitor discovered he was using artificial intelligence to write stories and fabricate quotes.

A quote from Wyoming's governor and one from a local prosecutor were the first things that seemed off to Powell Tribune reporter CJ Baker. Then it was some of the phrases in the stories that struck him as nearly robotic.

The dead giveaway, though, that a reporter from a competing news outlet was using generative artificial intelligence to help write his stories came in a June 26 article about the comedian Larry the Cable Guy being chosen as the grand marshal of the Cody Stampede Parade.

“The 2024 Cody Stampede Parade promises to be an unforgettable celebration of American independence, led by one of comedy’s most beloved figures,” the Cody Enterprise reported. “This structure ensures that the most critical information is presented first, making it easier for readers to grasp the main points quickly.”

After doing some digging, Baker, a reporter for more than 15 years, met with Aaron Pelczar, a 40-year-old newcomer to journalism who, Baker says, admitted to having used AI in his stories before resigning from the Enterprise.

The publisher and editor at the Enterprise, which was co-founded in 1899 by Buffalo Bill Cody, have since apologized and vowed to take steps to ensure it never happens again. In an editorial published Monday, Enterprise Editor Chris Bacon said he “failed to catch” the AI copy and false quotes.

“It matters not that the false quotes were the apparent error of a hurried rookie reporter that trusted AI. It was my job,” Bacon wrote. He apologized that “AI was allowed to put words that were never spoken into stories.”

Journalists have derailed their careers by making up quotes or facts in stories long before AI came about. But this latest scandal illustrates the potential pitfalls and dangers that AI poses to many industries, including journalism, as chatbots can spit out spurious if somewhat plausible articles with only a few prompts.

Sports Illustrated was criticised last year for publishing AI-generated online product reviews that were presented as having been written by reporters who didn't actually exist. After the story broke, SI said it was firing the company that produced the articles for its website, but the incident damaged the once-powerful publication's reputation.

In his Powell Tribune story breaking the news about Pelczar's use of AI in articles, Baker wrote that he had an uncomfortable but cordial meeting with Pelczar and Bacon. During the meeting, Pelczar said, “Obviously I’ve never intentionally tried to misquote anybody” and promised to “correct them and issue apologies and say they are misstatements,” Baker wrote, noting that Pelczar insisted his mistakes shouldn’t reflect on his Cody Enterprise editors.

After the meeting, the Enterprise launched a full review of all the stories Pelczar had written in the two months he worked there. It has found seven stories that included AI-generated quotes from six people, Bacon said Tuesday. He is still reviewing other stories.

“They're very believable quotes,” Bacon said, noting that the people he spoke to during his review of Pelczar's articles said the quotes sounded like something they'd say, but that they never actually talked to Pelczar.

Baker reported that seven people told him that they had been quoted in stories written by Pelczar, but had not spoken to him.

Pelczar did not respond to a phone message the AP left at a number listed as his, seeking to discuss what happened. Bacon said Pelczar declined to discuss the matter with another Wyoming newspaper that had reached out.

Baker, who regularly reads the Enterprise because it's a competitor, told the AP that a combination of phrases and quotes in Pelczar's stories aroused his suspicions.

Pelczar's story about a shooting in Yellowstone National Park included the sentence: “This incident serves as a stark reminder of the unpredictable nature of human behavior, even in the most serene settings.”

Baker said the line sounded like the summaries of his stories that a certain chatbot seems to generate, in that it tacks on some kind of a “life lesson” at the end.

Another story — about a poaching sentencing — included quotes from a wildlife official and a prosecutor that sounded like they came from a news release, Baker said. However, there wasn't a news release and the agencies involved didn't know where the quotes had come from, he said.

Two of the questioned stories included fake quotes from Wyoming Gov. Mark Gordon that his staff only learned about when Baker called them.

"In one case, (Pelczar) wrote a story about a new OSHA rule that included a quote from the Governor that was entirely fabricated,” Michael Pearlman, a spokesperson for the governor, said in an email. “In a second case, he appeared to fabricate a portion of a quote, and then combined it with a portion of a quote that was included in a news release announcing the new director of our Wyoming Game and Fish Department.”

The most obvious AI-generated copy appeared in the story about Larry the Cable Guy that ended with the explanation of the inverted pyramid, the basic approach to writing a breaking news story.

It's not difficult to create AI stories. Users could put a criminal affidavit into an AI program and ask it to write an article about the case including quotes from local officials, said Alex Mahadevan, director of a digital media literacy project at the Poynter Institute, the preeminent journalism think tank.

“These generative AI chatbots are programmed to give you an answer, no matter whether that answer is complete garbage or not," Mahadevan said.

In her own editorial, Cody Enterprise publisher Megan Barton wrote that AI is “the new, advanced form of plagiarism and in the field of media and writing, plagiarism is something every media outlet has had to correct at some point or another. It's the ugly part of the job. But, a company willing to right (or quite literally write) these wrongs is a reputable one.”

Barton wrote that the newspaper has learned its lesson, has a system in place to recognize AI-generated stories and will “have longer conversations about how AI-generated stories are not acceptable.”

The Enterprise didn't have an AI policy, in part because it seemed obvious that journalists shouldn't use it to write stories, Bacon said. Poynter has a template from which news outlets can build their own AI policy.

Bacon plans to have one in place by the end of the week.

“This will be a pre-employment topic of discussion,” he said.
