Fortune
Steve Mollman

Mycologists warn of ‘life or death’ consequences as foraging guides written with A.I. chatbots crop up on Amazon

(Credit: WILLIAM WEST/AFP via Getty Images)

Field guides have always varied in quality. But with some now being written with artificial intelligence chatbots while appearing to be authored by human experts, the risk of readers getting deadly advice is growing.

The New York Mycological Society recently posted a warning about Amazon and other retailers offering mushroom foraging and identification books written with A.I. “Please only buy books of known authors and foragers, it can literally mean life or death,” it wrote on X. 

It shared another post describing such guidebooks as “the deadliest AI scam I’ve ever heard of...the authors are invented, their credentials are invented, and their species ID will kill you.” 

Recently in Australia, three people died after a family lunch, with authorities suspecting death cap mushrooms as the culprit. The invasive species originated in the U.K. and parts of Ireland but has spread in Australia and North America, according to National Geographic. It’s difficult to distinguish from an edible mushroom.

“There are hundreds of poisonous fungi in North America and several that are deadly,” Sigrid Jakob, president of the New York Mycological Society, told 404 Media. “They can look similar to popular edible species. A poor description in a book can mislead someone to eat a poisonous mushroom.”

Fortune reached out to Amazon for comment but received no immediate reply. The company told The Guardian, “We take matters like this seriously and are committed to providing a safe shopping and reading experience. We’re looking into this.”

The problem of A.I.-written books will likely increase in the years ahead as more scammers turn to chatbots to generate content.

Last month, the New York Times reported on travel guidebooks written by chatbots. Of 35 passages submitted to an artificial intelligence detector from a firm called Originality.ai, all of them were given a score of 100, meaning they almost certainly were written by A.I. 

Jonathan Gillham, the founder of Originality.ai, warned it's "dangerous" if such books encourage readers to travel to unsafe places.

And it’s not just books, of course. Recently a bizarre MSN travel article created with “algorithmic techniques” listed a food bank as a top destination in Ottawa, telling readers, “Consider going into it on an empty stomach.”

Leon Frey, a field mycologist and foraging guide in the U.K., told The Guardian he spotted serious flaws in the mushroom field guides suspected of being written by A.I. Among them: referring to “smell and taste” as an identifying feature. “This seems to encourage tasting as a method of identification,” he said. “This should absolutely not be the case.” 

The Guardian also submitted suspicious samples from such books to Originality.ai, which reported that each had a rating of 100% on its A.I.-detection score.
