Whether you like it or not, artificial intelligence is already being used to make the news. Australian media companies such as News Corp are using it to produce articles and images right now. Other, less established operations are popping up with rewrites of actual journalism that sound suspiciously like AI.
As we’ve written before, here at Crikey we aren’t particularly worried about AI producing journalism more effectively than journalists. It can’t pick up the phone or go somewhere. It swallows spin rather than seeing through it. Instead of sharp and specific, it’s flat and vague. The only original angles or thoughts it possesses are those fed into it by others.
Fundamentally, we believe that generative AI can only ever produce a statistical approximation of journalism, based on all the other pieces of journalism that have been fed into it (with or without permission). But that doesn’t mean it can’t be used as an imitation of journalists by people who don’t know, or don’t care, that it’s a poor substitute.
Crikey has been closely covering AI because we think it’s one of the most important stories going on at the moment. Our readers are interested in the technology and are always looking to understand it better. This is why we want you to know and feel what it’s like firsthand, so we’ve found a way to let you try it for yourself.
Collaborating with a creative team from DDB Group Melbourne and experimental AI consultancy Pow Wow Solutions, Crikey has built a tool that hands you the role of fake news baron as you spin the story either way and ramp up the sensationalism. With the Bullshit O’Meter, the power of conjuring AI slop is in your hands.
Why build such a tool? Well, one thing AI is rather good at is regurgitating. The technology enables content production at a scale, speed and cost never before possible. When ChatGPT first came out, people amused themselves by making it rewrite the ’90s hit song “Baby Got Back” in the style of “The Canterbury Tales”, or paint Vincent van Gogh singing Christmas carols in his signature style. It can mash together the information it’s been trained on to produce something that seems new. It’s not good enough to fool anyone who spends any time with it, but it doesn’t need to be.
When applied to journalism, regurgitation isn’t much use if you want to build an audience that expects a unique and informed perspective on what matters. But if you’re more interested in spinning and twisting the world to your liking, AI shines.
Earlier this year, an editor at the misinformation-tracking website NewsGuard paid US$105 for a developer to create a legitimate-looking US news website that automatically rewrote other outlets’ stories with a specific political bias. It rewrote articles about local marijuana legislation and high school basketball awards, and even an obituary for a young woman, to include praise of a local candidate. It’s that easy to create an AI-powered propaganda factory.
If your goal is impactful and insightful journalism that rises above the noise, then hire a journalist. But if your intention is to flood the zone with bullshit that’s been calibrated to spray in a general direction, then it’s hard to go past AI. It’s a breakthrough on the level of the printing press for people whose goal is to drown out insight or expertise with soulless chum.
The Bullshit O’Meter is your sandbox (with a few guardrails) to see just how easy it is to turn news into bullshit.
Hang on, you might be thinking, didn’t you guys swear off using AI?
Crikey has absolutely and categorically banned using generative AI to create the stuff that you expect us to write or make. We stand by what we’ve said: “Using AI to generate Crikey would be an insult to you, the reader, as well as everyone who has made Crikey what it is.”
This is something different. We’re not trying to pass off the slop created by generative AI as something worth a Crikey reader’s time. This is about the exact opposite. We want to help you be alert — and allergic, hopefully — to the kind of unthinking, unknowing bullshit that machines can put out.
We’ve used the current US election campaign as a test case. We started by selecting six “base” stories, pivotal moments in the campaign so far that have been covered across the globe, with the key details generally agreed upon across the spectrum.
The team at DDB and Pow Wow then took a “neutral” article for each of these stories (acknowledging that no article, even wire copy, can ever be truly neutral) and passed each one through a fine-tuned large language model (LLM). The LLM is designed to find opportunities to insert political bias. As the dial is turned up, it adds more and more bias, until the “neutral” base article is converted into a piece of extreme propaganda.
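For readers curious about the mechanics, here is a rough, purely illustrative sketch of how a “bias dial” can be wired to a language model. It is not the DDB and Pow Wow implementation, which hasn’t been published; the function names, prompt wording and dial levels below are all hypothetical, and the actual model call is stubbed out.

```python
# Illustrative sketch only: maps a "bias dial" setting to increasingly
# loaded rewriting instructions for a language model. The real Bullshit
# O'Meter pipeline is not public; everything here is hypothetical.

BIAS_LEVELS = {
    0: "Rewrite the article faithfully, keeping a neutral, wire-copy tone.",
    1: "Rewrite the article with a mild slant favouring the chosen side.",
    2: "Rewrite the article with loaded language and selective emphasis favouring the chosen side.",
    3: "Rewrite the article as overt propaganda for the chosen side, exaggerating freely.",
}


def build_prompt(article_text: str, side: str, dial: int) -> str:
    """Compose the rewriting instruction that would be sent to the model."""
    level = max(0, min(dial, max(BIAS_LEVELS)))  # clamp the dial to a known level
    return (
        f"{BIAS_LEVELS[level]}\n"
        f"Side to favour: {side}\n\n"
        f"Article:\n{article_text}"
    )


def rewrite(article_text: str, side: str, dial: int) -> str:
    """Placeholder for the model call: a real system would send the prompt
    to a fine-tuned LLM and return its rewritten article."""
    prompt = build_prompt(article_text, side, dial)
    return f"[LLM output for a prompt of {len(prompt)} characters would go here]"


if __name__ == "__main__":
    sample = "The two candidates met for their first televised debate on Tuesday."
    print(rewrite(sample, side="Candidate A", dial=3))
```

The real system is presumably more sophisticated, but the basic idea is the same: the dial simply maps to ever more loaded instructions for the model.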
We’ve helped design the Bullshit O’Meter to teach you to recognise the slop that’s being pumped out. We’re using the power of AI, in a very small dose, to show you just how corrosive it can be.
So, strap on your wellies, peg your nose, and join us in the sty with the Bullshit O’Meter. There’s plenty of slop to go around.