Two decades ago, long before artificial intelligence and algorithms were a part of everyday life, computer engineer Marian-Andrei Rizoiu was fascinated by machine learning.
The 39-year-old Romanian migrant completed a Master's in the specialist field in 2004 - "before it was cool", he laughs - immersing himself in the development and study of statistical algorithms that can learn from data and perform tasks without explicit instructions.
"I found it cool, and that's what I pursued (with) my research, my training, and did my PhD in machine learning and artificial intelligence," he said.
Now that "cool" obsession is his career: working on the frontline of the fight against digital disinformation, including a successful Facebook experiment that uses fictitious personas.
Dr Rizoiu is an associate professor who leads the Behavioural Data Science lab at the University of Technology Sydney.
His work has earned a nomination in the 2024 Australian Museum Eureka Prizes, which recognise scientific endeavour and achievement.
He is one of three finalists in the Outstanding Science in Safeguarding Australia category, recognised for creating a sophisticated algorithm that analyses responses and reactions to misinformation and disinformation.
The Eureka judges said it was "a paradigm shift" for covert agent detection that doesn't endanger free speech.
At the Australian National University, Dr Rizoiu started building theoretical models for how information spreads in the online environment.
He used stochastic models, which estimate the probability of various outcomes while allowing for randomness in one or more inputs over time.
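A widely used example of this kind of stochastic diffusion model is the self-exciting Hawkes process, in which each event (say, a share or retweet) temporarily raises the probability of further events. The sketch below simulates one via Ogata's thinning algorithm with an exponential decay kernel; it is an illustrative example of the model class, not necessarily the exact model used in this research, and all parameter values are made up.

```python
import math
import random

def simulate_hawkes(mu, alpha, beta, horizon, seed=0):
    """Simulate event times of a Hawkes process via Ogata's thinning.

    mu    -- background rate (spontaneous events)
    alpha -- jump in intensity each past event contributes
    beta  -- exponential decay rate of that excitation
    """
    rng = random.Random(seed)
    t, events = 0.0, []
    while t < horizon:
        # Upper bound on intensity: background plus excitation from past
        # events evaluated now (the exponential kernel only decays later).
        lam_bar = mu + sum(alpha * math.exp(-beta * (t - ti)) for ti in events)
        t += rng.expovariate(lam_bar)  # candidate next event time
        if t >= horizon:
            break
        # Accept the candidate with probability intensity(t) / lam_bar
        lam_t = mu + sum(alpha * math.exp(-beta * (t - ti)) for ti in events)
        if rng.random() <= lam_t / lam_bar:
            events.append(t)
    return events

# Illustrative parameters: alpha/beta < 1 keeps the cascade subcritical
cascade = simulate_hawkes(mu=0.5, alpha=0.8, beta=1.2, horizon=50.0)
print(len(cascade), "simulated events")
```

Randomness enters through both the background arrivals and the self-excitation, which is what lets such models assign probabilities to different cascade sizes over time.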
"And when I joined UTS as a lecturer, I was wondering if I could apply the same type of modelling, but with societal impact," he said.
"I really just wanted to be able to make a difference. So I reached out to colleagues within social sciences, the true experts, people who spend their entire lives immersed in misinformation communities and understanding the phenomena there."
He said he posed a simple question: "Can we merge the quantitative tools that I was building with their expertise, to scale up and actually make a difference in detecting misinformation, in understanding how impactful it is, and even designing countermeasures?"
"Essentially, we built a stochastic model-based approach, which allows us to look at how individuals react to the posts of an online user, to detect hidden agents in the spread of misinformation," he said.
Detecting these agents is notoriously difficult, because they're covert.
"And if they get detected, they will pick up their toys and go elsewhere, but one thing they can't hide is how people react to their messaging," he said.
"We take all the activity on a social media platform and we go to scale. We can start learning patterns. We can start learning what type of content gets more traction.
"We can even understand and analyse users, which is one of the contributions within my Eureka nomination."
Debunking misinformation or going after it directly can backfire, he said, "putting fuel on the fire, drawing attention towards a particular topic".
"If we deem that it's beyond a certain threshold of effectiveness, or that it's likely going to get popular, that's when we build countermeasures," he said.
Dr Rizoiu has been running a federally funded project focused on understanding the Australian misinformation environment.
"We've done a Facebook experiment where we built a couple of fictitious individuals that were like the stereotypical consumers of the misinformation that we were tracking, and we were looking at specific topics," he said.
"So we performed experiments where we presented users that match these personas with two versions of Facebook ads, where everything was similar, including the content of the writing, but not the style of the writing.
"We looked at the click-through rate … and it turned out we can increase the reach into these communities. We can penetrate these filter bubbles ... and twice as many people clicked through when we emulated the 'correct' style."
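The comparison described, two ad variants differing only in writing style, is essentially an A/B test on click-through rate, and the gap between the two variants can be checked with a standard two-proportion z-test. A minimal sketch with made-up counts (not the study's actual data):

```python
import math

def two_proportion_ztest(clicks_a, shown_a, clicks_b, shown_b):
    """Test whether the click-through rates of two ad variants differ."""
    p_a = clicks_a / shown_a
    p_b = clicks_b / shown_b
    # Pooled proportion under the null hypothesis of equal CTRs
    p = (clicks_a + clicks_b) / (shown_a + shown_b)
    se = math.sqrt(p * (1 - p) * (1 / shown_a + 1 / shown_b))
    z = (p_b - p_a) / se
    return p_a, p_b, z

# Illustrative numbers only: the style-matched ad doubles the CTR,
# mirroring the "twice as many people clicked through" result.
ctr_plain, ctr_styled, z = two_proportion_ztest(50, 10_000, 100, 10_000)
print(f"plain CTR={ctr_plain:.3%}, styled CTR={ctr_styled:.3%}, z={z:.2f}")
```

With counts of this size a doubling of the click-through rate is comfortably significant (the z statistic is well above the usual 1.96 cut-off).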
He said that consumers of misinformation mistrust government and government-aligned sources, and so also mistrust scientists.
"We need to talk about the topics they care about and we need to package the information in such a way that it is digestible and accessible to them," he said.
The next stage of his work needed to be done in conjunction with policymakers and the producers of information, like journalists, he said.
"When you're starting to do real experiments with real people, we need to ask real questions around ethics," he said.
He said that detection, prediction and forecasting "are probably the easier problems" compared to designing countermeasures.
"Whatever solution we build, there will be countermeasures being deployed. And it's a constant arms race. And essentially the ones that win are the ones who stay on top."
Dr Rizoiu said he works "with two hats on":
"One is the hat of organic misinformation, essentially conspiracy theories, the tinfoil hat people, versus state-orchestrated and injected foreign interference".
As well as government, he is working with the UTS Centre for Media Transition.
"The real trouble I'm having with interacting with the media organisations is that the last decade has not been gentle to them, given the bleeding of funding," he said.
"Their time horizon tends to be a lot shorter than mine. I think in years when I plan this, whereas media organisations needing to fight misinformation probably think in days.
"I believe that media organisations should be the first clients of these solutions."
Winners of the Eureka Prizes are announced on Wednesday.