
With election 2022 nearly upon us, can we actually trust the opinion polls this time?

The polls said Bill Shorten would win in 2019, so can we trust them now?

"I have always believed in miracles," Scott Morrison proclaimed nearly three years ago to a room of thrilled — yet shocked — Liberal party faithful.

His 2019 election win wasn't widely foreseen, in large part because opinion polls had created the expectation of a Labor win.

And we were accustomed to treating the polls as highly accurate. After all, they had a long history of picking the election winner, time and again.

An academic inquiry would later conclude that we had seen a "polling failure", rather than merely a "polling miss".

Several pollsters also suggested that Mr Morrison's "miracle" came after a failure of journalists and commentators to report the uncertainty in opinion polls, leading people to think they were more precise than they really were.

"The so-called failure of polling was actually a failure of analysis and insight from everyone that purports to understand politics," Essential's Director Peter Lewis says.

So were the polls wrong in 2019, or were they just read wrong?

The answer is probably a bit of both.

The result did not go Bill Shorten's way in 2019. (ABC News: Matt Roberts)

A ‘polling failure’

Darren Pennay chaired an academic inquiry into the national polls, commissioned by the Association of Market and Social Research Organisations (AMSRO).

Some, but not all, pollsters participated in that review, although Mr Pennay summarised it as "limited participation, grudgingly given".

"I think partly what happened is [the pollsters] very strong track record leading up to the 2019 poll meant that they probably [saw] no compelling reason to change," he says.

"Whereas the headwinds and clues from overseas polling failures in the UK and US in particular, suggested that traditional methodologies that pollsters were using for their surveys needed updating."

National opinion polls get published year-round, but there's really only one time every three years that their accuracy can be gauged: on the day the only poll that really matters is held.

Comparing the final published figures from each pollster to the actual election results suggests an issue that applied nearly equally to all pollsters.

Of the five national pollsters at the last election, all under-estimated the Coalition's actual primary vote by at least 2.4 percentage points, with an average "bias" of 2.8 percentage points.

On two-party preferred terms, all five of the pollsters were between 2.5 and 3.5 percentage points off the actual election result in their final polls published in the last days of the campaign.
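For readers who want to check the arithmetic, the per-pollster error and the average "bias" are just the signed differences between each final poll and the election result, averaged across pollsters. Here is a minimal Python sketch, using made-up poll figures rather than the pollsters' actual published numbers:

```python
# Illustrative only: the poll figures below are hypothetical placeholders,
# not the five pollsters' actual final 2019 numbers.
ACTUAL_COALITION_PRIMARY = 41.4  # Coalition's 2019 primary vote, per cent

final_polls = {
    "Pollster A": 38.5,
    "Pollster B": 38.0,
    "Pollster C": 39.0,
    "Pollster D": 38.5,
    "Pollster E": 39.0,
}

# Signed error for each pollster: poll minus result.
# Negative means the poll under-estimated the Coalition.
errors = {name: poll - ACTUAL_COALITION_PRIMARY for name, poll in final_polls.items()}
average_bias = sum(errors.values()) / len(errors)

for name, err in sorted(errors.items()):
    print(f"{name}: {err:+.1f} points")
print(f"Average bias: {average_bias:+.1f} points")  # -2.8 with these inputs
```

Errors all pointing the same way, as they did in 2019, are the signature of a systematic bias rather than ordinary sampling noise, which should scatter in both directions.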

A separate analysis, by University of Sydney political scientists, suggested the ALP's vote had been consistently overestimated for the entire three-year period.

Most of the opinion polls over that period suggested the ALP had support at or above 36 per cent.

According to the analysis from Luke Mansillo and Professor Simon Jackman, which estimated the true support for the party on each day, Labor rarely if ever actually had support that high.

The difference between each poll's finding (the dots) and an estimate of Labor's true support (the shaded section). (Supplied)

But we only knew that in hindsight: their model could only be fitted once the 2019 result was in.

Dr Jill Sheppard from the ANU, who also participated in the AMSRO review, says the biggest problem facing the polling industry is simply that "people don't want to answer polls anymore".

"It's just harder and harder to get people to actually tell us how they're going to vote," she says.

"Even though we do our best to make sure that the samples are representative, they are never going to really be reflective of the average Australian.

"They're always going to be a little bit more politically engaged, a little bit more partisan, a little bit more ideological than your average neighbour or family member."

That's not unusual — it's virtually impossible to get a perfectly random sample when conducting an opinion poll — and pollsters do a lot of work in "weighting" their samples to ensure they are as representative as possible.

But if you don't know you have a sampling bias, you can't correct for it.

"The samples that the pollsters relied on tended to over-represent people with educational bachelor's degrees and higher, so more highly educated people were more likely to participate in the polls," Mr Pennay says.

His inquiry suggested the failure to realise this, and correct for it, was the most likely reason the polls over-represented Labor voters.

Many pollsters — including Essential and Newspoll — are now ensuring they weight their samples based on education level, but not all pollsters disclose their weighting variables.
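As a rough illustration of what weighting by education involves, here is a minimal sketch of post-stratification on a single variable, with hypothetical population benchmarks. Real pollsters weight on several variables at once, often by iterative "raking", and this is not any particular pollster's method:

```python
# Hypothetical population benchmarks (census-style shares).
population_share = {"degree_or_higher": 0.30, "no_degree": 0.70}

# A hypothetical raw sample that over-represents degree holders.
sample_counts = {"degree_or_higher": 500, "no_degree": 500}
n = sum(sample_counts.values())

# Weight for each group = population share / sample share, so the
# weighted sample matches the population's education mix.
weights = {
    group: population_share[group] / (count / n)
    for group, count in sample_counts.items()
}

print(weights)  # {'degree_or_higher': 0.6, 'no_degree': 1.4}
# Each degree-holding respondent now counts for less, and each
# non-degree respondent for more, when voting intention is tallied.
```

The catch, as the inquiry found, is that a correction like this can only be applied to variables the pollster already knows to weight on.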

Do polls better understand support for Anthony Albanese and Scott Morrison now? Time will tell.  (ABC News/AAP)

The horse race

The other big criticism after the 2019 election was that people were reading far too much into the polls, and journalists were framing too much of their coverage around them.

"We'd made polling a default scoreboard for politics, which it never was intended to be, it was never meant to be predictive," Mr Lewis says.

"I do get that the trend lines for a long period had Labor a long way ahead, but it was almost like everyone had been seduced into the fact that the polls were these soothsayers of the future."

"I think polling had gotten lucky over about a decade, because the results were pretty clear cut."

For that reason, some of the pollsters adjusted the way they report their polling.

Essential swapped from publishing a traditional two-party preferred estimate to what it calls "two-party preferred plus".

It leaves undecided voters in the picture, rather than excluding them as all pollsters had done prior to 2019.

"We've got between 8 and 12 per cent of people that just don't know where they're going," Peter Lewis says. "For the state of elegance, polling had taken those out of the sample, so we could give you a nice 52-48."

"The reality was even in the lead up to the election, it was about 49-47 with 6 per cent undecided.

"When that happens you actually disenfranchise the disengaged … there are a whole bunch of people who seriously turn up on the day, take both bits of paper and then make their choice."

Another pollster, Resolve, which wasn't publishing polls prior to 2019, chose not to publish a two-party preferred figure at all.

Instead, Resolve and the Nine newspapers that publish its polls chose to focus their reporting more on public opinion about current issues than on voting intention.

"You still get some reporters that try to build a grand narrative around a 1 or 2 per cent poll movement," Peter Lewis says.

"You've got other newsrooms — who I accept are strapped for resources — but who will uncritically run a news story off a robopoll.

"Something that moves a couple of per cent doesn't actually mean anything at all."

The margin of error

Credible pollsters will always publish a margin of error around each poll estimate, giving you a range of numbers that could reasonably be the true support for each party.

It varies from poll to poll, but Australian national polls tend to have margins of error between about 2 and 3 percentage points.

In other words, if a poll says a party has 50 per cent support, what it's really saying is that the party's support lies somewhere between 47 and 53, with numbers closer to 50 being more likely.
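That range comes from the standard sampling formula, MoE = z * sqrt(p(1 - p) / n). As a rough sketch, assuming a simple random sample (which, as discussed below, real polls are not), a typical national poll of about 1,000 respondents carries a 95 per cent margin of error of roughly three points:

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """95 per cent margin of error, in percentage points, for a simple
    random sample of size n, where p is the estimated share (0-1)."""
    return z * math.sqrt(p * (1 - p) / n) * 100

# A party on 50 per cent support in a poll of 1,000 people:
print(f"{margin_of_error(0.5, 1000):.1f} points")  # ~3.1

# Quadrupling the sample size only halves the margin of error:
print(f"{margin_of_error(0.5, 4000):.1f} points")  # ~1.5
```

That square-root relationship is why bigger samples shrink the error only slowly, and why a 1 or 2 point poll movement sits comfortably inside the noise.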

"To expect the polls to make or to be used, or be able to support a precise prediction of who's going to win the election is really a step too far," Darren Pennay says.

There's another problem, too, with that margin of error. It assumes that the people you're polling are a truly randomly chosen group of voters.

And for the most part, pollsters don't just visit people randomly to ask their opinions. They draw their samples from large panels they've built up over time.

One of the other difficulties is that polling is getting harder, and more expensive, at precisely the time the media industry is contracting and becoming less willing to pay for it.

"When we talk about random samples and probability samples, we're talking about a golden era from the ‘50s or ‘60s that just doesn't exist anymore," Dr Sheppard says. "When we're looking at opinion polls now, maybe 5 per cent of the people who we contact actually give us an answer about how they're going to vote.

"Swinging voters are the most interesting people in our political system. They don't think like people who are rusted-on voters, but also they're much less likely to answer the call when a pollster rings."

Mr Pennay says the margins of error being quoted can give a "sense of false precision".

"They use a shorthand form of ascribing a margin of error to their polls, which isn't really appropriate," he says.

Transparency

One of the good things to come in the wake of the 2019 election was a commitment to more transparency from many pollsters.

The Australian Polling Council was established last year, and membership requires pollsters to publish some basic information about how their public polls were conducted. 

That disclosure includes the full wording of questions, sample sizes and method of conducting the poll, and some basic information about the post-survey adjustment methods.

"Before 2019, the Australian polling companies were notoriously secretive, they didn't talk to academics, they didn't really reveal any of their methods to each other, or to the media who are writing up these stories, and that's a huge problem," Dr Sheppard says.

Of the pollsters publishing national opinion polls, Essential, YouGov Galaxy, and Ipsos are members.

Resolve and Roy Morgan are not.

The mere fact that pollsters are more prepared to talk about their methods publicly is a big step forward.

Mr Pennay says it could go further, though.

"For the pollsters to say, which some of them do, that they have a secret sauce in their weighting ingredients and things like that, it actually does a disservice," he says. "It sounds like a thumb on the scale."

"It's important that people have faith in how they're conducted, and the way to establish that faith is to have a high level of transparency."

So, can we trust polls this time?

There's been a lot of soul-searching, but the big question remains: will the polls be closer to the mark this time around?

Ultimately we won't know until election night, but there are a couple of encouraging signs.

For starters, the limited polling that was published during the recent South Australian campaign seemed to pick the result relatively accurately.

The final Newspoll of the campaign, published just before polling day, missed the two-party preferred result by less than one percentage point.

A different poll by the same organisation in the final week of the campaign, which did not weight its sample by education level, had a bigger error.

Dr Sheppard says she's much more optimistic about the polling industry now than she was in the aftermath of 2019. 

"I think we have to get more used to polling misses, we have to probably recalibrate the way that we think about polling misses," she says.

"If you are still within an accurate prediction of 5 or 10 seats, that's a pretty good outcome."

"To run a really good poll in Australia costs upwards of $100,000, and not many organisations in Australia have that kind of money.

"The polls that you see are often what pollsters will provide to a media company for as little cost as possible, and they're basically advertising campaigns on behalf of that polling company."

And even though polling is getting harder and harder to do well, it's still the best method we've got for understanding public opinion.

"It's almost a bit like democracy itself," Peter Lewis says. "It's the worst way of measuring except for everything else."

"I'd much rather be building insights on this, as opposed to the betting markets or a social media read.

"But yeah, there is no certainty."
