Rolling Stone
Politics
Ryan Bort

What Pollsters Want You to Know About What Went Wrong

“I don’t want you to lose your job, but why should you exist?”

This is what Siena College Research Institute Director Don Levy says a reporter asked him earlier this month after it became clear that he and his fellow pollsters drastically underestimated Donald Trump’s support, four years after making the same mistake ahead of the 2016 election.

Levy seemed gobsmacked by the question, but it wasn’t out of step with much of the media’s reaction to the polling industry’s whiff. “Polling seems to be irrevocably broken,” Margaret Sullivan wrote the morning after Election Day in The Washington Post. “The polling industry is a wreck and needs to be blown up,” added Jake Sherman in Politico. A day later, the headline of a Nick Bilton piece in Vanity Fair described a “polling meltdown.” The New York Times wondered if 2020 had “killed” the polling industry. Jacobin didn’t need to ask: “Polling is dead.”

Public opinion research is a thankless job in the best of times. There are no congratulations for charting the nation’s changing preferences toward a wide range of issues, or for basically predicting the future, as pollsters did, for the most part, prior to the 2018 midterms. But there’s no shortage of scorn if the polls turn out to be less accurate than usual and the media needs a scapegoat for the potential downfall of democracy, as was the case this year.

For pollsters, the fallout from the 2020 misfire marks a painful return to a place they’d spent the past four years trying to avoid: scratching their heads over what went wrong while trying to emphasize the uncertainty inherent in charting horse-race elections. But Americans who have been wracked with stress over the nation’s future don’t have a ton of patience for the nuances of probability theory. “You can’t talk to the public that way,” says Levy. “It’s totally unsatisfying.” 

Levy and the other researchers Rolling Stone spoke to believe they deserve a little more slack than they’ve been given, both in response to 2016 and now 2020. This is probably true, but it’s also hard to deny that the problem has gotten worse. Concerns over what went wrong, why pollsters were apparently unable to fix their mistakes, and whether it’s possible to trust them again as the political landscape contorts beyond recognition are entirely legitimate.

Insulting as it is for pollsters to be asked “why should you exist?,” Americans who feel like they’ve been duped would like an answer.

POLLSTERS HAVE SIMULTANEOUSLY admitted that something went wrong on a systemic level and argued that their miss wasn’t as bad as the initial backlash implied. The polls correctly predicted the winner of the election, as well as the winner of 48 of 50 states (Florida and North Carolina were the two misses). They had Biden winning the popular vote by an average of around 8.5 points, and when the dust settles he’ll have won by 4 or 5 points. “The polling narrative developed relatively quickly,” says Nick Gourevitch of Global Strategy Group, a Democratic polling firm. “I think if we all woke up today and looked at the map and did not have the experience of [election week], the reaction might have been slightly different. I think it was somewhat colored by the way in which we all experienced the results.”

But it was still bad. Just look at Wisconsin. In the run-up to Election Day, several major pollsters had Biden leading the state by around 10 points. An ABC/Washington Post poll, as Trump has gleefully pointed out on Twitter, had Biden up 17 points. The president-elect won the state by less than a point.

The Wisconsin miss was especially frustrating because it was also the most notable polling error in 2016. Hillary Clinton famously stopped campaigning there under the assumption she was so far ahead she couldn’t lose. But she did, narrowly, sealing the presidency for Trump. The terrifying sense of deja vu that set in as Trump led Biden in the state throughout Election Day was responsible for an untold number of armchair expletives hurled at pollsters. Just when everyone thought it might finally be safe to trust them again, they’d blown it — again.

Wisconsin was far from the only misfire. The polls indicated Biden was on track to win big in Arizona, Michigan, and Pennsylvania. He wasn’t. The margins were 3-5 points slimmer than expected in all three states. They said Florida and North Carolina leaned his way. They didn’t. He lost both of them. They said Iowa, Ohio, and Texas would be close. They weren’t. He lost them by 8, 6, and 6 points, respectively.

The polling industry took it on the chin for underestimating Trump’s support in 2016. How could it possibly have underestimated it to an even greater degree four years later?

Pollsters have floated several preliminary theories for what might have happened, the most common being something called nonresponse bias, or the idea that Trump supporters were less likely to engage with pollsters than Biden supporters. It’s a reasonable assumption, considering Trump has had four years to pound a fanatical belief into his supporters that the media is the enemy and the polls are fake.

“It has to be that because the error was consistent across all of us,” says Levy, who recalls some of his pollsters getting hung up on by people who say they support Trump. “The other theory is that we called each other up and said we’re going to bias in favor of Biden over Trump. That doesn’t happen.”

Some have used nonresponse bias to bolster the idea that the issue isn’t so much a flaw in polling methodology as the name at the top of the ticket. “If you take Trump out of the equation, polling is still working,” says Patrick Murray, director of the Monmouth University Polling Institute. “It’s why I feel differently than I did four years ago, when I was like, ‘How did this happen? What happened to the polling industry?’ We looked at it, made some changes in 2018, and said, ‘OK, we’re fine.’”

But not everyone is convinced the problem begins and ends with Trump. Pollster Ann Selzer, the self-proclaimed “outlier queen” who correctly predicted Trump’s margin of victory in Iowa, calls it a “handy explanation.” Barbara Carvalho, director of the Marist Poll at Marist College, agrees. “I think that’s a cop-out for pollsters,” she says. “It suggests we did everything right. I don’t really adhere to that.”

Many have also cited the record number of new voters as a complicating factor. (“I was like, ‘Ugh, who are all these new voters?’” Emerson College polling director Spencer Kimball says of the atypical uptick in North Carolina, where pollsters missed by around three points.) Covid-19 could also have played a role. Not only did it inspire an unprecedented surge in early voting, but some have suggested it may have skewed polling in Biden’s favor: Democrats may have been more likely to take the pandemic seriously, and thus more likely to be at home to answer the phone. “In the last couple of years we’ve invested a ton in polling through other ways than phones because the response rates were dying,” says Gourevitch. “Then Covid hit and they went up, and we were like, ‘OK, we can get another cycle out of this.’”

It’s also possible that 2020 was just a really, really strange year that presented a bunch of variables pollsters weren’t accustomed to managing. “This election had a number of changes in the electorate itself,” Carvalho says. “How we vote and the number of people who voted. Certainly any of us who have been polling in our careers have never experienced either. In order to deal with it we had to make certain assumptions going into it. I think coming out of it we’ll be able to make better assumptions in the future, and improve the science.”

THE COUNTRY, AND THE WORLD, are changing more quickly than ever, and it’s possible they are shifting too quickly for polling science to keep up. As Gourevitch mentioned, response rates are down. So is social and institutional trust. Racial, class, and geographical demographics are shifting rapidly, as is the public’s relationship with the media, which informs its relationship with politics, which informs how people form and hold (or change) their opinions about political candidates.

Selzer says she correctly predicted Trump’s win in Iowa — and has predicted other elections successfully in the past, most notably Barack Obama’s surprise win at the 2008 Iowa caucuses — through what she calls “polling forward,” or taking her findings at face value instead of adjusting them based on historical trends. “The other pollsters have a data-driven model based on all of this analysis and all of these regression equations and they think they have it figured out,” she says. “But there’s a part of me that just says: ‘unless there’s change. How are you factoring in change?’”

Pollsters will be asking themselves these questions in the coming months as they audit their data and consider how to alter their methodology accordingly. “We have to sort out what of it was systematic and that we can do something about,” says Murray, “and what of it was simply that every election is different and the polls could be off a couple of points in either direction because of that and there’s nothing we can do other than try to convey this level of uncertainty that’s inherent in election polling.”

This is what happened after the 2016 election, when the industry — loosely organized under the American Association for Public Opinion Research — determined the miss was due largely to failures to account for Americans without college degrees and the undecided voters who broke late for Trump. When pollsters retroactively adjusted their models to account for these issues, the models spit out the correct results. They then performed far better predicting the 2018 midterms. All was well. Then it wasn’t.

Murray and others may point the finger at Trump, but the American political landscape isn’t going to revert to where it was 10 years ago just because he’s no longer on the ballot. As with so much of what America has gone through over the past four years, Trump is a symptom of a disease. A Gallup poll released in September found that, though Republican trust in mass media has plummeted since Trump rebranded criticism and accountability as “Fake News,” it was trending downward for years before he took office. The pollsters may have done well in 2018 when he wasn’t on the ballot, but they still underestimated Republican support.

Institutional trust among conservatives doesn’t seem like it’s going to bounce back anytime soon. The United States is also diversifying faster than predicted, and fewer Americans are going to college. Meanwhile, all of these people are voting in record numbers. It’s only going to get harder and harder for pollsters to reconcile this divide — if it can be reconciled at all.

POLLSTERS HAVE AT LEAST CLAIMED they’re not bothered by the rallying cries to shutter the industry. “It’s nothing new,” says Mallory Newall, director of polling at Ipsos. “It’s nothing we haven’t seen before.”

“Historically, I feel like we’ve always been a good backup for the media,” Kimball adds of how it’s “kind of our job” to take criticism. “We keep the journalists’ hands clean.”

“It’s always unsatisfying,” Levy says. “In 2018, nobody was calling me up and saying, ‘Hey, Don, you guys did a tremendous job! It’s amazing that you did 100 polls and got 93 of them right!’ Nobody made that call. Now the narrative is that polling is wrong and ‘why should you exist?’”

Most of the pollsters who spoke to Rolling Stone made a point to stress that election polling is only a fragment of public opinion research, which gauges how the nation feels about topics like cannabis legalization, raising the minimum wage, and health care. These polls provide a sense of how American values are evolving, and are used as the impetus to move legislation. “It’s sort of the same as a reporter writing a gossip piece using unnamed sources versus an in-depth investigative piece,” says Newall. “Just because you don’t like what one says doesn’t mean we should throw out the rest of it.”

This doesn’t mean election polling isn’t important or that there isn’t an imperative to understand what went wrong and how to correct it. It isn’t just a problem of methodology, either. Pollsters need to figure out a better way to present their work so as not to instill in the public a false sense of where a race stands, which could very much affect its outcome. If sometimes-voters think it isn’t going to be close, they’re going to be less inclined to head to the polls.

This also falls on the media, including election forecasters like Nate Silver of 538 (who gave Clinton a 71.4 percent chance of winning in 2016, fueling the narrative that the election was practically a lock) and Nate Cohn of The New York Times (whose forecasting “needles” supercharged panic on Election Day as they shifted in the opposite direction from the one umpteen polls had promised).

“If you look at medical studies, you need to be able to have enough confidence in your results so that in 95 to 99 percent of the cases, if someone else were to do this again, they would get the same result,” explains Carvalho. “When you’re looking at the forecasters and they say a candidate has a 77 percent chance of winning, that sounds like a really, really big number. But in statistics, it’s not something you would bet the farm on. The odds are a little more in your favor, but not a lot. I think it’s really, really hard to explain that kind of uncertainty.”

Though the polls failed in the upper Midwest in 2016, they were off by less than 2 percentage points nationally. “In 2016, we were spot on,” Carvalho says.

Silver did a better job framing his forecasts this year than in 2016, even writing a pre-election piece stressing that, though 538 gave Biden a 90 percent chance of winning, Trump could most definitely still win the election. But an essay isn’t going to override what people instinctively feel when they see something has a 90 percent chance of happening, which is that it’s practically a lock. The harsh reality of all this is that the nuances of public opinion election polling — the degrees of variation, the complexity of sampling, the margin for error — simply aren’t very compatible with how most people tend to process information, which is with as little nuance as possible. “The world wants more certainty than is possible to give them,” Gourevitch says. “I think sometimes as a pollster it just makes you want to throw your hands up in the air.”

Recontextualizing election polling might wind up being more difficult than correcting election polling methodology. The latter can be done scientifically. But the media tempering its reliance on polls and how they’re framed could be a matter of sacrificing both revenue and its own authority to give definition to horse-race elections that are constantly changing and largely mysterious. “There is uncertainty in polling,” Gourevitch continues. “It’s not magic. It’s not a soothsayer, predict-the-future device. I think it gets presented as such to the detriment of pollsters and the people that consume it.”

Unfortunately, the smart money is probably on everything staying more or less the same. Campaigns, the media, and the public are desperate for the kind of quantitative veneer of direction polling offers. Everyone is lost in the wilderness during a campaign. Polling offers a light. Even if it’s wrong, it’s comforting. “This is supply and demand and there’s a lot of demand for polling,” Selzer says. “As we get closer to an election, the demand is so strong that a lot of people who will pay for it don’t care if it’s all that good. They just want to be in the circle of people who are doing polls.”

It’s hard to tell whether to feel reassured or disconcerted by the pollsters’ even-keeled confidence that they’re going to be able to diagnose and correct what went wrong. But the larger issue of reshaping the popular conception of polls is going to require the pollsters, the media, and the public to take a step back and alter their perspective on every facet of election polls — from methodology to presentation to consumption. Otherwise, we’re going to be right back where we are now in four years — or in a few months.

“We thought we’d never look at polls again after 2016, but here we are in the same frenzy around polling and being disappointed,” Carvalho says. “Rumor has it a lot of folks are already polling Georgia.”
