The public opinion polling industry appears to be broken. As votes continue to be counted in several battleground states, the partial results tell a different story than the public opinion polls of the final weeks before Election Day, which incorrectly foretold a near-landslide victory for the former vice president and a “blue wave” sweeping the Democratic Party into control of the Senate and an increased majority in the House.
But from all appearances, the internal surveys of both presidential campaigns, which remain hidden from the American public, directed the candidates to travel to and spend millions of dollars in states like Michigan, which one poll said was a lock for Biden, and to stay away from states like Texas that public polls said were up for grabs.
Robert Cahaly, senior strategist and pollster at The Trafalgar Group, one of the few polling organizations to predict a Trump victory in 2016, said that, although he doesn’t have insight into the results of internal campaign polls, he knows that the problem lies with the organizations that conduct the polls the public hears about.
Cahaly told Newsweek that most survey organizations are “VCR pollsters in the Netflix age that are just doing things that are backwards.”
He said polling organizations haven’t changed their methods to match changes in society.
“This idea that you’re going to call somebody at home at 7 o’clock at night and they’re going to stop what they’re doing, stop washing dishes, stop putting children to bed and talk to somebody to take a poll for 15 or 20 minutes, with 25, 30, 45 questions, it’s insane,” he said. “It’s not happening. You cannot capture average Americans doing that. Nobody lives that way anymore. It’s too fast-paced a life.”
Trafalgar has been successful, Cahaly said, because his organization built questionnaires that take 3-5 minutes to complete, with 7-9 questions, accompanied by models that incorporate a voter “fingerprint” of 57 characteristics.
He said that other confounding elements keep most pollsters from getting Trump voters talking.
Trump is a candidate “who’s smack in the middle of the social desirability bias,” Cahaly told Newsweek. He described that bias as the tendency of a respondent to fear a negative judgment of their true opinion and to avoid it by answering in a way they believe makes them look better in the eyes of the questioner. Put simply, people tend to tell pollsters what they think the pollster wants to hear.
Some Trump supporters likely gave pollsters answers that weren’t necessarily true, while some simply declined to participate in surveys.
For those who watch these races closely, there is another question: Why do the private polls done for the candidates differ so much from the polls that are available to the public?
The disparity between campaigns’ internal private polls and the public surveys that proliferate in the news during presidential campaigns has contributed to increasing public distrust of election polls, a crisis of confidence.
According to an October 2019 Hill TV/Harris poll, 34% of respondents said they believe most results of political polls but don’t trust certain sources. Another 32% said they don’t believe most results but do trust certain sources. An additional 22% said they almost never believe poll results, compared to 11% who said they almost always believe them.
For evidence of the polling industry’s failure to accurately predict the outcomes in many places and races, look no further than the ABC News/Washington Post surveys of likely Michigan and Wisconsin voters conducted October 20-25 by Langer Research Associates. The Wisconsin poll found Biden with a 17% lead over Donald Trump. As of Friday morning, Biden was projected to be the victor there with a lead of less than 1% of the counted votes.
In that same Langer Research Associates poll done for ABC News and the Washington Post, Biden had a 7% lead in Michigan, where it was projected as of Friday morning that he had won by 2.7%, with 99% of the votes reported.
What did the campaigns know that the rest of us didn’t? Through October, and into November, Trump and Biden continued to visit Wisconsin and Michigan.
Gary Langer, a former newsman and former director of polling at ABC News, is now president of Langer Research Associates, the firm that designed and conducted the deficient Michigan and Wisconsin polls. He explained his perspective on the surveys’ results.
Langer said that a “pre-election poll should not be considered predictive unless it’s conducted in the very final few days before Election Day.”
Langer Research’s Michigan and Wisconsin polls were conducted October 20-25, eight to 13 days away from election eve, Langer told Newsweek in an email. But an Insider Advantage/Center for American Greatness poll of Michigan voters conducted October 30-31 found Biden had a 2% lead, much closer to the 2.7% lead in votes counted as of Friday morning.
Neither Insider Advantage nor the Center for American Greatness responded to requests for comment.
Another Langer Research poll, conducted in Florida October 24-29, was also too far from Election Day to carry much predictive value, Langer said. It found Trump ahead of Biden by 2%. According to unofficial Florida results Friday morning, with 99% of the vote reported, Trump, ahead by 3.3%, was declared the winner.
Following the debacle of the 2016 election, when nearly every public opinion poll got the election wrong, pollsters assured the public that they had made adjustments so that the errors would not be repeated in 2020. But clearly errors were made, even if they weren’t the same errors that were made in 2016.
So how did the pollsters get it so miserably wrong again?
“We know the polls were off. We’re just not sure by how much, and we won’t know that until all the vote is counted, including in places where it doesn’t matter, like New York and California,” Patrick Murray, director of the Monmouth University Polling Institute, told Newsweek. “It’s going to take us some time to figure out what actually happened here.”
Pollsters were evidently confounded by the unprecedented early voting situation, Murray said, and survey organizations need the final election results and the raw data, to ascertain how many mail-in ballots were rejected or never received, before a full explanation of what went wrong can be produced.
FiveThirtyEight, a highly respected poll aggregator and data-journalism site, listed the results of more than 300 national presidential polls released in October. Not all of those pollsters got it wrong.
An Emerson College poll of likely North Carolina voters conducted October 29-31 was nearly spot-on. The poll found Trump and Biden in a dead heat, tied with 47% each. As of Friday morning, Trump led by 1.4% in North Carolina.
Emerson’s statewide presidential poll in Maine was close to the mark as well, finding Biden leading Trump by 9%, 54% to 43%. Friday morning’s unofficial count showed Biden leading the president by 8.7%, 52.9% to 44.2%.
“The poll was weighted to reflect the turnout for the presidential election,” Spencer H. Kimball, director of Emerson College Polling, said in an email.
But Emerson was among the pollsters that got it wrong in the U.S. Senate contest in Maine. Incumbent Republican Susan Collins beat her Democratic challenger Sara Gideon by an estimated 8.8%, according to unofficial results as of Friday morning.
All 11 polls released in October and aggregated by FiveThirtyEight gave Gideon a lead ranging from 1% to 8%. Similarly, in RealClearPolitics’ list of 14 polls since February, Gideon had leads ranging from 1% to 12%. An Emerson poll of 823 likely voters conducted October 29-31 found Gideon leading Collins by 6%.
Kimball said the poll fell outside its 3.3% margin of error, the measure of its statistical accuracy, making it an outlier, likely because the undecided voters, 4% in Emerson’s survey, broke overwhelmingly for Collins; he estimated that 97% of them cast their ballots for the senator. Had pollsters known how the undecideds would break for the Republican incumbent, he said, the poll would have come much closer to predicting the race’s outcome.
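Kimball’s point about late-deciding voters can be sketched with quick arithmetic. The topline numbers below are hypothetical, since the article reports only the 6% lead, the 4% undecided share and the 97% break toward Collins, but they show how a heavily one-sided undecided vote compresses a published margin:

```python
# Illustrative arithmetic with hypothetical topline shares (not
# Emerson's actual crosstabs): reallocating undecided voters who
# break 97%-3% toward the trailing candidate.
gideon, collins, undecided = 49.0, 43.0, 4.0  # assumed percentages

# Give 97% of the undecideds to Collins, the remaining 3% to Gideon.
collins += 0.97 * undecided
gideon += 0.03 * undecided

margin = gideon - collins  # positive = Gideon lead
print(round(margin, 2))  # the 6-point lead shrinks to about 2.24
```

Even with the full 97% break applied, this hypothetical margin stays positive for Gideon, consistent with Kimball’s caveat that accounting for the undecideds would only have brought the poll “much closer” to, not all the way to, the 8.8% Collins win in the count.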
Murray summed up the polling process in two sentences.
“Polls are measuring voters who say they voted,” Murray said. “We have no way to determine if the vote was actually counted until after the fact.”