What's the problem with polls these days? Experts take a stab at answering

Minneapolis public opinion expert Bill Morris was not impressed by the crosstabs on SurveyUSA polls released over the summer.

In recent months, MinnPost’s David Brauer and I have both posted stories questioning the accuracy of several different public opinion polls by supposedly reliable firms that reported surprising and contradictory numbers.

In June, I reported that the Democratic-leaning Public Policy Polling showed a 10-point shift in opposition to the proposed constitutional amendment banning same-sex marriage. Barack Obama had just voiced his support for gay marriage, so that bump made sense.  

The following month, Brauer and I wrote separate pieces probing two SurveyUSA polls about the proposed amendment, one in May and the other in July, that showed a 20-point swing in the other direction. Perplexed, we each called on different experts who said the numbers seemed odd to them.

In my story, Minneapolis public opinion expert Bill Morris, principal of Decision Resources, said that the crosstabs — the data subsets breaking things down by age, political affiliation and so forth — seemed completely wrong to him. Way too few young people said they’d be voting no, for example, and the standard gender gap was missing.

Brauer’s story quoted the pollster saying he disagreed with the media’s take-away from his survey, and broke down the quirks in its methodology.

Brauer took another detailed look at the topic in mid-September, when the two firms again released numbers: “SUSA has the anti-gay-marriage amendment claiming a majority even without undecideds, 50-43, while PPP posited a nailbiter, 48-47,” he wrote.

A fascinating look at polling chaos

What gives? A recent New York magazine story offers a fascinating, compulsively readable feature on the chaos plaguing polling, “The. Polls. Have. Stopped. Making. Any. Sense.” That headline is the 46-character tweet sent out by polling wunderkind Nate Silver after a post-Democratic National Convention poll showed Obama beating Mitt Romney decisively in scarlet Wisconsin and Romney besting Obama in New Hampshire.

Silver, of course, is the onetime baseball-stat geek whose uncanny knack for calling elections dead-on got his blog, FiveThirtyEight, picked up by the New York Times. He has a new book out, “The Signal and the Noise: Why So Many Predictions Fail.”

The New York piece is truly worth a read. Its gist, which isn’t done justice by condensing here, is that while Americans are increasingly hungry for poll data, fewer and fewer media outlets are paying for polls, leaving the field to pollsters with skin in the game.

'Polling's dark age'

At the same time, conventional polling methodologies are increasingly outdated, to the point where the percent of potential voters contacted who actually agree to answer poll questions may be in the single digits. And without reliable mechanisms for compensating, outcomes are ever more dependent on “weighting,” or the pollster’s statistical adjustment for variables.

To wit:

“The rising demand for trustworthy polling analysis also reflects something disturbing about the data itself. The central problem is that prototypically modern science is being disrupted by new technologies, which have created a flood of new firms and new methods. ‘We’re in sort of what I would call polling’s dark age,’ says Jay Leve, who runs the polling firm Survey USA. ‘We’re coming out of a period of time where everyone agreed about the right way to conduct research, and we’re entering into a time where no one can agree what the right way to conduct research is.’”
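Weighting itself is simple arithmetic: the pollster scales each demographic group's responses so the sample's composition matches known population shares. A minimal sketch of that adjustment, using invented age groups and percentages purely for illustration (not figures from any of the polls discussed):

```python
# Hypothetical post-stratification weighting: if a sample under-represents
# young respondents, each young respondent is counted more heavily so the
# weighted sample matches assumed population shares.

population_share = {"18-34": 0.30, "35-54": 0.35, "55+": 0.35}
sample_share     = {"18-34": 0.15, "35-54": 0.35, "55+": 0.50}

# Weight for each group = population share / sample share.
weights = {g: population_share[g] / sample_share[g] for g in population_share}

# Invented "yes" support by age group in the raw sample:
support = {"18-34": 0.40, "35-54": 0.50, "55+": 0.60}

raw_estimate = sum(sample_share[g] * support[g] for g in support)
weighted_estimate = sum(sample_share[g] * weights[g] * support[g] for g in support)

print(round(raw_estimate, 3))       # unweighted support
print(round(weighted_estimate, 3))  # support after weighting
```

With these made-up numbers the unweighted and weighted estimates differ by three points, which is exactly the kind of gap that separates one pollster's results from another's when their weighting assumptions diverge.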

And the release of a poll can have an immediate effect on a candidate or campaign, particularly late in the game when undecideds may find it tempting to join what looks to be the winning side.

Will this October’s “surprises” be polls skewed to affect the outcomes of various contests? Stay tuned.

Comments (5)

Psychological warfare

In other words, polls are not actually polling the people for their answers, but selecting people for the answers they'll give. How they're selected and how the questions are asked are directly determined by the results pollsters (and their political employers) want. The polls are then used to sway the public into believing one scenario or another. And it's not just the 'enemies' that are being manipulated, but the 'friends' of the candidates involved. After all, your supporters can be motivated by either being behind or ahead of the polls, depending on the response you need. Psychological warfare...

We love polls; we hate polls

I think much of the problem is the love-hate relationship we have with polling in general. A good poll relies on people telling the truth. And because the sampling is usually so small, it doesn't take much to skew the results. We all want to read a poll, but many people so dislike polling that they give false answers or don't participate. I have several friends who say they intentionally give false answers just to muck things up. This is their way of staging a micro-rebellion.

Or, in some cases, I know people who want to make sure voters show up; if the election looks like a landslide, one way or the other, people stay home. So if someone using that logic opposed the marriage amendment, they might say they support it in a poll to make the race look close.

Also, robocall polling tends not to reach people in the swayable middle. When I get a poll, I always regret it. I'm a socially liberal Democrat. I know I'm going to vote. I want to know how someone else will vote. But a lot of people who vote do so out of civic responsibility, and they are so tired of polls, news and ads that they just tune them all out.

the Gallup poll

In the middle of September I was polled by Gallup. It was an odd experience. The pollster began with the usual question: "If the election were held today, would you vote for...?" But then the pollster switched to how satisfied I was. I thought it was a political question and answered that I was basically dissatisfied. Then I faced dozens of questions about my health: whether I had diabetes, cancer, back pain, asthma, joint pain, on and on. Next, the pollster went into my mental health: whether I was sad, upset, worried, depressed, on and on. This became the longest poll I've ever experienced in an election year. Finally, the pollster returned to politics, asked which party I belonged to and how conservative I was, and concluded with some demographic-type questions. Later, when I saw the results published, there was nothing about all the health information. What was that about?

I wonder if the latest Gallup poll

Has something to do with this piece's appearance:

Obama and Romney tied at 47%

Good polls are expensive and there are now technology issues

Polling and survey research is getting more challenging all the time.

To be statistically valid, a random sample of the population must be selected in sufficient numbers to give you a fair confidence interval and confidence level. The key is random sampling within the valid population.
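For the textbook case of a simple random sample, that confidence interval is one line of arithmetic. This sketch uses the standard formula for a proportion, with an invented sample size, to show why national polls of roughly 1,000 people quote a margin of error near three points:

```python
import math

def margin_of_error(p, n, z=1.96):
    """Half-width of a 95% confidence interval for a proportion p
    estimated from a simple random sample of size n (z = 1.96)."""
    return z * math.sqrt(p * (1 - p) / n)

# A 50/50 split among 1,000 respondents, expressed in percentage points:
moe_points = 100 * margin_of_error(0.5, 1000)
print(round(moe_points, 1))  # roughly 3.1 points
```

Note that the formula shrinks only with the square root of n: quadrupling the sample merely halves the margin, which is one reason good polls are expensive.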

If you can't reach part of the population (people with only cell phones, for instance), if you don't pick a large enough sample, or if you introduce non-randomness through self-selection, you get respondent bias unless you reach 50% or more of a specifically randomly selected group.

What has been introduced more and more is "normalizing for population," and that is very likely to make your sample non-random, because once you sub-select, you will likely have too few people in some category to be representative of that group.

Random sampling only works if all members of a population have an equal chance of being selected. If they don't, you may have a non-normal distribution, and the normal rules of statistical analysis must be modified. That can be done, but it is well beyond most pollsters.

All you can do when you look at a poll is determine whether any bias has been introduced; if it has, you are likely to get the infamous "Dewey Defeats Truman" headline.

Good primary data collection is a useful tool but it doesn't happen often.

I suspect that the Gallup poll combined questions much like the Minnesota Center for Survey Research does on its annual survey.