
Who's a 'likely voter'? That's a tricky question for pollsters this year

Pollsters are struggling to figure out who's likely to vote for these men.
REUTERS/Jim Bourg

Another day, another poll.

That's the first sentence of a recent story on the Star Tribune's website, and it's just as true as it is trite.

Tuesday's news brought a Quinnipiac poll showing Obama ahead of McCain by 2 points. The day before, Rasmussen reported a one-day poll that put Obama ahead by 8 in Minnesota. A "Big Ten Battleground Poll" conducted last week also showed Obama ahead by 2. The Strib's Minnesota Poll recently showed a dead heat. Who's right? Or are they all right, since the polls were taken at different times?


By election time, those results will be old news, but any researcher worth his or her salt will tell you that measuring candidate support accurately in a pre-election poll is tough. And this may be the toughest year pollsters have faced in decades. What makes it tougher this time?

First, it's a unique election year. This is the first election since 1928 in which no incumbent president or vice president is running. Another unique trait: one candidate is white and the other black. One is younger and the other much older. Then add in the potential of a massive, perhaps record-breaking turnout. Voilà: a unique challenge in figuring out who will vote and how to measure support.

Pollsters use different "likely voter" models, and those who have been around the block a few times keep track of how their models perform in various elections. They use some for high-turnout elections, some for low. Some use screens to eliminate unlikely voters. Some weight all respondents, counting responses from likely voters more heavily and responses from those less likely to vote less. There's no industry-standard "right way" to model a likely electorate, and virtually all pollsters have their favorite method.
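To make those two approaches concrete, here's a minimal sketch in Python. The survey fields, scoring and cutoff are invented for illustration; real likely-voter models, such as Gallup's well-known seven-question scale, are more elaborate and often proprietary.

```python
# Illustrative sketch of the two likely-voter approaches described above.
# The fields, scoring and cutoff are hypothetical, not any pollster's
# actual (often proprietary) model.

respondents = [
    {"choice": "Obama",  "interest": 3, "voted_2004": True,  "registered": True},
    {"choice": "McCain", "interest": 2, "voted_2004": True,  "registered": True},
    {"choice": "Obama",  "interest": 1, "voted_2004": False, "registered": True},
    {"choice": "McCain", "interest": 0, "voted_2004": False, "registered": False},
]

def turnout_score(r):
    """Crude 0-5 likelihood-to-vote score: interest (0-3) plus past vote
    and registration."""
    return r["interest"] + int(r["voted_2004"]) + int(r["registered"])

# Approach 1: a screen. Drop everyone below a cutoff; count the rest equally.
likely = [r for r in respondents if turnout_score(r) >= 3]
screened_obama = sum(r["choice"] == "Obama" for r in likely) / len(likely)

# Approach 2: weighting. Keep everyone, but count each respondent in
# proportion to an assumed turnout probability (score / max score).
weights = [turnout_score(r) / 5 for r in respondents]
weighted_obama = sum(w for r, w in zip(respondents, weights)
                     if r["choice"] == "Obama") / sum(weights)

print(f"Screened: Obama {screened_obama:.0%}; weighted: Obama {weighted_obama:.0%}")
# The two models can disagree: here the screen says 50%, the weights ~64%.
```

As the toy output shows, the same interviews can yield different horse-race numbers depending on which model a pollster favors.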

Scott Keeter, the Pew Research Center's director of survey research, has this take on it: "We are analyzing and tinkering with our likely voter scale in an effort to make sure that it is appropriately sensitive to the high level of enthusiasm for Obama's campaign among young people and African-Americans. … But we know that voter turnout among young people rose a lot from 2000 to 2004, and might rise again this year if the primaries are any indication."

The MPR-Humphrey Institute Poll also is tinkering with its voter model to reflect the increased interest in this election, according to Joanne Miller, an associate professor of political science who works with the poll.

But will theirs and the others' hold up in this unique, high-turnout election?

And how are pollsters meeting other challenges, which range from record levels of cell-phone-only households to low response rates?

Cell phones
Will cell-phone-only households be the pollsters' Achilles' heel this election? No one knows yet, but academic research and new evidence from the ABC News and Pew Research Center polls offer some hints. The most recent federal government estimates say about one in six households is cell-phone-only, or "CPO" in pollsters' parlance.

Many pollsters still use only land-line samples as they've done successfully for decades.

The Pew Research Center, Gallup and ABC News have begun including cell phones in their samples. That's a good thing, many think, but it's expensive: Recent estimates find that it costs about three times as much to conduct cell-phone interviews as traditional land-line ones.

In the past week, ABC News and Pew have released analyses comparing polling with and without cell phones. One thing we know from the federal research: Those in CPO households are much younger, on average.

And the two agree on one thing: The differences are slight, but there's likely an Obama edge when cell phones are included in the sample.
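The arithmetic behind that blending is easy to sketch. In the toy calculation below, only the one-in-six CPO share comes from the federal estimate cited earlier; the candidate shares are made up to show the mechanics.

```python
# Back-of-envelope blend of a land-line sample with a cell-only sample.
# Only the one-in-six CPO share comes from the federal estimate cited
# above; the candidate shares are invented to show the mechanics.

cpo_share = 1 / 6        # cell-phone-only households
landline_obama = 0.48    # hypothetical Obama share in the land-line sample
cellonly_obama = 0.58    # hypothetical share in the cell-only sample (skews young)

blended = (1 - cpo_share) * landline_obama + cpo_share * cellonly_obama
print(f"Land-line only: {landline_obama:.1%}; blended: {blended:.1%}")
# Blending nudges the estimate up about 1.7 points -- a slight edge, in
# line with what ABC and Pew describe.
```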

Race
It's not just the political situation that's unique in this presidential campaign. A contest between a black candidate and a white one is something researchers can't ignore.

"We are keeping an eye on possible race-of-interviewer effects in an effort to gauge whether there is any hidden racial bias in our polls," the Pew Center's Keeter said.

Race-of-interviewer effect?

That's where some poll respondents answer questions one way when they're posed by phone interviewers they perceive to be of a different race, and another way when the interviewers are perceived to be of their own race. Keeter saw a strong effect of this kind in the 1989 Virginia gubernatorial race won by Doug Wilder. There also was an interviewer effect in the Star Tribune's polling for the 2001 Minneapolis mayoral election, in which black incumbent Sharon Sayles Belton lost to white challenger R.T. Rybak.

Nowadays the phenomenon is usually called the Bradley effect, after Tom Bradley's loss in the 1982 California governor's race. Keeter says it is difficult to know whether racial attitudes will contribute to a potential underestimate of support for McCain.

Response rates
Response rates, one of many indicators researchers look at to discern whether they have a good sample, continue to be low after dropping for decades. (Thanks, Arianna Huffington, who told voters to hang up on pollsters.)
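For the curious, the pollsters' trade group, the American Association for Public Opinion Research, publishes standard response-rate formulas. Here is a sketch of the most conservative one, RR1, with invented disposition counts; real calculations distinguish many more disposition codes.

```python
# Sketch of AAPOR's most conservative response-rate formula, RR1:
# completed interviews divided by every eligible or possibly eligible
# case. The disposition counts below are invented for illustration.

completes    = 900     # I:  completed interviews
partials     = 100     # P:  partial interviews
refusals     = 1500    # R:  refusals and break-offs
non_contacts = 1200    # NC: never answered, answering machine, etc.
other        = 300     # O:  other eligible non-interviews
unknown      = 2000    # UH + UO: eligibility never determined

rr1 = completes / (completes + partials + refusals +
                   non_contacts + other + unknown)
print(f"RR1 = {rr1:.1%}")   # 15.0% -- squarely in the "low" range lamented above
```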

For decades, researchers have worried about non-response bias: the error introduced into polls when the people who complete interviews differ in important ways from those in the sample who should have been interviewed but weren't.

Why such low response rates nowadays?

Busy lifestyles are part of the problem. But perhaps more important, technology has blossomed, and computers are used to poll, sell, fundraise and push political candidates into the home through the phone lines. Telemarketers, including unethical political telemarketers who "push poll," continue to poison the polling well by trying to sell under the guise of research or to spread negative information about candidates. Many irritated voters now lump research, political and sales calls into one category: junk to be screened out with answering machines and caller ID.

The good news for pollsters is that recent research suggests that political measures are less sensitive to differing response rates than some other measures.

Still, most competent researchers don't ignore response rates, and they take every precaution to keep them as high as possible. Miller, the MPR-Humphrey Institute pollster, said they're paying extra attention to response rates this year.

Internet polling
Some pollsters — Zogby is one — have been experimenting with using Internet panels to do pre-election polling. So far in Minnesota and nationally, Internet samples have proven less accurate than traditional telephone surveys.

For example, the National Council on Public Polls reported that in 2004, Internet polls had the worst record of any mode, while national media polls using traditional methods were quite accurate. In Minnesota that year, the Star Tribune's Minnesota Poll, using traditional land-line telephone polling, was only a half-point off, making it the most accurate poll in the state and one of the most accurate in the nation.

Have pollsters using Internet samples been able to find the statistical silver bullet since 2004? They've had four years to sort things out.

Early voting
By Election Day, some estimate that a third of the electorate already will have voted. Are pollsters measuring them now?

Some pollsters ask about early voting and take it into account when they tabulate their results. The national exit poll consortium does pre-election polling precisely so it can fold early voting into its results and projections. Pollsters in high-turnout states who aren't taking early voters into account may be setting themselves up for bias if early voters differ at all from Election Day voters.
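One straightforward way to do that folding is to tabulate early voters and Election Day likely voters separately, then combine them by their expected shares of the electorate. A minimal sketch, using the one-third estimate above and invented candidate shares:

```python
# Sketch of folding early voters into a topline: tabulate the two groups
# separately, then combine by their expected shares of the electorate.
# The one-third share comes from the estimate above; the candidate
# shares are invented.

early_share = 1 / 3    # expected share of the electorate voting early
early_obama = 0.55     # hypothetical Obama share among those who already voted
eday_obama  = 0.49     # hypothetical share among remaining likely voters

topline = early_share * early_obama + (1 - early_share) * eday_obama
print(f"Combined Obama share: {topline:.1%}")   # 51.0%
# Ignoring the early group and reporting 49% would understate Obama by
# two points in this made-up example -- exactly the bias risk described above.
```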

Despite all of these challenges, some pollsters aren't doing anything differently. SurveyUSA polling guru Jay Leve, who polls in Minnesota for KSTP-TV, said he hasn't made any "precipitous changes" in methodology this year, on the theory that you don't fix what isn't broken.

Will they overcome the current challenges?

Pollsters say their profession has a history of overcoming obstacles. Gallup and others overcame the problems of the "Dewey Defeats Truman" election of 1948.

Then the "technology" criticism raised its head four decades ago when interviewers began telephone polling instead of conducting surveys face-to-face. Pollsters figured out how to make that work.

Rob Daves is principal at Daves & Associates Research in Minneapolis, and teaches survey research at the University of Minnesota's Humphrey Institute. He is a past president of the American Association for Public Opinion Research and former director of the Minnesota Poll.



Comments (2)

Rob,
This is an outstanding piece of public journalism that deserves wide circulation. Even well-informed citizens tend not to use their interpretive powers to put the poll headlines into context.
Thanks for providing readers with a handy little primer.
Monte

Over the last week or so, the sites electoral-vote.com and 538 have mentioned the Selzer polls & how her polling is different from the big guys. The Selzer polls have a reputation for being very accurate, allegedly due to a superior methodology for properly identifying & weighting the youth vote. It might be interesting to compare/contrast various pollsters' methodologies - if you can get them to provide some details of their voter models.