
Poll at Your Peril

This column first appeared in the Harvard Independent.
In the weeks before Election Day, we were besieged by polling data, breathlessly conveyed as breaking news by unimaginative journalists. This might seem rather benign, a mild diversion for political obsessives. But I’m not sure polls are quite so innocent. We either need to train a more critical eye on opinion polls and become informed consumers of their data, or start ignoring them altogether.
The problem is that journalists, pundits, politicians, interest groups, and citizens usually take polls at face value—sometimes because it’s in their interests to do so. If a poll says 55% support Jones, then Jones must have 55% support. QED.
But it isn’t so simple. For one thing, it’s usually a bad idea to draw firm conclusions from a single poll; individual polls can go grievously wrong. Consider a Rasmussen poll that predicted a 13-point victory for Sen. Daniel Inouye (D-HI) a few weeks before the election. (Inouye won on Tuesday by over 50 points.) Then there’s the September poll conducted for PJTV, a right-wing Internet TV channel, which found that a third of African American likely voters would support a Tea Party candidate. A single poll like this one, mixed in with some wishful thinking and self-interest, can, for some people, outweigh the well-known realities of American politics.
The reason polls have this power is that they have a scientific ring to them. But what makes the best of them scientific often doesn’t apply to run-of-the-mill political polls. I’m not referring only to baldly partisan polling, which the two parties churn out in order to drive their preferred narratives about certain races. (Rasmussen, for its part, is a technically nonpartisan but Republican-friendly pollster; Nate Silver, of the FiveThirtyEight blog, estimates that their polls had a three- or four-point Republican tilt this year.) There are factors besides a poll’s provenance that should make us suspicious of seemingly straightforward results.
One major problem in the average political survey is forced choice: Pollsters offer only options like agree or disagree, support or oppose, but not “I don’t know” or “I haven’t thought much about it.” Another recent PJTV poll asked likely voters whether they supported or opposed the Tea Party; there were no other options. The not-so-surprising result was that more than half the country supports the Tea Party. But this finding starts to look a little shaky when you consider that a recent Newsweek poll found that more than a quarter of registered voters have not read, heard, or seen anything about the Tea Party. Apparently a lot of the people PJTV was picking up as either supporters or opponents were just hearing about this “Tea Party” thing for the first time.
Americans just aren’t as opinionated as opinion polls assume and require them to be. When polls explicitly offer an option like “I don’t know” or “I haven’t thought much about it,” people often take it. In 2002, the National Election Study found that about a third of Americans admitted not having thought much about the Bush tax cuts, the central domestic policy initiative of the past year. Or consider a CBS News poll from late August, during the “Ground Zero mosque” nonsense, which asked about Americans’ impressions of Islam. Thirty-seven percent reported not having heard enough about the religion to say.
It’s easy for those who are afflicted with the political bug to forget that their fellow citizens just don’t pay that much attention to politics and don’t have strong opinions on every issue. But opinion polling has often implied the opposite: that Americans have well-formed views on everything under the sun.
Then there’s the issue of question wording, which has a tremendous ability to introduce bias into poll results. Compare a couple of polls taken in June, during the Gulf oil spill, measuring attitudes towards offshore drilling. CBS News asked whether respondents favored increased drilling off the coast, or thought “the costs and risks are too great.” Just 40% favored drilling, and 51% said the costs are too great. But Ipsos presented two options: either offshore drilling is “necessary so that America can produce its own energy,” or it’s a bad idea “because of the risks to the environment.” Sixty-two percent said drilling is necessary; just 32% said it’s a bad idea.
CBS News, of course, reported that a “majority now opposes more offshore drilling.” Ipsos, meanwhile, concluded that support for offshore drilling wasn’t budging “despite increased coverage and environmental fallout from the spill.” Which conclusion you believe depends on which question wording you prefer—or, more realistically, which conclusion you want to peddle. The inevitable variation between polls with different question wordings enables almost all interested parties to claim the public’s support for their positions.
Now, while I have focused on issue polling, it’s worth noting that polling averages are generally pretty good at predicting electoral outcomes. But a few major caveats are in order. First, the media often doesn’t report polling averages; it reports lone poll results, like Gallup’s outlandish prediction that Republicans would best Democrats by 15 points in the overall congressional ballot (it looks to be closer to seven points).
Second, even polling averages can systematically fail. They predicted about a three-point victory for Sharron Angle over Sen. Harry Reid (D-NV), but Reid pulled another rabbit out of his hat and won by five points. One possible explanation, says Silver, is that pollsters didn’t pick up a lot of unenthusiastic Reid voters (is there any other kind?) because such people were unlikely to complete the polls. The flip side of this problem can be seen in Colorado, where polls suggested radical-right independent candidate Tom Tancredo would come closer than he did, possibly because his voters were very enthusiastic and likely to respond to pollsters.
I should clarify, in closing, that this is not an argument against polling per se. Nonpartisan, transparent, methodologically sound polling is absolutely useful. It can tell politicians what people think, and tell citizens what their neighbors think. And it is fun for the political class to follow this stuff, and, yes, I confess that it’s fun for me too. But the current cacophony of opinion polls is distracting, and the indiscriminate way in which the media reports on them is misleading. So let’s learn either how to read polls or how to ignore them.
