In polling on questions of public policy, is it better to look at samples of all adults or likely voters? The answer may not be obvious, but understanding the difference between the two helps explain a big source of divergence in polling.
In the context of a pre-election poll aimed at predicting the outcome of a campaign (or, if you prefer, understanding the dynamics of an election), the answer is obvious: To paraphrase Republican pollster Glen Bolger, likely voters are the Americans who are, um, likely to vote. Pollsters may disagree on the mechanics of identifying the likely electorate, and about the accuracy of such selections many months before an election, but the theory is sound. If you care about an electorate, you care about "likely voters."
But if your aim is to measure public opinion with respect to issues of public policy, the answer is not quite as obvious. Should elected officials pay heed to the needs and desires of all their constituents, or just the ones who vote? And putting aside the difficulty of identifying those who actually vote, if we choose to ignore nonvoters, which "electorate" should we sample? The 131 million who voted for president in 2008? The 86 million who voted in the 2006 midterm elections? The unknown number who will choose to vote this fall?
We often see polls report results for "likely voters," and will certainly see far more over the next six months, but the underappreciated reality is that no two pollsters define the likely electorate the same way.
The task of narrowing from all adults to those deemed most likely to vote usually has the effect of making the resulting sample older, wealthier and more white, since younger, minority and downscale Americans turn out to vote less often, especially in midterm elections. Since those demographic groups are also the most supportive of President Obama and the Democrats, the shift to a likely voter sample usually makes the results better for Republicans.
This pattern was most evident in polling on health care reform. Consider, for example, a Kaiser Family Foundation survey conducted a month ago, about a week before the House of Representatives passed the reform bill. Among all adults, 46 percent said they "generally support" and 42 percent said they "generally oppose" "the health care proposals being discussed in Congress." But as the following chart shows, the numbers flipped among the 80 percent who said they were registered to vote: 43 percent support, 46 percent oppose.
The Kaiser poll also included a question about past voting in midterm elections, and when we used that to narrow the sample to the 41 percent who said they "always vote," opposition to health reform increased to 51 percent and support fell to 41 percent.
Some might quarrel with my choice of this particular poll, since Kaiser found less opposition to health reform than other surveys of adults fielded at about the same time. What interests me most in this case, however, is the clear pattern it shows by registration and past voting: The more likely respondents are to vote, the less likely they are to support health care reform.
While the comparisons are not as clean given the wide variety of questions asked about health reform, this pattern helps explain why surveys that sample only "likely voters," like the Rasmussen Reports automated surveys, typically show more opposition to health reform than surveys of adults.
With health reform, the issue is not just the demographics of voting, but also of insurance. Consider the same numbers from a different perspective: The adults who told the Kaiser interviewers that they are not registered to vote support health reform by a more than 2-to-1 margin (56 percent to 26 percent). That's probably because 38 percent of non-registrants also said they are not covered by insurance, compared to just 15 percent of registrants.
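As a quick arithmetic check, the all-adults topline should simply be the registration-weighted average of the two subgroups reported above. A minimal Python sketch (the function name and the rounding step are my own, not Kaiser's):

```python
# Hedged sketch: verify that the all-adults topline is the
# registration-weighted mix of the two subgroups reported in the column.

def weighted_topline(groups):
    """Combine subgroup percentages, weighted by each group's share of adults."""
    return sum(share * pct for share, pct in groups)

# 80% registered (43 support / 46 oppose), 20% not registered (56 / 26)
support = weighted_topline([(0.80, 43), (0.20, 56)])  # 45.6
oppose = weighted_topline([(0.80, 46), (0.20, 26)])   # 42.0

print(round(support), round(oppose))  # matches the 46/42 all-adults split
```

The exercise shows why narrowing the sample moves the topline: dropping the 20 percent who are unregistered removes a disproportionately pro-reform (and disproportionately uninsured) slice of the public.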
Which brings us back to the question of political philosophy: Which "public opinion" should elected officials watch most closely, that of all Americans or just those most likely to vote in the next election?
That debate is far bigger than this column, but I can offer one observation: Judged by their rhetoric, politicians from both parties pay far more lip service to the needs of all Americans than to those most likely to vote.
I went back to the transcript of the televised, bipartisan health care summit held in February. During seven hours of discussions, the president and congressional leaders used the phrase "the American people" exactly 52 times, and more than two-thirds were explicit references to public opinion: what "the American people" want, say or believe.
Only once did anyone use the word "voter," and that was President Obama: "I don't need a poll to know that most of Republican voters are opposed to this bill."
I tried the same thing with the Congressional Record for the final round of debate in March, just before the House passed the health reform bill, and found the same result: House members referred to "the American people" 59 times, but "voter" or "voters" only twice.
Think about that the next time you see a likely voter survey used as evidence of what "the American people" want.