If a professional organization of survey researchers reprimands a non-member for failure to disclose some obscure but essential details of its polling methods, but waits over a year and a half before doing so, does anyone notice... or care? In this case, the answer may surprise.
Last week, the American Association for Public Opinion Research "raised objections" over the failure of Strategic Vision LLC to disclose "essential facts" about the surveys it conducted in New Hampshire and Wisconsin before their 2008 presidential primary elections. AAPOR had conducted a lengthy investigation into the snafus that beset polls in New Hampshire and three other states. While AAPOR came to no conclusions about the accuracy of Strategic Vision's polling, it chose to speak out about the company because it was the only one of 21 organizations that, after more than a year of prodding, had failed to disclose some pertinent facts about its methods.
(Interests disclosed: I'm an active AAPOR member and served on AAPOR's Executive Council from 2006 to 2008.)
AAPOR has long maintained a Code of Professional Ethics and Practices spelling out the "obligations" that "good professional practice" imposes "upon all public opinion researchers" for what they should disclose about their methods. Public censures are rare: AAPOR has issued only two in the last 12 years, and those caused barely a ripple of attention beyond the survey world.
Given that history, I was not expecting much excitement after AAPOR's announcement last week, but then David Johnson, the CEO of Strategic Vision, started to speak out. He told James Verrinder of Research magazine on Thursday that he "completely disagreed" with the AAPOR charge and that he had supplied AAPOR with all it had requested. He alleged that a competitor was behind the complaint, charged that AAPOR had acted "maliciously" and had issued its ruling "to coincide with the results of a poll we had out yesterday." He also promised to take legal action and file charges "against AAPOR specifically and individual members of AAPOR personally."
The AAPOR announcement also got the attention of blogger Nate Silver, who did some number-crunching and found a possibility of something more sinister. While hedging a bit about "circumstantial evidence," he suggested that the pattern of trailing digits on results from previously released Strategic Vision polls "suggest, perhaps strongly, the possibility of fraud, although they certainly do not prove it and further investigation will be required" (emphasis his).
So perhaps the fireworks are just beginning.
But let's step back a bit, because the most important story is still about the lack of transparency and its consequences.
For all of Johnson's bluster, some basic facts do not appear to be in dispute and add some important perspective:
• On March 4, 2008, AAPOR sent a six-page request to 21 organizations that had released public polls in four 2008 primary states that had seen especially big polling errors or wide variation in results. They included every pollster that had produced public polling in the last two weeks before each of the four primaries (eight of these were AAPOR members).
• Strategic Vision's public reports already included much of the requested information, but they did not include other items on AAPOR's disclosure checklist: the name of the survey sponsor, a description of the sample design (how respondents were selected) and "sampling frame" (e.g., whether telephone numbers were selected from a list or by generating random digits), and the response rate or statistics necessary to calculate the response rate.
• While the two sides still dispute whether anyone at Strategic Vision ever saw the AAPOR requests during 2008 (AAPOR says it sent two letters by Federal Express), AAPOR initiated its formal complaint only after releasing its report this past April.
• Both sides appear to agree that Johnson did provide some additional details in June ("information about the survey sponsor, the organization that conducted the survey and the sampling frame," according to an e-mail Johnson shared with the Atlanta Journal-Constitution). However, AAPOR claims Johnson continued to withhold information about response rates and weighting procedures.
What might put all of this in some perspective is an example of the disclosure that AAPOR did find sufficient from another pollster regarding response rates and weighting. In its response, the firm Public Policy Polling said only that its survey of Democrats in South Carolina called "15,000 likely voters" and "808 responded in full." The survey was also "weighted for gender using random deletion and statistically weighted for race."
That was it. And that's what this dispute boils down to, according to AAPOR. Rather than share a sentence or two of explanation about weighting procedures and a set of response rates (or call-disposition statistics) -- something done in one form or another by the other 20 organizations involved -- Johnson has chosen to stonewall for more than a year and now threatens legal action.
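To underline how little was being asked, the arithmetic behind the PPP disclosure is trivial. Here is a minimal sketch (the function name is my own illustrative choice, not an AAPOR formula) of the crude completion rate those figures imply; AAPOR's standard response rates (RR1 through RR6) require fuller call-disposition statistics than this, but even this much would have satisfied the request.

```python
def completion_rate(attempted, completed):
    """Crude completion rate: completed interviews over numbers dialed.
    AAPOR's standard response rates (RR1-RR6) require fuller call
    dispositions; this is only the rough figure the numbers imply."""
    return completed / attempted

# The figures PPP disclosed for its South Carolina survey:
rate = completion_rate(15_000, 808)
print(f"{rate:.1%}")  # → 5.4%
```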
Which brings us back to Nate Silver's somewhat hedged accusation. As of this writing many questions remain about the meaning of the patterns Silver reports, and I would judge his evidence as far too tentative to support an allegation as sinister as outright fraud. Even under ideal circumstances this sort of analysis is almost always suggestive, rarely if ever conclusive.
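For readers curious about the mechanics, trailing-digit analysis of the sort Silver described can be sketched as a chi-square goodness-of-fit test against a uniform distribution. The code below is an illustrative reconstruction on made-up data, not Silver's actual method; real poll results need not have perfectly uniform trailing digits, which is one reason such tests remain suggestive rather than conclusive.

```python
from collections import Counter

def trailing_digit_chisq(results):
    """Chi-square statistic (9 degrees of freedom) testing whether the
    trailing digits of a set of poll numbers are uniform over 0-9.
    The 5% critical value for 9 df is about 16.9."""
    digits = [int(str(int(r))[-1]) for r in results]
    expected = len(digits) / 10
    counts = Counter(digits)
    return sum((counts.get(d, 0) - expected) ** 2 / expected
               for d in range(10))

# Made-up data: one set whose trailing digits cluster, one spread evenly.
clustered = [47, 53, 42, 57, 33] * 10   # trailing digits mostly 3 and 7
even = list(range(30, 80))              # each trailing digit appears 5 times
print(trailing_digit_chisq(clustered))  # 130.0 -- far above 16.9
print(trailing_digit_chisq(even))       # 0.0 -- consistent with uniform
```

A large statistic says only that the digits look non-uniform; it cannot by itself distinguish fabrication from quirks of rounding or weighting, which is why further investigation would be required.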
But it raises an important question: Setting aside Strategic Vision for a moment, how do we ever know that a pollster isn't just "making up the numbers"? The short answer is, we can't. In an old media world, we received polling data from a few trusted media brands. If CBS News and the New York Times said they did a poll, then we trusted that they did a poll. But in the new media world, we are confronted with polls from sources we have barely heard of, including some organizations that appear to exist solely for the purpose of disseminating polls.
That is why the sort of transparency and scientific integrity embodied in AAPOR's ethics code is more important than ever. "Transparency," argues technologist David Weinberger, "is the new objectivity." When it comes to surveys, it may be the path to trustworthiness as well.