More Loaded Dice
The obvious way to rig the results of such a poll is to select questions where the answer you consider mistaken is more popular with one side than the other. Most people who believe Obama was not born in the U.S. are on the right. Most people who believe the Chamber of Commerce used foreign money to influence the most recent election are on the left. By my count, for at least seven of the eleven questions the answer that the study's authors considered misinformed was a view more popular with the right than the left. One—the Chamber of Commerce question—went the other way.
A second problem with the study was that, for at least three of its eleven questions (whether the stimulus had saved several million jobs, whether the economy was recovering, whether Obamacare increased the deficit), the right answer was unclear. In none of the three did the study's authors provide adequate support for their view, which in each case coincided with the claims of the Administration.
I first heard of the study via a critical piece on Reason's blog. A while later, I came across another reference to it, a Usenet post by someone who obviously approved of its conclusions. I responded and pointed out the problems.
With regard to the three questions where the study's answer was less obviously correct than its authors thought, I can easily imagine a reasonable person disagreeing with me, arguing that the study at worst mildly exaggerated how clear the right answer was. I do not, however, see how any reasonable person, once it was pointed out, could deny that the selection of questions was biased.
I am now waiting to see whether anyone reading that particular Usenet thread is willing to admit that the evidence for a conclusion he likes is bogus.