Several years ago I had an exchange on this blog with Professor Robert Altemeyer over his claim that authoritarianism was more common on the political right than on the political left. I argued that the survey on which his claim was based was, probably not intentionally, loaded: questions about respect for authority consistently referred to authorities more popular on the right than on the left, while questions about bravely defying authority referred to forms of defiance more popular on the left than on the right. People on the left would therefore appear, by their scores on his questions, less authoritarian than they were, and people on the right more authoritarian. I recently encountered the same problem in a different context, this time in an article describing a study that purported to show that people on the right are more often misinformed about public issues than people on the left.
The obvious way to rig the results of such a poll is to select questions where the answer you consider mistaken is more popular with one side than the other. Most people who believe Obama was not born in the U.S. are on the right. Most people who believe the Chamber of Commerce used foreign money to influence the most recent election are on the left. By my count, for at least seven of the eleven questions the answer that the study's authors considered misinformed was a view more popular with the right than the left. One—the Chamber of Commerce question—went the other way.
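To see the mechanism at work, here is a minimal simulation in Python. Everything in it is invented for illustration except the seven-to-one split in question slant, which matches my count above; it sketches how lopsided selection skews scores, and is not a model of the actual study.

```python
import random

random.seed(0)

# Hypothetical pickup rates, invented for illustration: a respondent
# endorses a misconception that flatters his own side 30% of the time,
# one that flatters the other side 5% of the time, and a politically
# neutral one 15% of the time.
RATES = {"own": 0.30, "other": 0.05, "neutral": 0.15}

# Question slant mirrors the count above: the "misinformed" answer
# appeals to the right on seven questions, to the left on one, and to
# neither side on the remaining three.
QUESTIONS = ["right"] * 7 + ["left"] * 1 + ["neutral"] * 3

def score(side):
    """Number of 'misinformed' answers one simulated respondent gives."""
    total = 0
    for slant in QUESTIONS:
        if slant == "neutral":
            kind = "neutral"
        elif slant == side:
            kind = "own"
        else:
            kind = "other"
        total += random.random() < RATES[kind]
    return total

N = 100_000
for side in ("right", "left"):
    avg = sum(score(side) for _ in range(N)) / N
    print(f"{side}: {avg:.2f}")
# Both sides are equally gullible by construction, endorsing their own
# side's misconceptions at the same 30% rate, yet the right's average
# "misinformation" score comes out more than double the left's
# (expected 2.60 vs. 1.10).
```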
A second problem with the study was that, for at least three of its eleven questions (whether the stimulus had saved several million jobs, whether the economy was recovering, whether Obamacare increased the deficit), the right answer was unclear. In none of the three did the study's authors provide adequate support for their view—which, in each case, coincided with the claims of the Administration.
I first heard of the study via a critical piece on Reason's blog. A while later, I came across another reference to it, a Usenet post by someone who obviously approved of its conclusions. I responded and pointed out the problems.
With regard to the three questions where the study's answer was less obviously correct than its authors thought, I can easily imagine a reasonable person disagreeing with me, arguing that the study at worst mildly exaggerated how clear the right answer was. I do not, however, see how any reasonable person could fail to see the way in which the selection of questions was biased, once it was pointed out.
I am now waiting to see if there is anyone reading that particular Usenet thread who is willing to admit that the evidence for a conclusion he likes is bogus.
11 comments:
Do you have evidence that the questions were chosen specifically in order to create such bias?
I would suggest that it's more likely that the questions were chosen because they were the most significant and common misunderstandings raised, and that the fact that they are more common among rightists is not surprising in light of the literature suggesting that such misconceptions are more common among rightists.
Indeed, trying to select questions with the intention of "balancing" left-leaning and right-leaning errors rather than selecting questions impartially would really be loading the dice.
You do a good job of outlining why the choice of questions means that the results don't carry much weight.
There are also methodological issues with the study as a whole that further undermine its claims.
The unsuitable choice of a measure of partisanship, its choice of study design, and the self-reporting of media consumption all mean that the results should be taken with a very large grain of salt.
See here for the full analysis: http://noompa.wordpress.com/2010/12/20/everybody-calm-down/
I don't think you can look at the selection of questions in a vacuum and see how biased it is. You seem to be ignoring priors.
Suppose, for instance, that people on the right really were twice as likely to hold a false belief about a political issue. Then, if the authors selected issues by balancing ideology, they'd be rigging the test the other way.
I don't immediately see a way to solve that problem. You could go through talk shows and political debates and so on for some period and extract issues, then randomly sample from them, but you're missing the notion of importance. You could take that corpus and have a few experts from either side of the spectrum pick out the most important issues, then use those, plus the top few in terms of discussion over the year.
But even at the end of a fair process, you couldn't judge the selection just by asking whether the wrong view on each question is more popular on one side or the other. That's inextricably linked with the result you're trying to study.
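To make that concrete, here's a toy sketch in Python, with every number invented: suppose the pool of salient misconceptions genuinely skews right, so rightists really are more misinformed in aggregate. An impartial sample from that pool preserves the gap; forcing an ideological balance erases it.

```python
import random

random.seed(1)

# Invented ground truth: the pool of salient misconceptions skews right,
# so rightists really are more misinformed in aggregate.
POOL = ["right"] * 20 + ["left"] * 10  # slant of each candidate issue

def expected_score(side, questions):
    """Expected 'misinformed' count, assuming (again, invented rates) a
    30% pickup of own-side misconceptions and 5% of the other side's."""
    return sum(0.30 if slant == side else 0.05 for slant in questions)

# Impartial selection: sample ten issues from the pool as it stands.
impartial = random.sample(POOL, 10)
# "Balanced" selection: force five right-slanted and five left-slanted.
balanced = ["right"] * 5 + ["left"] * 5

for name, questions in (("impartial", impartial), ("balanced", balanced)):
    right = expected_score("right", questions)
    left = expected_score("left", questions)
    print(f"{name}: right {right:.2f}, left {left:.2f}")
# The impartial sample preserves the real gap; the "balanced" one makes
# the two sides look identical (1.75 vs. 1.75), hiding a genuine
# difference, i.e. rigging the test the other way.
```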
(Note: I haven't read the study, so I have no idea what their process was or whether it was reasonable.)
What amazes me is how impoverished an understanding of these contested issues the study's authors have. They believe they've addressed your complaint!
>I am now waiting to see if there is anyone reading that particular Usenet thread who is willing to admit that the evidence for a conclusion he likes is bogus.
If you want to understand the real world, rather than support your team, you need to bend over backward, and especially to question anything that supports your pre-existing beliefs. Most people, though, are much more interested in rooting for their team than in any real knowledge.
This reminds me to a degree of a survey I got one time from my state's Democratic party. I don't remember the questions, but I do remember that in order to have your survey response counted you had to remit a "processing fee" -- $20.00, I think. Thus, even if your views disagreed with the Democratic party's view, you still ended up giving them a monetary contribution.
David,
What are you, crazy? Everyone knows that liberals are more educated, open-minded, cultured, and overall more intelligent. The liberal position is, almost by its very definition, the scientific, rational, logical position that any thinking person must come to. This is why the vast majority of university professors are left-wing: 1) they're the smartest and most rational people in the country, and 2) unlike those in the private sector, they're at institutions dedicated to the noble cause of truth-seeking rather than base money-grubbing.
If a person is not liberal it's either A) because he's brainwashed by backwards reactionary institutions like church, the military or corporate propaganda (e.g. Fox News), or B) he's actively involved in screwing over the poor. There's simply no other explanation for why so many people oppose free medicine but favor tax cuts on the wealthiest 1%. They're delusional.
The only way to ameliorate this problem and save democracy is by massively increasing the support going to the few sane voices that aren't trying to screw over the people. This means massive funding increases for public school teachers, social workers, universities, policy-minded think tanks, government workers at all levels, minority outreach, etc. The only way to counteract all the corporate propaganda is with education that promotes scientific and rational public policy.
I did not read the study, but I read a report saying that one question treated "the health care bill did not increase the deficit" as the correct answer. I think the fact they are relying on is that the CBO projected the bill to be deficit-neutral, but the projections are controversial.
The CBO did indeed project that the bill would slightly reduce the deficit. It also issued a disclaimer almost immediately, which effectively said: "Those are the numbers Congress told us to find. To do so, Congress told us to make assumptions that we know to be false. Here's how the numbers come out without those assumptions. The bill will increase the deficit, exactly as everybody with any sense expected."
for at least three of its eleven questions (whether the stimulus had saved several million jobs, whether the economy was recovering, whether Obamacare increased the deficit), the right answer was unclear. In none of the three did the study's authors provide adequate support for their view—which, in each case, coincided with the claims of the Administration.
I was under the impression that the questions in the survey were not "whether the stimulus had saved several million jobs," etc. but rather "whether most economists believe the stimulus saved several million jobs," and likewise for the other two questions.
"Most economists" don't have to be right, and you don't have to agree with them, in order to correctly report their opinions. All you need is a definition of "economist", and in principle you can answer these questions with a survey.
That said, I don't claim to know what "most economists" believe about these three issues.
Hudebnik correctly points out that the question was about a majority of economists—actually, of economists who had studied the question. The evidence the survey provided to support its conclusion consisted of two parts:
1. The CBO report of what their model said the effect had been—certainly not a report of the opinion of a majority of economists who had studied the question.
2. The opinion of a WSJ panel of (I think) 35-50 economists, a considerable majority of whom thought the effect of the stimulus had been positive. But "positive" doesn't equal several million jobs, and no evidence was offered that the panel represented a random sample of economists who had studied the question.
So while it's possible that their answer is correct, they provide no adequate reason to think so.