## Tuesday, December 13, 2011

### Thinking Fast and Slow

by Daniel Kahneman, a psychologist who won (and probably deserved to win) a Nobel prize in economics, is a book well worth reading; I just finished it. Its subject is how the human mind works and, in particular, why we make the predictable mistakes that we do make.

The central insight is that we act as if we had two different mechanisms for making sense of the world around us and deciding what to do. System 1—intuition broadly defined—works automatically and very quickly to recognize a voice over the phone, tell whether a stranger's face is expressing anger, generate conclusions on a wide range of subjects. System 2—conscious thought—takes the conclusions generated by System 1 and either accepts them or rejects them in favor of its own conclusions, generated much more slowly and with greater effort. Attention is a limited resource, so using System 2 to do all the work is not a practical option.

System 1 achieves its speed by applying simple decision rules. Its view of probability, for instance, functions largely by classifying gambles into three categories—impossible, possible, or certain. One result is that an increase in probability within the middle category, say from 50% to 60%, appears less significant than an increase of the same size from 0% to 10% or from 90% to 100%.

That simple fact provides a solution to a very old problem in economics, the lottery-insurance puzzle. If someone is risk averse, he buys insurance, reducing, at some cost, the uncertainty of his future. If someone is risk preferring, he buys lottery tickets, increasing, at some cost, the uncertainty of his future. Why do some people do both?

Kahneman's answer is that insuring against your house burning down converts a very unattractive outcome (your house burns down and you are much worse off as a result) from probability 1% to probability 0%, a small gain in probability but a large gain in category (from possible to impossible). Buying a lottery ticket converts a very attractive outcome (you get a million dollars) from probability 0% to probability .001%, a small gain in probability but a large gain in category (from impossible to possible). Both changes are more attractive, as viewed by System 1, than they would be as viewed by a rational gambler.
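Kahneman and Tversky later formalized this category effect as a probability-weighting function. The sketch below uses the one-parameter form from their 1992 cumulative prospect theory paper, with the gain-side parameter γ ≈ 0.61 they estimated; it is an illustration of the shape of the effect, not a calculation from this post.

```python
# Tversky-Kahneman (1992) probability-weighting function: small
# probabilities are over-weighted, mid-range changes are muted.
def w(p, gamma=0.61):
    """Decision weight assigned to a stated probability p (gain side)."""
    if p in (0.0, 1.0):  # the endpoints -- impossible and certain -- are weighted exactly
        return p
    num = p ** gamma
    return num / (num + (1 - p) ** gamma) ** (1 / gamma)

# The same ten-point move "feels" very different in each category:
for lo, hi in [(0.0, 0.10), (0.50, 0.60), (0.90, 1.00)]:
    print(f"{lo:.0%} -> {hi:.0%}: decision-weight gain {w(hi) - w(lo):+.3f}")
```

With these numbers the jumps from possible to impossible (insurance) and from impossible to possible (the lottery ticket) both carry far more decision weight than a same-sized change in the middle of the range.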

If you have read Nudge, many of the errors Kahneman describes will already be familiar to you. The difference is that Thaler and Sunstein take those errors as observed facts; Kahneman explains, for the most part plausibly, why we make them, and supports his explanations with evidence. And while Kahneman has a few comments on political implications of his results, his main focus is on telling the reader what mistakes he is likely to make and why, in the hope of helping him to make fewer of them.

One of the attractions of Kahneman's book is that although some of his evidence consists of descriptions of the results of experiments, his own or others, quite a lot of it consists of putting a question to the reader and then pointing out that the answer the reader probably offered, the one most people offer, is not only wrong but provably, in some sense obviously, wrong.

Consider the following example:

> Linda is thirty-one years old, single, outspoken, and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in anti-nuclear demonstrations.

Which is she more likely to be:

1. A bank teller
2. A bank teller and active in the feminist movement
Most of the people to whom the question was put judged the second alternative as more likely than the first—despite that being logically impossible. System 1 has a weak grasp of probability and so, in this case as in many others, substitutes for the question it cannot answer an easier question it can, in this case “which sounds more like a description of Linda.”
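The arithmetic behind "logically impossible" is just the conjunction rule, P(A and B) ≤ P(A): every feminist bank teller is also a bank teller. A toy simulation (the trait frequencies below are made up purely for illustration) makes the point:

```python
import random

random.seed(0)

# Made-up population: each person independently is a bank teller
# with probability 0.05 and a feminist with probability 0.30.
N = 100_000
teller = teller_and_feminist = 0
for _ in range(N):
    is_teller = random.random() < 0.05
    is_feminist = random.random() < 0.30
    teller += is_teller
    teller_and_feminist += is_teller and is_feminist

print(f"P(teller)            ~ {teller / N:.3f}")
print(f"P(teller & feminist) ~ {teller_and_feminist / N:.3f}")

# The conjunction can never be the more likely event, whatever
# frequencies you plug in: every (teller & feminist) is a teller.
assert teller_and_feminist <= teller
```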

The book is more than four hundred pages long; if I tried to summarize all of it this would be a very long post. Read it.

At 4:55 PM, December 13, 2011,  Phil Birnbaum said...

But I believe Kahneman's answer to a similar (but different) question is wrong. It's #2, here:

At 4:56 PM, December 13, 2011,  Phil Birnbaum said...

At 6:03 PM, December 13, 2011,  David Friedman said...

Phil:

I agree that the article's answer is wrong. But that example is not in the book, and I wonder if the error is by the author of the article, not Kahneman.

At 7:04 PM, December 13, 2011,  Phil Birnbaum said...

David,

Could be. However, the article does say, "When Kahneman and Tversky performed this experiment, they found that a large percentage of participants overestimated the likelihood that Jack was an engineer, even though mathematically, there was only a 30-in-100 chance of that being true."

But, agreed, the article may have clumsily adapted another Kahneman/Tversky experiment without saying so, one in which the given answer is correct.

At 7:29 PM, December 13, 2011,  Phil Birnbaum said...

A little research (here (.PDF) and here) leads me to the conclusion that question #2 does indeed come from the research of Kahneman and Tversky, but the incorrect answer comes from the Vanity Fair writer.

From what I gather, Kahneman and Tversky showed that people don't adjust their answer to #2 enough when the question is changed to vary the proportion of lawyers in the population. However, it doesn't look like they ever suggested that the Vanity Fair answer is correct.

At 7:41 PM, December 13, 2011,  David Friedman said...

Phil: Thanks. That makes sense.

I wonder if anyone has managed to get in touch with the author, and if so whether he would be bothered to discover that what he wrote was not true. I tried to find his email, but without success.

At 9:48 PM, December 13, 2011,  Jehu said...

Phil,
While I have no particular confidence in the average person's grasp of conditional probability, one has to recognize this:

The traits cited have a vanishingly low frequency among the population of lawyers, but a very high frequency among engineers---my father-in-law, for instance, matches this profile ALMOST to a T. So even if there were, say, 90 lawyers and 10 engineers in the sample, the difference in trait frequencies between the lawyer and engineer populations would probably STILL make "engineer" the most probable answer.
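This point can be made precise with Bayes' rule. The numbers below are hypothetical (they are not from Kahneman and Tversky's study): even a 10% base rate of engineers is overwhelmed if the described traits are much more common among engineers than lawyers.

```python
def posterior_engineer(prior_e, p_traits_given_e, p_traits_given_l):
    """P(engineer | traits) by Bayes' rule over the two professions."""
    num = prior_e * p_traits_given_e
    return num / (num + (1 - prior_e) * p_traits_given_l)

# Hypothetical: 10 engineers per 90 lawyers, but the profile fits
# 90% of engineers and only 5% of lawyers.
p = posterior_engineer(prior_e=0.10, p_traits_given_e=0.90, p_traits_given_l=0.05)
print(f"P(engineer | traits) = {p:.2f}")
```

So a minority profession can still be the rational best guess; the error Kahneman and Tversky documented is not in reaching that conclusion but in failing to adjust it when the base rate changes.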

At 11:01 AM, December 14, 2011,  Anonymous said...

I also believe the answer to question #4 is wrong. There are more factors to consider than the answer implies. The first thing that came to my mind when I answered 'no' to the question, was that I wouldn't like to pay for a new ticket right away without at least trying to persuade the staff to admit me anyway, and that if their policy was so strict as to deny me, I wouldn't like to support the group with further payment.

At 11:38 AM, December 14, 2011,  Vlad Tarko said...

Herbert Gintis has a different take on the feminist bank teller dilemma in his review of the book:

"there is another interpretation according to which the subjects are correct in their judgments. ... the probability that a randomly chosen bank teller is Linda is probably much lower than the probability that a randomly chosen feminist bank teller is Linda. Another way of expressing this point is that the probability that a randomly chosen member of the set 'is a feminist bank teller' may be Linda is greater than the probability that a randomly chosen member of the set 'is a bank teller,' is Linda."

At 11:51 AM, December 14, 2011,  steve rose said...

Am I missing something?
If A = "Linda is a bank teller" and B = "Linda is a bank teller and active in the feminist movement" then the probability that B is more likely than A is zero. But this is not the problem the test subjects were given. They were also given a story about Linda, call it C, and had to estimate the conditional probability of B given C. Now the probability that B is more likely than A, given C, is not necessarily zero. For example, my best friend Bob, whom I know to be hot for feminists, tells me story C. I correctly conclude B is more likely than A since otherwise Bob would not be telling me the story.
I predict that if the question were given without C the results would be different. It appears to me it is Kahneman who goofed, not the test subjects. Btw, what did he get the Nobel for?

At 11:55 AM, December 14, 2011,  Anonymous said...

Sometimes people (particularly those educated in public schools) go into "quiz mode" when given any type of quiz and try to eliminate all but one of the multiple-choice answers (this is advice given to takers of standardized tests). This creates a bias toward the idea that two of the answers cannot both be correct, since tests in public school usually include a third option like "both A and B" that you have to bubble in. When I read the feminist banker question, I instantly remembered the truth tables I learned in college and got the answer right, but my first instinct was driven by test-taking habits and I would have gotten the answer wrong had it not been for the logic class.

At 12:13 PM, December 14, 2011,  Unknown said...

@Steve

You're still wrong. Whenever you add more complexity to a statement the probability will always be less than or equal to the simpler statement.

At 12:43 PM, December 14, 2011,  Anonymous said...

I think that one reason people choose "Linda is a bank teller and active in the feminist movement" over "Linda is a bank teller", is they implicitly interpret "Linda is a bank teller" in the context of seeing the choices as "Linda is a bank teller and is NOT active in the feminist movement."

At 7:23 PM, December 14, 2011,  David Friedman said...

I described only one part of the experiment involving Linda. Other parts involved giving the different alternatives to different people, or giving a list of alternatives of which those were only two.

If one person is asked how likely it is that Linda is a bank teller and a different person asked how likely it is that she is a bank teller and active in the feminist movement, there is no reason why the first will interpret "bank teller" as "bank teller who is not active in the feminist movement."

Yet, on average, subjects report a higher probability for the second alternative than the first.

More generally, I don't think there is a lot of point to critiquing the particular example without first reading the book.

At 9:40 PM, December 14, 2011,  GregS said...

I also recently finished the book. It's great. I have to second the recommendation that curious individuals read it. I've been reading a lot from the "your brain is systematically lying to you" genre. "Thinking Fast and Slow" is the most comprehensive I've read so far.

I'll have to read "Nudge," since it keeps being referenced in other books I've been reading. Kahneman may have been sparing on the political implications of his research, but he offered enough recommendations to raise my eyebrows. I distrust anything from the "humans are irrational; therefore they need to be controlled by an irrationality-correcting government" school of thought.

I like Arnold Kling's take on the book:
http://american.com/archive/2011/december/the-political-implications-of-ignoring-our-own-ignorance

At 10:28 PM, December 14, 2011,  David Friedman said...

Greg mentions the political implications of Kahneman's work.

In my view, they are probably on net favorable to the libertarian position. They imply that individuals making decisions on the market are less rational than economic theory assumes. But they also apply to individuals making political choices--deciding who to vote for or what policies to favor.

The arguments suggest that people are more nearly rational when they use the slow mind than the fast and, since the slow mind's attention is a scarce resource, they are more likely to use it the more important getting a decision right is. My market decisions are almost always more important to me than my political decisions, since the former directly affect outcomes for me and the latter do not. That suggests that people will be less rational in their political decisions than their market decisions.

Further, since Kahneman's arguments imply that different people will be irrational in the same way, their errors won't cancel out in the political system, as random errors might.

At 2:17 AM, December 15, 2011,  martin said...

If someone is risk averse, he buys insurance, reducing, at some cost, the uncertainty of his future. If someone is risk preferring, he buys lottery tickets, increasing, at some cost, the uncertainty of his future. Why do some people do both?

Because there's a big difference between gambling with your house, your car or your health, and gambling with the price of a lottery ticket?

At 9:00 AM, December 15, 2011,  Anonymous said...

David Friedman Writes:
>More generally, I don't think there is a lot of point to critiquing the particular example without first reading the book.<

400 pages, and THEN I am allowed to critique? Damn, that's harsh. I'm used to criticizing things immediately after reading the wikipedia article. I skim the wikipedia article if it's too long.

At 9:07 AM, December 15, 2011,  Allan Walstad said...

"They imply that individuals making decisions on the market are less rational than economic theory assumes."

Mainstream neoclassical theory, perhaps. Optimality analysis short-circuits the whole process by which people pursue their goals through choices that may not be informed by perfect knowledge or perfect reasoning. Austrian theory is ABOUT that process. The specific models (scenarios) of course require assumptions, whether explicit or implicit, regarding knowledge and its use by individual market participants, but these are not part of the theory itself -- at least, not as I understand it.

At 5:39 PM, December 16, 2011,  Hume said...

"That suggests that people will be less rational in their political decisions than their market decisions."

Playing devil's advocate, this could also ground a theory of political representation based on the model of a trustee relationship. Political representatives make decisions separate from the wishes of their constituents, and do so in a deliberative assembly after hearing arguments from other representatives (who are also independent of the "will of the people"). So we have a division of cognitive labor where political officers are given the ability to make slow, informed decisions.

At 2:32 PM, December 19, 2011,  Anonymous said...

Dave, you should look at this guy's academic work on behavioural econ. I did a module in behavioural econ and it REALLY broadened my understanding of econ. Behavioural econ is in a VERY early stage right now but the models they are coming up with are really interesting.

At 12:39 PM, December 24, 2011,  Ilíon said...

"Most of the people to whom the question was put judged the second alternative as more likely than the first—despite that being logically impossible. System 1 has a weak grasp of probability and so, in this case as in many others, substitutes for the question it cannot answer an easier question it can, in this case “which sounds more like a description of Linda.”"

Then again, one can view it as an instance of an ill-formed question being recast into what the hearer (rightly or wrongly) understands the questioner to really be asking.

At 6:20 AM, April 30, 2013,  Rahul said...

The book is written to keep the mind captivated while explaining how the mind works. Each theory and each piece of information is empirically supported. It equips you with the technical terms for the everyday psychological stuff you knew happened but did not know how or why.