Thinking, Fast and Slow, by Daniel Kahneman, a psychologist who won (and probably deserved to win) a Nobel prize in economics, is a
book well worth reading; I just finished it. Its subject is how the human mind works and, in particular, why we make the predictable mistakes that we do make.
The central insight is that we act as if we had two different mechanisms for making sense of the world around us and deciding what to do. System 1—intuition broadly defined—works automatically and very quickly to recognize a voice over the phone, tell whether a stranger's face is expressing anger, generate conclusions on a wide range of subjects. System 2—conscious thought—takes the conclusions generated by System 1 and either accepts them or rejects them in favor of its own conclusions, generated much more slowly and with greater effort. Attention is a limited resource, so using System 2 to do all the work is not a practical option.
System 1 achieves its speed by applying simple decision rules. Its view of probability, for instance, functions largely by classifying gambles into three categories—impossible, possible, or certain. One result is that an increase in probability within the middle category, say from 50% to 60%, appears less significant than an increase of the same size from 0% to 10% or from 90% to 100%.
That simple fact provides a solution to a very old problem in economics, the lottery-insurance puzzle. If someone is risk averse, he buys insurance, reducing, at some cost, the uncertainty of his future. If someone is risk preferring, he buys lottery tickets, increasing, at some cost, the uncertainty of his future. Why do some people do both?
Kahneman's answer is that insuring against your house burning down converts a very unattractive outcome (your house burns down and you are much worse off as a result) from probability 1% to probability 0%, a small change in probability but a large change in category (from possible to impossible). Buying a lottery ticket converts a very attractive outcome (you get a million dollars) from probability 0% to probability 0.001%, again a small change in probability but a large change in category (from impossible to possible). Both changes are more attractive, as viewed by System 1, than they would be as viewed by a rational gambler.
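Kahneman and Tversky made this category effect quantitative in prospect theory's probability-weighting function, which overweights small probabilities and underweights changes in the middle of the scale. Here is a minimal sketch of that idea in Python; it uses the standard one-parameter form of the weighting function, with an exponent of 0.6 and a function name chosen by me purely for illustration, none of it taken from the book:

```python
# Sketch of a prospect-theory-style probability weighting function.
# The functional form is the standard one-parameter version; the
# exponent 0.6 is an illustrative choice, not a fitted value.

def decision_weight(p: float, gamma: float = 0.6) -> float:
    """Map a true probability p to the weight System 1 is modeled as using."""
    return p**gamma / (p**gamma + (1 - p) ** gamma) ** (1 / gamma)

# Insurance: the chance your house burns down falls from 1% to 0%.
# The felt change (~0.06) is several times the raw 0.01 change.
print(decision_weight(0.01) - decision_weight(0.0))

# Lottery: the chance of the million dollars rises from 0% to 0.001%.
# The felt change (~0.001) is roughly a hundred times the raw probability.
print(decision_weight(0.00001) - decision_weight(0.0))

# Middle of the scale: 50% -> 60% feels like less than the raw ten points.
print(decision_weight(0.6) - decision_weight(0.5))
```

On a weighting shaped like this, the moves at the ends of the scale that insurance and lottery tickets buy feel much bigger than their raw size, while the 50% to 60% improvement in the middle feels smaller, which is exactly the asymmetry described above.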
If you have read Nudge, many of the errors Kahneman describes will already be familiar to you. The difference is that Thaler and Sunstein take those errors as observed facts; Kahneman explains, for the most part plausibly, why we make them, and supports his explanations with evidence. And while Kahneman has a few comments on the political implications of his results, his main focus is on telling the reader what mistakes he is likely to make and why, in the hope of helping him to make fewer of them.
One of the attractions of Kahneman's book is that although some of his evidence consists of descriptions of the results of experiments, his own or others', quite a lot of it consists of putting a question to the reader and then pointing out that the answer the reader probably offered, the one most people offer, is not only wrong but provably, in some sense obviously, wrong.
Consider the following example:
Linda is thirty-one years old, single, outspoken, and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in anti-nuclear demonstrations.
Which is she more likely to be:
A bank teller
A bank teller and active in the feminist movement
Most of the people to whom the question was put judged the second alternative as more likely than the first—despite that being logically impossible. System 1 has a weak grasp of probability and so, here as in many other cases, substitutes for the question it cannot answer an easier question it can: “which sounds more like a description of Linda?”
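For the record, the impossibility is just the conjunction rule of probability, written here in notation of my own choosing rather than the book's:

```latex
P(\text{teller} \wedge \text{feminist})
  = P(\text{teller}) \, P(\text{feminist} \mid \text{teller})
  \le P(\text{teller})
```

Whatever probabilities one assigns to Linda, the conditional factor is at most 1, so adding the feminist condition can only leave the probability where it is or lower it.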
The book is more than four hundred pages long; if I tried to summarize all of it, this would be a very long post. Read it.