Saturday, May 5, 2012

Daniel Kahneman's Thinking, Fast and Slow

"...[System 1 is] that stranger in you, which may be in control of much of what you do, although you rarely have a glimpse of it."
—Daniel Kahneman 

Daniel Kahneman, along with his longtime research partner Amos Tversky, the economist Richard Thaler, and other collaborators, revolutionized our understanding of how we make choices. Kahneman sought to explain observed departures from the pure rationality assumed by economic theory, and quickly discovered that such departures are the norm rather than the exception.

As an example, suppose you are offered a gamble on a single coin toss. If the coin comes up heads, you lose $100. If it comes up tails, you win $125. Would you accept the gamble? Most people say no: their aversion to a loss of $100 outweighs their attraction to the prospect of winning $125. In the experiments Kahneman describes, people typically demand a potential gain of roughly one and a half to two and a half times the potential loss before they will accept a gamble like this.
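
To see why economists would call turning down this bet a departure from pure rationality, here's a quick back-of-the-envelope check (my own sketch in Python, not from the book): the gamble's expected value is positive, so a purely risk-neutral decision-maker would take it.

    # Expected value of the coin-toss gamble described above
    p_heads, p_tails = 0.5, 0.5
    loss, win = -100, 125
    expected_value = p_heads * loss + p_tails * win
    print(expected_value)  # 12.5 -- positive, so a risk-neutral agent would accept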

So, you might say, in general people don't like to take risks. Consider, though, the next pair of options:
Option A: You lose $50 for sure.
Option B: On a single coin toss, if it comes up heads you lose $100; if it comes up tails, you lose nothing.
Which option would you choose? For many people, it's Option B, even though the potential downside is twice as great as that of Option A. When faced with two possibilities, both bad, people become much more willing to take risks. Kahneman and Tversky developed a theory, called "prospect theory," to explain these apparently contradictory results, an innovation for which Kahneman was awarded the 2002 Nobel Prize in Economics (Tversky died in 1996, and the Nobel is not awarded posthumously).
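
The twist is that, in expected-value terms, the two options are identical; that's exactly what makes the widespread preference for Option B so interesting. A quick sketch of my own (not Kahneman's) to show the arithmetic:

    # Expected values of the two loss options
    option_a = -50                       # the sure loss
    option_b = 0.5 * (-100) + 0.5 * 0    # coin toss: lose $100 or lose nothing
    print(option_a, option_b)  # -50 -50.0 -- identical, yet many people prefer the gamble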

Kahneman's recent book Thinking, Fast and Slow (Farrar, Straus & Giroux, 2011) is a survey of his life's work, and it's fascinating stuff. He begins by describing the evidence that there are two modes of thought that we each employ. What Kahneman calls "System 1" is automatic and quick, and is engaged for things like emotional responses, intuitions, snap judgments, or well-practiced associations; "System 2" is more analytical, rational, and conscious, and is engaged for things like calculation or logical argument.

Because engagement of System 2 is effortful, it is often bypassed, and so doesn't provide a check on System 1. Here's an example:
If it takes 5 machines 5 minutes to make 5 widgets, how long will it take 100 machines to make 100 widgets? 
For many people the first answer that comes to mind is 100 minutes. But the correct answer is five minutes: it takes each machine five minutes to make one widget, so 100 machines will take five minutes to produce one widget each, for a total of 100 widgets. The correct answer, five minutes, is a System 2 response; the quick but incorrect response, 100 minutes, is a System 1 response.
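
If you want the rate-based reasoning spelled out, here's a tiny sketch of the arithmetic (mine, not the book's):

    # Each machine makes 1 widget per 5 minutes, no matter how many machines are running
    minutes_per_widget_per_machine = 5
    machines, widgets_needed = 100, 100
    widgets_per_machine = widgets_needed / machines                     # 1 widget each
    time_needed = widgets_per_machine * minutes_per_widget_per_machine
    print(time_needed)  # 5.0 minutes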

Experimenters gave a test that included this problem and two others to Princeton students, and found that 90% of the students made an error on at least one problem. (The other two problems:
A bat and ball together cost $1.10, and the bat costs a dollar more than the ball. How much does the ball cost?
In a pond there is a patch of lily pads that doubles in size every day. If it takes 48 days for the patch to entirely cover the pond, how long would it take to cover half the pond? 
Answers given below.)

But the error rate on this test plunged from 90% to 35% when the problems were presented in a hard-to-read font. Counterintuitively, the harder the problems were to read, the easier they were to answer correctly. The reason? The hard-to-read font forced the students to concentrate, engaging System 2, which is more analytical.

The problem is that it is relatively difficult to engage System 2. We often operate on System 1, and while sometimes the consequences are beneficial—we can sense danger very quickly, for example—sometimes they are dire. As Kahneman writes, "A reliable way to make people believe in falsehoods is frequent repetition, because familiarity is not easily distinguished from truth." Even more frightening,
"...[Y]ou do not even have to repeat the entire statement of a fact or idea to make it appear true. People who were repeatedly exposed to the phrase 'the body temperature of a chicken' were more likely to accept as true the statement that 'the body temperature of a chicken is 144 degrees' (or any other arbitrary number). The familiarity of one phrase in the statement sufficed to make the whole statement feel familiar, and therefore true." (p. 62)
Any resemblance to recent political discourse is entirely intentional on the part of people who know that a repeated lie will continue to be believed even after a factual rebuttal.

It's the primacy of System 1 as our typical mode of thought that leads to phenomena such as:
  • Anchoring and priming, where random suggestions can affect our estimates and judgments; 
  • Formulation and framing effects, where we can come to diametrically opposed decisions depending on how a choice is presented to us; 
  • The Law of Small Numbers, where we wildly overestimate how representative small samples can be; and 
  • What You See Is All There Is, where we focus on the immediate details of a problem and ignore crucial information from its larger context.
If you are at all concerned with how your decisions can be influenced by elements outside your consciousness and your control, I strongly recommend Thinking, Fast and Slow. But it's also one of the most entertaining books I've read in the past year, filled with puzzles, problems, and brain teasers. You'll never look at apparently simple choices in the same way again—and that's a good thing.

--

The ball costs five cents, not ten cents; the pond would be half covered after 47 days, not 24.
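
For anyone who likes to double-check, here's a quick verification of both answers (again my own sketch, not from the book):

    # Bat and ball: if the ball costs $0.05, the bat costs $1.05, and together they cost $1.10
    ball = 0.05
    bat = ball + 1.00
    print(round(ball + bat, 2))  # 1.1

    # Lily pads: the patch doubles daily, so it is half covered one day before it is fully covered
    full_coverage_day = 48
    half_coverage_day = full_coverage_day - 1
    print(half_coverage_day)  # 47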
