I love reading about cognitive fallacies because I feel like I’ve improved as a person after learning about them. On the other hand, my colleague Spencer Burke says “reading about cognitive biases and fallacies makes you feel smarter, but in all likelihood you’ll continue to make the same irrational mistakes over and over anyway.” Spencer is probably correct, but I still find it oddly satisfying to read about cognitive biases and fallacies.
With that caveat out of the way, here’s the “Linda Problem” as proposed by Daniel Kahneman and Amos Tversky in 1983:
Linda is 31 years old, single, outspoken, and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in anti-nuclear demonstrations. Which is more probable:
1. Linda is a bank teller, or
2. Linda is a bank teller and is active in the feminist movement
Give this some thought before continuing.
More than 80% of people given this fact pattern choose option 2 over option 1. That is the wrong answer. Option 1 is more probable because Linda merely has to be a bank teller, whereas Option 2 requires Linda to be a bank teller AND an active feminist.
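The rule at work here is that a conjunction can never be more probable than either of its parts: P(A and B) ≤ P(A). A quick Monte Carlo sketch makes this concrete (the probabilities below are made-up illustrative values, not data about actual bank tellers):

```python
import random

random.seed(42)

# Illustrative, assumed probabilities -- chosen only to make the point.
# No matter what values you pick, "teller AND feminist" can never
# happen more often than "teller" alone, because the first group
# is a subset of the second.
trials = 100_000
teller = 0
teller_and_feminist = 0

for _ in range(trials):
    is_teller = random.random() < 0.05    # assumed P(bank teller)
    is_feminist = random.random() < 0.90  # assumed P(active feminist)
    if is_teller:
        teller += 1
        if is_feminist:
            teller_and_feminist += 1

print(teller_and_feminist, "<=", teller)
```

Even with a 90% chance of being a feminist, the conjunction count can only ever be a fraction of the bank-teller count.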
The Linda Problem is an example of the “Conjunction Fallacy,” which occurs when we think a subset seems larger than the entire set it belongs to. According to the book The Art of Thinking Clearly, the conjunction fallacy occurs because “we have an innate attraction to ‘harmonious’ or ‘plausible’ stories.” The more vivid an example is, the more we tell stories in our heads, and the more likely we are to pick a result that conforms to our story. The Linda Problem, helpfully diagrammed:
Here’s another one; which is more probable:
1. Chicago O’Hare Int’l Airport is shut down due to bad weather, flights are cancelled, or
2. Chicago O’Hare Int’l Airport is shut down, flights are cancelled.
The correct answer is 2, because it only says the airport is shut down, whereas option 1 says the airport is shut down AND that the shutdown is due to bad weather. While bad weather might be the most likely reason for the airport to close, it’s not the only reason (there could be a bomb threat, a plane making an emergency landing, a labor strike, etc.). I struggled with this one – I’ve experienced delayed or cancelled flights due to weather so often (especially at O’Hare) that adding the weather as an explanation seems to make it more likely.
Want more? Here are some real examples that experts got wrong:
In 1982, at an international conference for research, experts – all of them academics – were divided into two groups. To group A Daniel Kahneman presented the following forecast for 1983: “Oil consumption will decrease by 30%.” Group B heard that “A dramatic rise in oil prices will lead to a 30% reduction in oil consumption.” Both groups had to indicate how likely they considered the scenarios. The result was clear: group B felt much more strongly about its forecast than group A did. (Source: The Art of Thinking Clearly)
In 1982, an experiment was done with professional forecasters and planners. One group was asked to assess the probability of “a complete suspension of diplomatic relations between the U.S. and the Soviet Union, sometime in 1983,” and another group was asked to assess the probability of “a Russian invasion of Poland, and a complete suspension of diplomatic relations between the U.S. and the Soviet Union, sometime in 1983.” The experts judged the second scenario more likely than the first, even though it required two separate events to occur. They were seduced by the detail. (Source: The New Yorker)
Here are some other cognitive bias and fallacy IFODs:
Decision-Making and the Resulting Fallacy
Decision-Making Under Uncertainty: The Ellsberg Paradox