The Single Cause Fallacy

Feb 28, 2019


“The S&P 500 Drops 2% as Global Recession Fears Are Stoked by Disappointing Retailer Earnings.”

These sorts of headlines always amuse me. The stock market has millions of participants on any given day, and billions of dollars and hundreds of millions of shares change hands. For every person (or computer) who wants to sell Apple, there is another who wants to buy it. Every person who wants to sell GE has their own reason, as does the purchaser of GE. There are many underlying causes of the movement of each and every stock, and the changes in individual stocks roll up to cause the change in the stock market for the day. Yet headlines scream a single cause.

We prefer single, simple explanations in areas well beyond the stock market. In nearly all aspects of life, we favor simple explanations: in politics, the origins of the universe, personal successes and failures, and the wins or losses of our sports teams (looking at you, Saints fans, as well as Cards fans who attribute losing the ’85 World Series to a bad call by Don Denkinger).

Attributing a single cause to a situation goes by several names, including the single cause fallacy, causal reductionism, and the single perspective instinct. When judging an explanation’s potential veracity, we weigh both its simplicity and its probability, but we attach more weight to simplicity.

In a fascinating research study, UC Berkeley psychologist Tania Lombrozo investigated the interplay of the simplicity of explanations and their probability. In a series of experiments, she presented volunteers with information about diseases that could cause various symptoms. The participants tended to choose single-disease explanations for symptoms over two-disease explanations even when the two-disease explanations were presented as more probable. Thus, Dr. Lombrozo found that while both simplicity and probability play a role in choosing among possible causal explanations, “simplicity plays a privileged role” when we choose between explanations. Data from the study suggested that “simpler explanations are assigned higher prior probability” than more complex explanations, and thus complex explanations must be quite a bit more probable before they are chosen as the likely cause.

Why do we have a bias for simpler explanations? Dr. Lombrozo posits that simpler explanations seem to have better explanatory power than more complex ones. In other words, simpler explanations often tell a more compelling story: they are easier to communicate and usually take less brain power.

Another aspect of the single cause fallacy was highlighted by Hans Rosling in his fantastic book Factfulness. We tend to grasp a simple explanation and apply it broadly – much more broadly than we should. When we have a hammer, everything looks like a nail. “For example, the simple and beautiful idea of the free market can lead to the simplistic idea that all problems have a single cause—government interference—which we must always oppose; and that the solution to all problems is to liberate market forces by reducing taxes and removing regulations, which we must always support. Alternatively, the simple and beautiful idea of equality can lead to the simplistic idea that all problems are caused by inequality, which we should always oppose; and that the solution to all problems is redistribution of resources, which we should always support.”

According to Dr. Rosling, it is easy for us to think like this:

We find simple ideas very attractive. We enjoy that moment of insight, we enjoy feeling we really understand or know something. And it is easy to take off down a slippery slope, from one attention-grabbing simple idea to a feeling that this idea beautifully explains, or is the beautiful solution for, lots of other things. The world becomes simple. All problems have a single cause—something we must always be completely against.

It saves a lot of time to think like this. You can have opinions and answers without having to learn about a problem from scratch and you can get on with using your brain for other tasks. But it’s not so useful if you like to understand the world. Being always in favor of or always against any particular idea makes you blind to information that doesn’t fit your perspective. This is usually a bad approach if you like to understand reality.

Related IFODs: The Linda Problem and the Conjunction Fallacy

Ignorance and the Dunning-Kruger Effect

Dichotomous vs. Dialectical Thinking

