One of the most dynamic areas of psychological research over the past two decades has been decision-making and cognitive bias. Economists like to assume that agents are rational, but researchers have discovered what many non-economists already know: humans are bounded in several ways, in the time they take to make a decision, and in their rationality, awareness, and ethics. It turns out that almost none of us accurately assesses risk. This has implications for how we manage risk (and for your next Vegas trip).
One example of cognitive bias is the "overconfidence bias," described by Daniel Kahneman in his wonderful 2011 bestseller Thinking, Fast and Slow. For example: what is the annual revenue of Exxon? Okay, maybe you don't know the exact number, but could you estimate a range for Exxon's revenue with 90% certainty? You might say, "Well, it's probably somewhere between $10 billion and $50 billion." In this example you'd be right: Exxon happens to bring in $41 billion per year. But it turns out that when we ask a large number of people to estimate such a range, the true number falls outside their range almost half the time, far more often than the 10% a well-calibrated 90% interval would allow. People think they can be more precise than they actually can.
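The "miss rate" here follows directly from the arithmetic of calibration: a true 90% interval should exclude the right answer only 10% of the time, so a ~50% miss rate means the intervals people report are much too narrow. A minimal simulation can illustrate this, under the simplifying assumption (mine, not the author's) that each respondent's estimation error is normally distributed and that overconfidence amounts to reporting an interval half as wide as a calibrated one:

```python
import random

random.seed(0)

TRIALS = 100_000
Z_90 = 1.645  # z-score for a calibrated two-sided 90% interval

def hit_rate(width_factor: float) -> float:
    """Fraction of trials where the true value lands inside the
    reported interval. width_factor = 1.0 models a calibrated 90%
    interval; smaller values model overconfident (too-narrow) ones."""
    hits = 0
    for _ in range(TRIALS):
        # Each respondent's estimate misses the truth by a standard
        # normal error (in units of their own uncertainty, sigma).
        error = random.gauss(0, 1)
        # They report an interval of +/- width_factor * Z_90 sigma.
        if abs(error) <= width_factor * Z_90:
            hits += 1
    return hits / TRIALS

print(f"calibrated interval hit rate:  {hit_rate(1.0):.2f}")  # ~0.90
print(f"half-width interval hit rate:  {hit_rate(0.5):.2f}")  # ~0.59
```

With intervals only half as wide as calibration requires, the true value falls outside the range roughly 40% of the time, close to the "almost half the time" result the survey studies report.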
What makes this result really interesting is that experts do worse than lay people: confidence increases faster than skill. Experts give a narrower range (they might say $30 to $40 billion), but the true number falls outside their range more often than it does for non-experts. Be wary of the certainty of experts!
How do we square this with a body of research, by Gary Klein and others, which demonstrates the amazing intuition that some experts develop? Klein (1998) describes a firefighter who can assess in a second whether the building he has just entered is about to collapse, and he documents several cases where expert intuition saves the day.
In 2009, Kahneman and Klein set out to see if they could reconcile these very different conclusions about whether we can trust experts. Ultimately, they concluded that intuitions are skilled when an expert operates in a predictable environment with frequent feedback. This suggests that we can trust physicians, nurses, athletes, firefighters, police, and other professionals who make decisions and learn very quickly whether each one was good or not. A quarterback discovers quickly whether his decision to throw the short pass during an unexpected blitz was a good idea. With frequent feedback, decision-making becomes skilled enough that instant intuition can work very well. Malcolm Gladwell popularized the 10,000-hour threshold for honing this kind of intuition.
Kahneman and Klein also concluded that intuitions are unskilled for those making long-term forecasts, where it takes a long time to learn whether a decision was correct. This means we should not trust the predictions of political pundits, stock market analysts, or (especially) venture capitalists. They have to wait too long to find out whether they are good, and to distinguish skill from luck. Unfortunately, they are particularly prone to an accelerating sense of confidence, so they tend to think they have become experts when in fact they have not. The lesson: beware of anyone with a long feedback loop. These are the people you are most likely to see on TV and in newspapers.