“It’s a small world, but I’d hate to have to paint it.” —Steven Wright
How confident are you about your ability to judge your own expertise? Do you think you generally know what you are good at and what you are bad at?
Most people suffer from overconfidence bias. They tend to think that they know more than they really do. But surely you, in your infinite wisdom, are not one of these people? Surely you have a good understanding of your own knowledge. Let’s find out, shall we?
On a scale of 1 to 7, how well do you understand how a can opener works? Got a number? Good.
Now, take out a piece of paper and try to draw a diagram of a can opener. If you can’t draw well, write out an explanation, piece by piece.
Take out a piece of paper and do it. (Doing it in your head doesn’t count, and it will only take 30 seconds.) Got a diagram or a written explanation?
If you are thinking “of course I know how a can opener works” then, seriously, try to draw the diagram or write out an explanation.
After you’ve done it, read this description of how a can opener actually works or look at this diagram and re-rate your understanding on a scale of 1-7.
I have tried this and seen other people try it and so far I haven’t seen anyone who has accurately evaluated their own understanding of how a can opener works.
Everyone is overconfident.
A can opener is not that complicated. You’ve used one hundreds, if not thousands, of times in your life. What about more complex topics?
Leonid Rozenblit and Frank Keil, in a 2002 paper, showed that people believe they understand familiar manufactured objects (such as can openers) and natural phenomena (such as tides) much better than they actually do.
The researchers had subjects rate their understanding of various objects and phenomena and then asked them to give an explanation. After that, the subjects rated their own understanding again. Their second ratings were much lower.
“Most people feel they understand the world with far greater detail, coherence, and depth than they really do… [They] wrongly attribute far too much fidelity and detail to their mental representations because the sparse renderings do have some efficacy and do provide a rush of insight.”
This is known in the psychological literature as the overconfidence effect, or overconfidence bias.
Overconfidence has been called the most “pervasive and potentially catastrophic” of all the cognitive biases to which human beings fall victim.1
In the case of a can opener, it’s kind of dumb. But, more broadly, it has resulted in many lawsuits, strikes, wars, and stock market bubbles and crashes.
Most of the systems we care about, from our jobs, businesses, and investments to economies, relationships, and health, are many times more complicated than a can opener, which magnifies the effect.
Reality has a Surprising Amount of Detail
The can opener effect is the product of a simple fact: Reality has a surprising amount of detail.2
It is not merely that reality has a lot of detail, but that we constantly underestimate the amount of detail. It is continually surprising.
Everyone who compares their explanation or diagram of a can opener to how it actually works is surprised by the level of detail it takes to describe one accurately.
Now, not understanding how a can opener works probably doesn’t affect your life much. However, not understanding other things can get us in trouble.
Consider the impact on politics. “Political Extremism Is Supported by an Illusion of Understanding” (Fernbach et al., 2013) asked subjects to explain how proposed political programs they supported would actually work. When participants tried to explain the mechanics, they realized they didn’t really have a very good idea of how the programs would work.3
This decreased their certainty that they would work. The subjects then expressed more moderate opinions and became less willing to make political donations in support of these programs. The Can Opener Effect causes more political polarization!
It causes a lot more than that. If a legal plaintiff is overconfident that they will win, they will file an unnecessary lawsuit. If a nation is overconfident that it will win a war, it will fight more wars. If an investor is too sure of their estimate of an asset’s value, they will trade too much.
The Can Opener Effect implies that everyone has an oversimplified model of how everything around them works. There’s nothing inherently wrong with this. We must oversimplify the world around us to be able to function in it. There is simply too much detail for any one person to process and comprehend.
The problem is that we don’t think our models of how the world works are oversimplified. We think they are accurate. This creates hidden risk.
Hidden Risk: Seeing Like a State
In the late 18th century, German states such as Prussia and Saxony started growing “scientific forests” so they could more easily track and harvest timber.
These governments wanted to be able to forecast and plan how much timber needed to be harvested each year to provide enough firewood for their citizens and enough timber for their ships.
The underbrush was cleared since it was hard to quantify and did not produce usable timber. The number of species in the forest was reduced, often to a single species, because that was easier to track.
The result was plantings of a single species of tree, done in straight rows and grids on large tracts of land. It looked more like a tree farm than a forest.
The first plantings by the government did well because they were able to use the nutrients in the soil that had accumulated over centuries. This created an initial surge in the amount of timber available to the German industry.
This increased the German central planners’ confidence in the plan working. They wanted to do more of a good thing, so they built more scientific forests.
Narrator: It wasn’t actually a good thing…
The clearing of the underbrush reduced the diversity of the insect, mammal, and bird populations that were essential to the soil-building process. Since there was only one species of tree, pests and diseases could easily move from tree to tree, infecting the entire forest. All of these issues came together to result in massive forest death across the country.
This is the result of the Can Opener Effect on the world around us. By underestimating the amount of detail that made the forests robust, the German planners created hidden risk.
The natural forests had not been producing the consistent quality and quantity of timber the central planners hoped for, and so they had tried to improve them.
By ignoring the detailed reality of how a forest produces trees, they made the quantity of timber go up in the short term, but they introduced a risk of massive collapse by overestimating their own understanding.
This actually reduced the total timber available in the long run. They would have been better off doing nothing. The issue was compounded by the fact that they believed the timber supply was more stable and predictable than it really was.
The problem with artificially suppressing the volatility and diversity in many systems is not just that the system tends to become extremely fragile and full of hidden risk. It is that, at the same time, it exhibits no visible risk.
The year before the forest yields completely collapsed would have been one of the best years of timber production in history. All signs pointed to even better years ahead. Every year had been better than the last since the new scientific forest project began.
This meant new factories had been built to construct ships with that timber. Home construction may have started to use that timber. Many people’s livelihoods were counting on timber production continuing to go up, so the effect of Waldsterben was far more devastating than it would have been under the prior system, where no one expected timber production to be consistent and growing.4
In the short term, there is often a tradeoff between robustness and efficiency.
The scientific forests were initially more efficient. For the first few decades of their use, scientific forests produced more timber, more reliably than the older natural forests.
However, and this is the crucial thing, in the long run, they were both less robust and less efficient.
You can improve the gas mileage of your car by getting rid of all the protective equipment that adds weight to it. The dangerous thing, though, is that it’s not always so obvious what you’ve done. You may drive your more efficient car for years, increasingly certain that you made a good decision, until one day things go bad.
By sacrificing robustness for efficiency in the short term, you get both less robustness and less efficiency in the long run.
This is made worse because the effects are nonlinear: they get worse at an increasing rate. A 7% reduction in the amount of timber available in a given year is a problem that can be adapted to. A 70% reduction in the amount of timber available is far more than ten times as bad. It is a society-ending catastrophe: people can’t heat their homes and freeze to death while they lose their jobs and industry grinds to a halt.
Any improvement that makes something 10% more efficient in the short term but introduces the risk of complete ruin, like a 70% reduction in timber yields, is a winning bet in the short term but a losing bet in the long run.
It is akin to playing Russian Roulette every year of your life for a $1 million prize each time you survive. While you might get lucky the first year, the second, and the third, the odds of making it to your 50th birthday are about 0.016%.
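To make the arithmetic concrete, here is a minimal Python sketch. The $1 million prize comes from the example above; the six-chamber revolver and the roughly 48 annual rounds before a 50th birthday are my assumptions chosen to match the ~0.016% figure, since the text doesn’t spell out when the game starts:

```python
# Russian roulette as a short-term-winning, long-term-losing bet.
# Assumptions (illustrative): 6-chamber revolver, one round per year,
# a $1 million prize per survival, ~48 rounds before a 50th birthday.

def survival_probability(rounds: int, chambers: int = 6) -> float:
    """Probability of surviving every round played."""
    return (1 - 1 / chambers) ** rounds

def expected_prize_per_round(prize: float = 1_000_000, chambers: int = 6) -> float:
    """Expected payoff of a single round, ignoring the value of your life."""
    return prize * (1 - 1 / chambers)

print(f"Survive 1 round:   {survival_probability(1):.1%}")   # ~83.3%
print(f"Survive 10 rounds: {survival_probability(10):.1%}")  # ~16.2%
print(f"Survive 48 rounds: {survival_probability(48):.4%}")  # ~0.0158%
print(f"Expected prize per round: ${expected_prize_per_round():,.0f}")  # $833,333
```

Each round looks attractive in isolation, yet the odds of surviving all of them shrink geometrically, which is exactly the structure of a bet that wins in the short term and loses in the long run.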
There is a fairly predictable pattern to how overconfidence bias plays out.
- The Can Opener Effect causes people to gain overconfidence in a simplified model.
- Good early results of using that model lead to increased confidence to use leverage or concentration in that approach to increase efficiency.5
- Increased leverage or concentration results in a hidden risk of ruin. Because the risk is hidden, like the missing seatbelts in a car that only become obvious in a crash, increased leverage or concentration can be used for a long time.
- Eventual collapse leads to less efficiency in the long run.6
This series of events has played out many times in human history.
The Chernobyl nuclear reactor explosion was a result of one employee authorizing a test that they didn’t fully understand. No tests had ever gone wrong in the past, so why worry?
Both the American Civil War and World War I were in part a result of all sides feeling overconfident that they could win the war quickly, gaining glory and prestige with minimal casualties. In both cases, the war turned out to be the deadliest its participants had ever fought.
It is also something that plays out in our everyday lives. Investors saving for their retirement often get overconfident and fail to appropriately diversify their portfolios. Many people planning to retire in 2008 or 2009 ended up having to work another decade or more because their savings had been cut in half just as they were about to retire.
Business managers and entrepreneurs are overconfident about the odds of a project succeeding and the magnitude of the success. They over-invest in the project and lose more money than if they’d properly estimated the odds, sometimes destroying the company.
So how can individuals help to prevent mistakes resulting from overconfidence bias? There’s probably not a perfect answer, but here are some starting points.
How to Manage Overconfidence Bias
Embrace Meta-Rationality
The first, and most obvious solution to overconfidence bias, is just to understand that we all tend to be overconfident. If you assume reality has a surprising amount of detail and you rate your understanding of some subject at a 4, then it is probably a 2 and you really don’t know very much at all about it.
As long as you make that calibration, it’s probably fine. I don’t know anything about heart surgery. More importantly, I know that I don’t know anything about heart surgery and so I am most definitely not going to attempt heart surgery.
By understanding the limits of our own understanding, we can embrace what economist Tyler Cowen calls meta-rationality. True rationality means recognizing the limits of our own knowledge: knowing when to defer to experts, which experts to defer to, and when to simply admit that something is impossible for anyone to know.
This is challenging because we are often incentivized to be overconfident. In Thinking, Fast and Slow, psychologist Daniel Kahneman gives an account of the social pressure on doctors to act as if they understand everything:
Generally, it is considered a weakness and a sign of vulnerability for clinicians to appear unsure. Confidence is valued over uncertainty and there is a prevailing censure against disclosing uncertainty to patients.
An unbiased appreciation of uncertainty is a cornerstone of rationality—but that is not what people and organizations want. Extreme uncertainty is paralyzing under dangerous circumstances, and the admission that one is merely guessing is especially unacceptable when the stakes are high. Acting on pretended knowledge is often the preferred solution.
Those who don’t succumb to this social pressure are often replaced by more confident competitors, who are better able to gain the trust of clients by promising results more confidently. Sadly, this often ends poorly for everyone: eventually the hidden risk arising from overconfidence catches up and blows things up.
This is perhaps even more pronounced with the advent of social media. No one ever went viral for expressing an uncertain opinion about who is going to win an election; you go viral by making strong, confident statements that align with people’s preconceived beliefs.
That’s why it’s important to allow for uncertainty. The doctor, lawyer, or investment advisor who confidently proclaims they know exactly how to solve your problem five minutes into your meeting is much more likely to be overconfident than brilliant.
The one who asks many probing questions and then responds with qualifying statements like “I tend to think that for persons in your position, there are three choices with different tradeoffs” probably has a good sense of the limits of their own understanding.
Seek Fingerspitzengefühl
For each individual, there are specific areas where the Can Opener Effect doesn’t exist and they are not overconfident. If anyone reading this happens to be a product designer for a company that makes can openers, they probably came up with a pretty good diagram for a can opener.
Overconfidence bias doesn’t mean that we don’t know anything, just that we tend to overestimate our own knowledge. We all have some areas where we have a deep knowledge of how things work.
Military strategist John Boyd referred to this ability as Fingerspitzengefühl, a German word that roughly translates to “fingertip feeling,” or intuitive knowledge. Fingerspitzengefühl was considered the key attribute behind the early success of German tank commanders in World War II. In the North African campaign, British soldiers ascribed an almost god-like quality to the German tank commander Erwin Rommel. Rommel, known as the Desert Fox, seemed to always know what the British were going to do and was one step ahead of them. For a very specific domain, tank battles in North Africa, Rommel understood (almost) all the details.
Having a successful career usually amounts to acquiring Fingerspitzengefühl in a very specific domain. Indeed, that is the whole point.
Perhaps no one exemplifies this fingertip feeling more than Red Adair. Before the Gulf War of 1990, Adair led the world’s most experienced team at capping wellhead fires on oil rigs. Each fire presented new problems that required a mix of experience and improvisation. Hundreds of factors, from the political climate to the wind to the chemical makeup of that particular oil well, all played into how to deal with the fire. The reality of capping wellhead fires has a surprising amount of detail, so there was no handbook you could use to train someone to sense what Adair understood on an intuitive or tacit level.
For a very specific domain, capping wellhead fires on oil rigs, Red was your man. He had a monopoly on capping wellhead fires and, as a result, could name his price.
Following Red’s example, one way to overcome the can opener effect is just to learn about how the metaphorical can opener (your field) actually works.
However, even in areas where you do have a level of intuitive knowledge, it’s probably wise to use a margin of safety.
Use a Margin of Safety
Margin of safety is a principle from investing: build a buffer into your decisions so that you can be wrong and still come out okay. If you buy a company that is trading at less than its liquidation value (how much it would be worth if you sold off all of its assets, like buildings and equipment), then it’s much harder to lose money. The worst-case scenario is that it goes bankrupt and you get the liquidation value.
The concept is a lot more broadly applicable though. Had the forest planners used a margin of safety, they might have only tried their techniques on a few forests and made smaller changes.
When an engineer needs to design a bridge that can support up to 100 cars at a time, they actually plan for it to hold some multiple of that amount, 200 or 300, as a margin of safety.
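As a toy sketch in Python (the rated load, capacity, liquidation value, and safety factors below are hypothetical numbers, not engineering or investment guidance), a margin-of-safety check is just an explicit comparison between the load you expect and the capacity you actually build or the price you actually pay:

```python
def has_margin_of_safety(capacity: float, expected_load: float,
                         safety_factor: float = 2.0) -> bool:
    """True if capacity covers the expected load with room to spare."""
    return capacity >= expected_load * safety_factor

# Hypothetical bridge: rated for 100 cars, built to carry 300.
print(has_margin_of_safety(capacity=300, expected_load=100))  # True

# Hypothetical stock: estimated liquidation value of $60M against a $40M
# purchase price, judged against a more modest 1.5x cushion.
print(has_margin_of_safety(capacity=60_000_000, expected_load=40_000_000,
                           safety_factor=1.5))  # True
```

The point is not the particular threshold; it is that the buffer is chosen explicitly rather than assumed away.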
One of the most common ways overconfidence bias shows up is the planning fallacy, the tendency for people to overestimate their rate of work, or to underestimate how long it will take them to get things done. It is strongest for long and complicated tasks and disappears or reverses for simple tasks that are quick to complete. This makes sense if we assume reality has a surprising amount of detail: long and complicated tasks have far more detail that we tend to overlook.
A relatively simple way to account for this with a margin of safety is to double the amount of time you think it will take. If this project takes twice as long as you expect and costs twice as much, will it still be a good investment? If so, then there’s probably enough margin of safety to account for the surprising details you will find along the way.
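Here is a minimal sketch of that doubling heuristic in Python (the project cost and value are hypothetical, and doubling is just the rule of thumb from the paragraph above, not a precise model):

```python
def still_worth_doing(expected_cost: float, expected_value: float,
                      buffer: float = 2.0) -> bool:
    """Stress-test a plan: assume it costs (and takes) `buffer` times the
    estimate, then ask whether it would still be a good investment."""
    return expected_value > expected_cost * buffer

# Hypothetical project: $50k estimated cost against $120k of expected value.
# Even at double the cost it still clears the bar, so there is some margin.
print(still_worth_doing(expected_cost=50_000, expected_value=120_000))  # True
```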
Conclusion
We all suffer from overconfidence bias to some extent. We overestimate the extent to which we understand reality. This leads us to hold more extreme and overly simplified beliefs about how the world works.
In truth, reality has a surprising amount of detail lying below the surface. Failure to appreciate this often results in the creation of hidden risk. This process follows a predictable four-stage pattern:
- The Can Opener Effect causes people to gain overconfidence in a simplified model.
- Good early results of using that model lead to increased confidence to use leverage or concentration in that approach to increase efficiency.
- Increased leverage or concentration results in a hidden risk of ruin. Because the risk is hidden, like the missing seatbelts in a car that only become obvious in a crash, increased leverage or concentration can be used for a long time.
- Eventual collapse leads to less efficiency in the long run.
This is true across many domains, from forestry to investing to starting a new business. Perhaps the most important way to manage it is merely to know that the effect exists and be mindful of it. As the quip often attributed to Mark Twain goes, “It ain’t what you don’t know that gets you into trouble. It’s what you know for sure that just ain’t so.”
Beyond that, we can:
- Embrace meta-rationality by knowing when to defer to experts and which experts to defer to.
- Expand our intuitive or tacit knowledge, the areas where we truly understand all the possible levels of detail.
- Use a Margin of Safety to account for areas where we may be overconfident.
Footnotes
1. Plous, Scott (1993). The Psychology of Judgment and Decision Making. McGraw-Hill Education. ISBN 978-0-07-050477-6.
2. Credit to John Salvatier for this phrasing and his explanation of it.
3. This is a wonderful idea for how to deal with people screaming their political opinions. Simply ask them to explain, in great detail, precisely how their solution to the problem would work.
4. One of the reasons old-growth forests are hard to harvest is that the different types of trees grow in isolated groves. This means if you want to cut down birch trees, you have to either cut down lots of other trees you don’t want or go around them in some way. This seems inefficient, so on the surface it makes sense that planting just the trees you want in straight rows increases efficiency.
But the groves had at least one major benefit: they helped to prevent “Waldsterben” by reducing the spread of disease. Since trees tend to naturally grow in isolated groves, a disease may wipe out one grove but is less likely to spread to the others.
The risk of any individual grove dying may be higher in a natural-growth forest, but the risk of all the groves dying is lower.
5. In the forest example, “concentration” would look like cutting down more and more old-growth forests and replacing them with scientific forests. If scientific forests only make up 2% of the entire forest industry, then a failure isn’t that big a deal. Once they make up 70%, it’s a very, very big deal.
6. One reason people make this tradeoff is that oftentimes “the long run” is not their problem. The CEO or private equity firm that follows this pattern may do fine as long as they can pass the ticking time bomb on to a greater fool before it blows up, a form of moral hazard.