Overconfident much?

Dec 27, 2024

A few years back, YouGov asked 2,000 Britons and some 1,200 US adults which animals, if any, they could beat in a fight if they were unarmed (the respondents, not the animals).

Although the list starts with relatively tame creatures (a rat, a house cat, a medium-sized dog), it moves on to tougher opponents, and the proportion of respondents who believe they would win naturally drops. So, for instance, 72% of the US respondents liked their chances against a rat, but only 23% believed they could defeat a large dog. Fair enough. After all, we would expect a drop; in fact, we would expect a much, much faster drop.

Image credit: YouGov.

Because 15% of US respondents believed they could beat a king cobra! And 9% backed themselves against a crocodile!! But, wait, it gets better; 8% saw themselves beating the crap out of a gorilla, a lion, or an elephant. Yep, unarmed. They can take on some of the best in the animal kingdom. Impressive, right? Or are these people simply wildly overconfident?

We are all overconfident

Overconfidence permeates our thinking in all sorts of settings. Take budget overruns, for instance. The Denver airport ended up costing 200% of its original price tag, the Concorde supersonic jet, pride and joy of Anglo-French collaboration, came in at 11 times its planned cost, and the Suez Canal cost 19 times its budget (see Flyvbjerg, 2014, referenced below).

But that's not all; overconfidence also affects project delivery times, risk assessments (say, where we invest our money), and, well, just about everything we do. So much so that overconfidence has been blamed for the loss of both space shuttles, the 2008 subprime crisis, the sinking of the Titanic, and the nuclear accident at Chernobyl (see Moore, 2015, referenced below).

Discussing this topic in class, I have participants do a little self-assessment. I ask them ten numerical questions. Their job is to give two estimates such that they're 90% confident the correct value lies in the interval bounded by those two values.

So, for instance, I might ask them the year in which penicillin was discovered, and they would have to give me two dates so that they're 90% confident that Alexander Fleming's discovery occurred in that range. Try it out for yourself; write down your two dates …

Borrowing from a similar test on clearerthinking.org called "Calibrate Your Judgment," I give them points if their interval includes the real value. In fact, they get more points the smaller the interval. But I also warn them that they will naturally tend to make the interval too narrow.

Image credit: clearerthinking.org.

And that's important because, despite my warning, virtually all respondents make their intervals so small … that they end up missing the real answer. (This is a type of overconfidence sometimes called overprecision; see Ferretti, 2023, referenced below.)
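For the concretely minded, here's a minimal sketch, in Python, of how such a quiz might score an interval: a hit earns more points the narrower the interval, and a miss earns nothing. The width-based formula is my own toy rule for illustration, not clearerthinking.org's actual scoring.

```python
# Toy scoring rule for an interval-calibration quiz (my own sketch,
# not clearerthinking.org's actual formula): a hit earns more points
# the narrower the interval; a miss earns zero.

def score_interval(low: float, high: float, truth: float) -> float:
    """Score one interval estimate against the true value."""
    if not (low <= truth <= high):
        return 0.0                     # miss: overprecision in action
    width = max(high - low, 1e-9)      # guard against zero-width intervals
    return min(100.0, 100.0 / width)   # narrower hit => more points

# Example question: "In what year did Apollo 11 land on the Moon?" (1969)
print(score_interval(1960, 1980, 1969))   # hit, width 20 -> 5.0 points
print(score_interval(1968, 1970, 1969))   # hit, width 2  -> 50.0 points
print(score_interval(1971, 1975, 1969))   # miss          -> 0.0 points
```

Notice the tension such a rule creates: the narrow interval pays much better, but only if it still contains the truth.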

Why is that? Well, one explanation is that we can't really afford to give too large an interval. If your boss asked for an estimated cost for your proposed project and you came back with, "somewhere between 10 bucks and 2 billion dollars," you wouldn't exactly set yourself up for success. In fact, one could argue that decision making under uncertainty is the art of balancing between humility and assertiveness; between protecting against risk and taking some risks; between being underconfident and being overconfident.

So, whenever we face a complex question, we feel that our answer needs to add value. Too large an interval, and it adds no value. So we reduce the interval. The problem is we "add" so much value that we go from being roughly right to precisely wrong.

Overconfidence, meet the Dunning-Kruger effect

So we frequently tend to overestimate our abilities, especially for harder tasks (see Pulford, 1997, referenced below). This has a name, the Dunning-Kruger effect, after David Dunning and Justin Kruger, who showed in a 1999 study (referenced below) that people's skill in a specific domain tends to be inversely related to their self-assessment of that skill. That is, people who are grossly incompetent at something tend to wildly overestimate their competence, while skilled people tend to underestimate theirs. But don't take my word for it.

Media credit: John Cleese on stupidity.

So what?

Debiasing is hard, and reducing overconfidence is no exception. Researchers have tried a bunch of things, from varying the elicitation presentation format to encouraging the consideration of more information, warning against the bias, and providing immediate feedback (for a review, see Ferretti, 2023, referenced below). As is often the case with debiasing efforts, results are mixed. However, some techniques show some promise. One of those is to ask respondents why their chosen answer might be incorrect (see Koriat, 1980, referenced below).

In a recent article, Garrett Lane Cohee and Cora Barnhart also proposed using pre-mortems ("assume that we implement the strategy we're considering, that we're now five years in the future, and that it has failed miserably; what happened?") and reference-class forecasting ("with other, comparable projects, what's the success rate?").

For my part, I've enjoyed using clearerthinking.org's "Calibrate Your Judgment" to see how far off I am when making predictions with a 90%, an 80%, or a 70% confidence level (the self-assessment lets you choose your target confidence). The answer is: pretty far off, by the way!
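If you'd rather track your own calibration outside the app, the bookkeeping is simple: for each target confidence level, compare the fraction of your intervals that actually contained the truth with the level you were aiming for. Here's a minimal sketch; the record format and the sample questions are my own, for illustration.

```python
from collections import defaultdict

# Each record: (target confidence, interval low, interval high, true value).
predictions = [
    (0.90, 1960, 1980, 1969),   # Apollo 11 Moon landing: hit
    (0.90, 1805, 1810, 1815),   # Battle of Waterloo: miss (too narrow!)
    (0.70, 8000, 9000, 8849),   # height of Everest in meters: hit
]

hits, counts = defaultdict(int), defaultdict(int)
for target, low, high, truth in predictions:
    counts[target] += 1
    hits[target] += low <= truth <= high   # a True hit counts as 1

for target in sorted(counts):
    rate = hits[target] / counts[target]
    verdict = "overconfident" if rate < target else "well calibrated (so far)"
    print(f"target {target:.0%}: hit rate {rate:.0%} -> {verdict}")
```

If your 90% intervals contain the truth only half the time, as in the sample data above, you're overprecise, exactly like my class participants.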

But what would really make my day would be to meet the two percent who answered that YouGov survey saying they felt confident taking on an elephant but balked at fighting a grizzly bear. I just want to see the look on their faces when they explain, "now, then, a grizzly bear; let's be reasonable."

Oh, and Fleming discovered penicillin in 1928.

… and just because this is so much fun …

In a 2019 YouGov survey of 17,000 UK adults, 12% of male respondents thought they could score a point off Serena Williams. That's a survey of the general population, not even just tennis players. And, yup, only 3% of female respondents thought the same.

In a 2023 YouGov survey of 20,000 US adults, almost half (46%) of male respondents were either somewhat confident or very confident that they could safely land a passenger airplane in an emergency, relying only on the assistance of air traffic control. ("Only" 20% of female respondents thought the same.) That females tend to be significantly less overconfident than males has also been reported in research (see Pulford, 1997, referenced below).

I suspect that some of those respondents saw a movie once and thought it was a documentary. Image credit: Airplane!, Paramount Pictures, 1980.

Of course, this is wildly unrealistic, as explained by people who know a bit about the subject, such as Brett Venhuizen, who teaches aviation at the University of North Dakota and observes, "There are a lot of challenges for somebody who has no flight experience, ranging from entering the flight deck to figuring out how to talk to air traffic control, maneuver the airplane and navigate to the airport they plan to land at." Among the countless challenges you'd face, consider just one: setting the radio to the international emergency frequency. Any guess what that might be? (It's 121.5 MHz; source.)

Overconfidence is also present in many other settings; driving is one of them. There's ample evidence that drivers are overconfident; in fact, do you know anyone who thinks they're a below-average driver? In one study, half the respondents rated themselves as better than 81% of all drivers (see Wohleber, 2016, referenced below).

To find out more

Moore and Healy distinguish three types of overconfidence (see Moore, 2015, p. 183, referenced below):

  1. Overestimation is thinking that you're better than you are,

  2. Overplacement is an exaggeration of the degree to which you are better than others, and

  3. Overprecision is the excessive faith that you know the truth.


Arkes, H. R., et al. (1987). "Two methods of reducing overconfidence." Organizational Behavior and Human Decision Processes 39(1): 133-144.

Cohee, G. L. and C. M. Barnhart (2024). "Often wrong, never in doubt: mitigating leadership overconfidence in decision-making." Organizational Dynamics 53(3): 101011.

Dunning, D. (2011). The Dunning–Kruger effect: On being ignorant of one's own ignorance. Advances in Experimental Social Psychology. Elsevier. 44: 247-296.

Ferretti, V., et al. (2023). "Testing the effectiveness of debiasing techniques to reduce overprecision in the elicitation of subjective continuous probability distributions." European Journal of Operational Research 304(2): 661-675.

Flyvbjerg, B. (2014). "What you should know about megaprojects and why: An overview." Project Management Journal 45(2): 6-19.

Koriat, A., et al. (1980). "Reasons for confidence." Journal of Experimental Psychology: Human Learning and Memory 6(2): 107.

Kruger, J. and D. Dunning (1999). "Unskilled and unaware of it: how difficulties in recognizing one's own incompetence lead to inflated self-assessments." Journal of Personality and Social Psychology 77(6): 1121.

Moore, D. A., et al. (2015). "Overprecision in judgment." The Wiley Blackwell Handbook of Judgment and Decision Making: 182-209.

Pulford, B. D. and A. M. Colman (1997). "Overconfidence: Feedback and item difficulty effects." Personality and Individual Differences 23(1): 125-133.

Wohleber, R. W. and G. Matthews (2016). "Multiple facets of overconfidence: Implications for driving safety." Transportation Research Part F: Traffic Psychology and Behaviour 43: 265-278.

Image credit: ra2 studio