
Don’t over-generalize your expertise

September 29, 2011

My good friend Paula told me how she once visited a friend who was undergoing chemotherapy at the hospital. She found him lying in bed, fiddling with the dial of his intravenous drip. He was in pain and wanted to increase the dose of painkiller the drip delivered. Paula offered to call his nurse, to which he answered, “Paula, I’m a member of the National Academy of Engineering; surely I can figure out how this machine works.”

In his excellent book, The Halo Effect, my IMD colleague Phil Rosenzweig warns about various biases that cloud our judgment as managers and decision makers. One of them is, well, the halo effect: our tendency to let a positive impression of someone or something in one area color how we judge them in other areas.

It’s not because you’re good somewhere that you’re good everywhere

Sometimes your knowledge and skills will transfer from your subject of expertise to another. Just as the aerobic capacity that, say, a cyclist develops on the bike might also help her run well, our use of logic and evidence in solving problems will transfer well from one subject matter to another.

But sometimes knowledge and skills don’t transfer well. An expert cardiologist’s opinion on whether oscillations in the sun’s magnetic field have an impact on climate isn’t necessarily more accurate than yours or mine (apologies to you, dear reader, if you are an expert in astronomy). In fact, an expert cardiologist might not be more knowledgeable about mental illnesses than most of us.

It’s important to recognize these limitations because they have a direct impact on our thinking as a whole. People who are used to being in a position of authority—such as a college professor teaching a subject in which they are a world expert—tend to get used to being a power figure. Place such a person in a brand-new situation, and they might display the same authority: speaking with the same assurance and silencing their critics with the same zeal as they would when dealing with their subject of expertise. But here they don’t have the credentials to back it up.

Know your limitations – when in doubt, err on the side of caution

To illustrate, let’s put this in a 2×2 matrix. One axis is reality, the other your perception: for any given subject, you’re either knowledgeable or you’re not, and you either believe you are or you don’t. If you know your place, whether as knowledgeable or as ignorant, you’re safe and well calibrated (a green box). If you are knowledgeable but unaware of it (an orange box), it’s a shame because you aren’t fully using your abilities, but it’s not necessarily dangerous. The really dangerous case is being ignorant without realizing it (the red box). Do yourself a favor: don’t be that guy.
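The 2×2 can be sketched as a small function, purely as a hypothetical illustration: the function name and box labels are mine, not from the article.

```python
def calibration(knows: bool, thinks_knows: bool) -> str:
    """Classify calibration on a subject: reality vs. self-perception."""
    if knows and thinks_knows:
        return "green: well calibrated (aware expert)"
    if not knows and not thinks_knows:
        return "green: well calibrated (aware novice)"
    if knows and not thinks_knows:
        return "orange: underused expertise, a shame but not dangerous"
    # Ignorant but convinced otherwise: the red box.
    return "red: dangerous overconfidence"

print(calibration(knows=False, thinks_knows=True))
```

Only one of the four quadrants is truly hazardous, which is why the safe default in a new domain is to assume you sit on the "ignorant" row until the evidence says otherwise.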

Differentiate hard evidence from guesses

So whenever you’re approaching a new situation, it’s probably healthy to use a great deal of caution. This starts by doubting your intuition, particularly on a new subject, and by remembering where your expertise actually lies.

A practical way to do this in our approach is to look at the analysis you develop for each branch of your question map and clearly label which parts are based on empirical evidence and which are only guesses. In fact, when making recommendations to your boss, spell it out explicitly. This is a form of intellectual honesty that might come across as surprising, in a day and age where people tend to label themselves experts on anything and where looking assured is half the battle, but hopefully it will be refreshing and well received. Otherwise, you might consider changing your boss.
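Labeling each branch of a question map as evidence-backed or a guess could look something like this; the data structure and sample claims are hypothetical, meant only to show the idea of surfacing every unsupported claim before making a recommendation.

```python
from dataclasses import dataclass, field

@dataclass
class Branch:
    """One node in a question map: a claim plus its support."""
    claim: str
    support: str  # "evidence" or "guess"
    children: list = field(default_factory=list)

def guesses(branch):
    """Yield every claim in the map that rests only on a guess."""
    if branch.support == "guess":
        yield branch.claim
    for child in branch.children:
        yield from guesses(child)

# Illustrative map: the root recommendation and its supporting branches.
root = Branch("We should enter market X", "guess", [
    Branch("Market X is growing 10% a year", "evidence"),
    Branch("Competitors will not react", "guess"),
])

for claim in guesses(root):
    print(claim)
```

Walking the map this way makes the honesty explicit: the recommendation goes to your boss with its guesses flagged rather than buried.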

References:

Ariely, D. and G. Loewenstein (2006). “The heat of the moment: The effect of sexual arousal on sexual decision making.” Journal of Behavioral Decision Making 19(2): 87-98.

Chevallier, A. (2016). Strategic Thinking in Complex Problem Solving. Oxford, UK, Oxford University Press, p. 141.

Rosenzweig, P. (2007). The halo effect:… and the eight other business delusions that deceive managers, Free Press.

Simonsohn, U. (2007). “Clouds make nerds look good: Field evidence of the impact of incidental factors on decision making.” Journal of Behavioral Decision Making 20(2): 143-152.
