My good friend and mentor, stochastic processes über-guru Pol Spanos, has many wise sayings. One is: Intuition is a good servant but a terrible master. You are terribly flawed. But don’t take it personally, dear reader, we all are, suffering from countless biases that give us too much self-confidence or prevent us from integrating new evidence. So we need to follow the Watson rule and check our assumptions.

Check your assumptions

A report by the National Research Council of the National Academies looked into this and noted that intuition can be a good tool when “(1) the environment is predictable (so what happened previously is a good predictor of what will be likely to happen again); and (2) the person has had the ‘opportunity to learn the regularities of the environment’ through repeated exposure and feedback.” Unfortunately, these conditions rarely exist, so you should know when, and when not, to rely on your intuition.

To be clear, intuition is needed even in evidence-based analyses; the point is to use it judiciously. For instance, following your intuition might be good for setting an initial game plan—say, deciding the order of priority in which you’ll test various hypotheses. But you shouldn’t use it in lieu of empirical evidence when assessing the validity of those hypotheses. It can’t replace a full-blown analysis.

The corollary is that you need to be ready to say that you are wrong. A central idea in our problem-solving approach is that you should adopt a Bayesian mindset—updating your thinking in light of new evidence. An exemplary problem solver isn’t one who has the good instinct to pick great priors but one who can continually check her prior and adapt when new evidence warrants it. You must have the courage to say, “OK, I thought I had a good prior, but evidence dictates that I change my mind. I was wrong. Great, I’ve learned something! Let’s integrate this information and move on.” This isn’t easy, but it’s liberating to frame it in a positive light: either I’m right or I learn.
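To make the Bayesian mindset concrete, here is a minimal sketch of the underlying arithmetic—Bayes’ rule—with illustrative numbers chosen for this example (they are not from the text): you start fairly confident in a hypothesis, then observe evidence that is more likely if the hypothesis is false, and your confidence should drop accordingly.

```python
def bayes_update(prior, p_e_given_h, p_e_given_not_h):
    """Posterior probability of hypothesis H after observing evidence E.

    prior            -- P(H), belief before seeing the evidence
    p_e_given_h      -- P(E | H), how likely the evidence is if H is true
    p_e_given_not_h  -- P(E | not H), how likely it is if H is false
    """
    numerator = prior * p_e_given_h
    marginal = numerator + (1 - prior) * p_e_given_not_h
    return numerator / marginal

# Start confident (prior = 0.8), then observe evidence that is twice as
# likely if the hypothesis is false (0.6 vs. 0.3).
posterior = bayes_update(prior=0.8, p_e_given_h=0.3, p_e_given_not_h=0.6)
print(round(posterior, 2))  # prints 0.67 -- the evidence lowers our confidence
```

The point of the exercise: a good prior is not the goal; honestly running the update, even when it moves against you, is.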

This evidence-based approach, in turn, is best achieved not by aiming to confirm your hypotheses but by aiming to shoot them down—looking for disconfirming evidence. As Canadian philosopher Paul Thagard puts it, in the end, “an explanatory hypothesis is accepted if it coheres better overall than its competitors.”

Our intuitive powers are outstanding in some respects but incredibly limited in others. Sometimes they’re right, sometimes they aren’t. You can’t really control them, but you can control how you use them, especially when dealing with unknown quantities. So check your thinking periodically and maintain a fair amount of skepticism toward your intuition.


Chevallier, A. (2016). Strategic Thinking in Complex Problem Solving. Oxford, UK, Oxford University Press, pp. 100–102.

Fischhoff, B. and C. Chauvin (2011). Intelligence Analysis: Behavioral and Social Scientific Foundations. Washington, DC, National Academies Press.

Thagard, P. (1989). “Explanatory coherence.” Behavioral and Brain Sciences 12(3): 435–502.

