The task is simple enough: I’ll give you three numbers, and you guess the rule I have in mind that the set follows. Ready?

2 – 4 – 6

This is the initial step of Wason’s famous 1960 experiment. The next step was for the experimenter to ask the subjects to write down sets of three numbers that complied with the rule they had guessed. Then, the experimenter told them whether their sequences conformed to the rule, and if not, invited them to try again. Once they were confident that they had guessed the rule, the subjects announced it.

Only six of the original 29 subjects gave the correct rule at their first announcement. Subjects tended to identify rules that were too narrow. They also tended to propose only sequences that were consistent with whichever rule they had identified. For instance, if a subject thought that the numbers were increasing by two, they proposed confirmatory sequences—4–6–8 or 10–12–14—as opposed to disconfirmatory ones (say, 2–3–4 or 7–54–5).
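The logic of why confirmatory testing fails here can be sketched in a few lines of code. The rule names and the specific triples below are illustrative (except the true rule, “ascending numbers,” which Wason reveals at the end): a triple that fits the subject’s narrow guess also fits the broader true rule, so it can never tell the two apart.

```python
def true_rule(a, b, c):
    """Wason's actual rule: the numbers are ascending."""
    return a < b < c

def subject_hypothesis(a, b, c):
    """A typical subject's narrower guess: increasing by two."""
    return b - a == 2 and c - b == 2

confirmatory = [(4, 6, 8), (10, 12, 14)]   # chosen to fit the subject's guess
disconfirmatory = [(2, 3, 4), (1, 8, 22)]  # chosen to violate it on purpose

for triple in confirmatory:
    # Both rules say "yes," so these tests teach the subject nothing.
    assert subject_hypothesis(*triple) and true_rule(*triple)

for triple in disconfirmatory:
    # The experimenter still answers "yes": the subject's guess was too narrow,
    # and only a disconfirmatory test could have revealed it.
    assert not subject_hypothesis(*triple) and true_rule(*triple)
```

Only the second loop carries information: a “yes” on a sequence that breaks your hypothesized rule proves the hypothesis wrong.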

Meet confirmation bias

Confirmation bias—defined as seeking and interpreting evidence partially so as to support one’s beliefs—is so destructive that it can render your analysis useless. And chances are, you are a victim of it, as we all are.

Tufts University’s Raymond Nickerson found evidence of confirmation bias in numerous disciplines. (His 1998 paper is both enlightening and sobering: citing hundreds of sources, Nickerson makes a compelling argument that confirmation bias is everywhere.)

The bias may occur in various ways (see Brest and Hamilton Krieger, pp. 279–280):

  • Searching for evidence that supports your hypothesis, for instance by not seeking evidence that opposes your favored hypothesis
  • Interpreting selectively what the evidence says about your hypothesis
  • Failing to sufficiently update your thinking in light of new evidence
  • Failing to generate alternative hypotheses

As Klayman and Ha point out, “people tend to discredit or reinterpret information counter to a hypothesis they hold.” Dawson and colleagues note that this combines with motivated reasoning, whereby we apply different standards of evidence to propositions we like than to those we dislike. That is, when evaluating a proposition we agree with, we tend to ask, “Can I believe this?” whereas when evaluating a threatening proposition, we tend to ask, “Must I believe this?”

Reduce your bias

The bad news is that confirmation bias is extremely difficult to overcome (see, for instance, Dunbar and Klahr). The good news is that … no, that’s it, there’s no good news. Only bad news: We’re deeply flawed and knowing it doesn’t improve things much.

Having said that, changing our habits may help us manage our biases somewhat.

Enlist others. We tend to miss our own blind spots, but we’re good at detecting others’ biases. Yep, it’s not just you. 🙂 So, enlist others, particularly others who have a different point of view (see Brest and Hamilton Krieger).

Consider several hypotheses. The advice to first-time home buyers is simple: don’t fall in love with one, fall in love with three. Chamberlin and Platt advocated the same approach in the scientific method, noting that considering several hypotheses may help you not fall blindly in love with your ideas (however, they provided little empirical evidence to support their claim).

Focus on falsifying hypotheses. Hypotheses are strengthened when they survive the concerted attempts to disprove them by highly qualified people. Yet, as Bazerman and Moore note, “people naturally tend to seek information that confirms their expectations and hypotheses, even when disconfirming or falsifying information is more useful” [p. 29].

Consider the opposite. Seriously considering the possibility that the opposite of what you believe might be true helps.

Question your confidence. We all tend to be too confident in our thinking (see here for more).

Adopt a Bayesian approach. In the end, you need to update your thinking in light of new evidence; that is, be Bayesian.
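A Bayesian update is just Bayes’ rule applied to your degree of belief. As a toy illustration (the numbers below are made up, not from any of the cited studies): suppose you are 80% sure your hypothesis H is right, and you then observe evidence that is only 30% likely if H is true but 90% likely if H is false.

```python
def bayes_update(prior, p_evidence_given_h, p_evidence_given_not_h):
    """Return P(H | evidence) via Bayes' rule."""
    numerator = p_evidence_given_h * prior
    denominator = numerator + p_evidence_given_not_h * (1 - prior)
    return numerator / denominator

# Illustrative numbers: an 80% prior, with evidence that favors not-H.
posterior = bayes_update(prior=0.80,
                         p_evidence_given_h=0.30,
                         p_evidence_given_not_h=0.90)
print(round(posterior, 2))  # 0.57: the 80% belief should drop, not hold firm
```

The point is not the arithmetic but the discipline: the evidence forces a revision from 0.80 down to about 0.57, whereas confirmation bias would have you keep the 0.80 and explain the evidence away.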

This list is by no means exhaustive, and de-biasing is a recurring theme on this site. If you have to take away just one idea, remember the advice of Nobel laureate Richard Feynman to the 1974 graduating class of Caltech: “The first principle is that you must not fool yourself—and you are the easiest person to fool.”

Oh, and by the way, Wason’s original rule in the 2–4–6 experiment was “ascending numbers.”

References

Bazerman, M. H. and D. A. Moore (2008). Judgment in managerial decision making, Wiley.

Brest, P. and L. H. Krieger (2010). Problem solving, decision making, and professional judgment: A guide for lawyers and policymakers, Oxford University Press, pp. 277–289, 618–619.

Chamberlin, T. C. (1965). “The method of multiple working hypotheses.” Science 148(3671): 754-759.

Chevallier, A. (2016). Strategic Thinking in Complex Problem Solving. Oxford, UK, Oxford University Press, pp. 104–109.

Dawson, E., et al. (2002). “Motivated Reasoning and Performance on the Wason Selection Task.” Personality and Social Psychology Bulletin 28(10): 1379-1387.

Dunbar, K. N. and D. Klahr (2012). Scientific Thinking and Reasoning. The Oxford Handbook of Thinking and Reasoning. K. J. Holyoak and R. G. Morrison. New York, Oxford University Press: 701–718.

Feynman, R. P. (1998). “Cargo Cult Science.” Engineering and Science 37(7): 10-13.

Kahneman, D. (2011). Thinking, fast and slow. New York, Farrar, Straus and Giroux.

Klayman, J. and Y.-W. Ha (1987). “Confirmation, disconfirmation, and information in hypothesis testing.” Psychological review 94(2): 211.

Lord, C. G., et al. (1984). “Considering the opposite: a corrective strategy for social judgment.” Journal of personality and social psychology 47(6): 1231.

Morewedge, C. K., et al. (2015). “Debiasing decisions: Improved decision making with a single training intervention.” Policy Insights from the Behavioral and Brain Sciences 2(1): 129-140.

Nickerson, R. S. (1998). “Confirmation bias: a ubiquitous phenomenon in many guises.” Review of General Psychology 2(2): 175.

O’Donohue, W. and J. A. Buchanan (2001). “The weaknesses of strong inference.” Behavior and Philosophy: 1-20.

Platt, J. R. (1964). “Strong inference.” Science 146(3642): 347-353.

Schulz-Hardt, S., et al. (2000). “Biased information search in group decision making.” Journal of personality and social psychology 78(4): 655.

Soll, J. B., et al. (2015). A user’s guide to debiasing. The Wiley Blackwell handbook of judgment and decision making. G. Keren and G. Wu.

Wason, P. C. (1960). “On the failure to eliminate hypotheses in a conceptual task.” Quarterly journal of experimental psychology 12(3): 129-140.

Image credit: TeroVesalainen.
