Focus your analysis on what matters

Jan 5, 2014

Having identified a set of hypotheses, whether diagnostic ones (the potential root causes of your problem) or solution ones (the potential options), you need to test them. That means conducting analyses that can help you rule out some of the hypotheses. Sounds obvious, right?

Conducting a good analysis: easier said than done

Unfortunately, conducting a good analysis is not always (or perhaps even hardly ever) our preferred way of proceeding. In a classic article published in 1964 in Science, John Platt made the case that Pasteur was able to have a prolific career in a number of unrelated fields because he was particularly good at asking good questions and conducting appropriate analyses to test his hypotheses. Pasteur shone because his competitors weren't as skilled.

To be fair, Platt didn't include empirical evidence to support this assertion, as pointed out by O'Donohue and Buchanan,* and he stopped short of establishing a causal link between considering various hypotheses and being a more productive scientist. What is beyond doubt, though, is that we all suffer from various biases, including confirmation bias: we tend to search for confirming evidence when we ought to search for (primarily) disconfirming evidence.

In other words, we are often unable to focus on the analysis that is most relevant to help us update our thinking. I have seen it in managerial settings in both the corporate and the non-profit worlds: facing a complex issue, we easily resort to a more-evidence-is-better mindset and gather as much information as we can that relates to our hypothesis, without first asking whether this evidence would help us update our thinking about it. The problem is even more prevalent now that information is more accessible than ever; we end up spending significant time and resources interpreting evidence left and right before realizing that it doesn't really help us reassess our hypothesis.

To recalibrate, start by identifying which information is needed

Instead, we need to reverse that process. Once you have developed a hypothesis, define the specific evidence needed to test it. This is similar to our problem-framing approach using an SCQ sequence, where we weed out all the superfluous material from our situation, complication, and key question. Then gather that information. If it isn't available, decide whether using a proxy is judicious, but always refer back to your original test to ensure that you have not drifted away from relevance.

* Thanks to my colleague Phil Rosenzweig, author of the excellent The Halo Effect, for pointing me to that paper.

References:

Arkes, H. R. and J. Kajdasz (2011). "Intuitive theories of behavior." In B. Fischhoff and C. Chauvin (eds.), Intelligence Analysis: Behavioral and Social Scientific Foundations. The National Academies Press: 143-168.

Chevallier, A. (2016). Strategic Thinking in Complex Problem Solving. Oxford, UK: Oxford University Press, pp. 14-17.

O'Donohue, W. and J. A. Buchanan (2001). "The weaknesses of strong inference." Behavior and Philosophy: 1-20.

Platt, J. R. (1964). "Strong inference." Science 146(3642): 347-353.

Rosenzweig, P. (2007). The Halo Effect ... and the Eight Other Business Delusions That Deceive Managers. Free Press.