Use an assertion-evidence structure

Many PowerPoint presentations are dreadful. But that doesn’t mean that yours have to be. Capturing each slide’s main idea in its tagline can go a long way toward improving the quality of your presentations.

Yale’s Edward Tufte, a preeminent specialist in data visualization, has vehemently criticized PowerPoint, noting that it “promotes a cognitive style that disrupts and trivializes evidence” (Tufte, 2003). Thinking back on the dozens of presentations I have sat through over the past few months, I agree: most did not visually support the presenter’s point as well as they could have. Yet that doesn’t mean we are doomed.

Manage your confirmation bias

Confirmation bias, the tendency to seek and interpret evidence in ways that are partial to one’s existing beliefs, is so destructive that it can render your analysis useless. And chances are, you are a victim of it.

Raymond Nickerson, a psychology professor at Tufts University, found evidence of confirmation bias in a number of disciplines. (His 1998 paper published in the Review of General Psychology is both enlightening and sobering: citing hundreds of sources, he makes a compelling argument that confirmation bias is everywhere.)
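
To make the danger concrete, here is a minimal sketch (my own illustration, not from Nickerson’s paper) of how partial evidence handling skews a conclusion: an analyst who quietly discards observations that contradict a prior belief ends up with a biased estimate even from perfectly fair data.

    import random

    random.seed(0)
    TRUE_RATE = 0.5  # the coin is actually fair

    # Flip a fair coin many times (True = heads).
    flips = [random.random() < TRUE_RATE for _ in range(10_000)]

    # An unbiased analyst records every flip.
    unbiased = sum(flips) / len(flips)

    # A biased analyst believes the coin favors heads and quietly
    # discards about half of the tails as "noise".
    kept = [f for f in flips if f or random.random() < 0.5]
    biased = sum(kept) / len(kept)

    print(f"unbiased estimate: {unbiased:.2f}")  # ~0.50
    print(f"biased estimate:   {biased:.2f}")    # ~0.67, drifting toward the belief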

Conduct the right analysis

Having identified a set of hypotheses, you need to test them. To do so, conduct analyses that can help you rule some of the hypotheses out. Sounds obvious, right?

Unfortunately, conducting the right analysis is not always (or perhaps even often) our preferred way of proceeding. In a classic article published in Science in 1964, John Platt made the case that Pasteur had a prolific career in a number of unrelated fields because he was particularly good at asking the right questions and conducting the right analyses to test his hypotheses. Pasteur shone because his competitors weren’t as skilled.
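
To illustrate what makes an analysis “right” in this sense, here is a minimal sketch (my own illustration with hypothetical numbers, not Platt’s or Pasteur’s method): a test is worth running only if its outcome is much more likely under one hypothesis than under the others, so that observing it actually rules something out.

    # Prior beliefs over two competing hypotheses.
    priors = {"H1": 0.5, "H2": 0.5}

    # Probability of a positive test result under each hypothesis
    # (hypothetical numbers). A discriminating test has very different
    # values; if both were 0.9, a positive result would rule nothing out.
    likelihoods = {"H1": 0.9, "H2": 0.1}

    # Bayes' rule: how much does a positive result separate the hypotheses?
    evidence = sum(priors[h] * likelihoods[h] for h in priors)
    posteriors = {h: priors[h] * likelihoods[h] / evidence for h in priors}

    print(posteriors)  # {'H1': 0.9, 'H2': 0.1} -- H2 is nearly ruled out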

Use inductive, deductive, and abductive logic

Most of us have heard of inductive and deductive logic. Fewer have heard of abductive logic, and yet all three are needed to solve complex problems effectively.
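
As a quick reminder of how the three differ, here is a toy sketch (my own example using Peirce’s classic bean illustration, not content from the full post):

    # Peirce's bean example: a rule, a case, and a result.
    rule = "All beans from this bag are white."
    case = "These beans are from this bag."
    result = "These beans are white."

    # Deduction: rule + case -> result (certain, given the premises).
    # Induction: case + result -> rule (a generalization; only probable).
    # Abduction: rule + result -> case (the best explanation; merely plausible).
    print("Deduction:", rule, case, "Therefore:", result)
    print("Induction:", case, result, "Therefore, probably:", rule)
    print("Abduction:", rule, result, "Therefore, plausibly:", case)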

Did Chris Froome dope to win the 2013 Tour de France? Part 3 – Conclusion

In our previous two posts (1 and 2), we examined Froome’s dominant victory in the 2013 Tour de France to see whether we could diagnose it: was it a fair victory, or did he cheat?

I think that he cheated, and this post explains how I got to that conclusion.

Our diagnostic strategy involves only a few steps:

  1. Break down the key question into its parts, and summarize the result in a set of ICE hypotheses,

Did Chris Froome dope to win the 2013 Tour de France? Part 2

Last week, we talked about how we can evaluate whether Chris Froome doped to win the 2013 Tour de France. Let’s pick the issue up where we left it.

Building the diagnostic map

With our problem-solving approach, we use a diagnostic map (or diagnostic issue tree) to break down the stem question into its various components before summarizing those in a set of independent and collectively exhaustive (ICE) hypotheses. In this case, we came up with three hypotheses:
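
As a rough picture of what such a map looks like in code, here is a minimal sketch (the tree structure is the point; the questions below are placeholders of my own, not the post’s actual three hypotheses):

    from dataclasses import dataclass, field

    @dataclass
    class Node:
        question: str
        children: list["Node"] = field(default_factory=list)

        def leaves(self):
            # The leaves of the tree are the candidate hypotheses.
            if not self.children:
                return [self]
            return [leaf for child in self.children for leaf in child.leaves()]

    # Stem question at the root, broken down into components.
    tree = Node("Did Froome dope to win the 2013 Tour?", [
        Node("Can legitimate factors explain his performance?", [
            Node("Superior training and equipment?"),
            Node("Exceptional physiology?"),
        ]),
        Node("Is there direct evidence of doping?"),
    ])

    for leaf in tree.leaves():
        print("candidate hypothesis:", leaf.question)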

Did Chris Froome dope to win the 2013 Tour de France? Part 1

If you go on any cycling-related forum these days, you will find a raging controversy: Did Chris Froome dope to win the 2013 Tour de France?

The arguments on both sides echo those we’ve heard in every doping case. Just as common is how the conversation turns argumentative in the wrong way: “He didn’t dope because X,” “Yes he did because Y,” “This is ridiculous, you’re an idiot,” and so on, until the whole exchange becomes an exercise in name-calling with little substance about the original case.

Look for confirming evidence too

A few weeks ago, we talked about the importance of looking for disconfirming evidence when testing hypotheses. That remains vital, but in some situations supporting evidence may actually be what helps you reach a solid conclusion. So don’t discard it.

In some settings, disconfirming evidence is inconclusive

Bazerman and Neale, whom we discussed in our previous post, reported Wason’s finding that disconfirming evidence is the critical kind. Wason’s work has been hugely influential over the past half-century, in part because it appeared at roughly the same time as Popper’s approach to evidence. In the 1980s, Klayman and Ha used it to shed light on what “looking for disconfirming evidence” really means:
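
Klayman and Ha’s insight is that which kind of test is informative depends on how your hypothesis relates to the true rule. Here is a minimal sketch (my own reconstruction in the spirit of Wason’s 2-4-6 task, not code from the post): when the hypothesis is broader than the truth, positive tests expose the error while negative tests rule nothing out.

    def hypothesis(t):
        # The tester's guess: any strictly increasing triple.
        return t[0] < t[1] < t[2]

    def true_rule(t):
        # The actual rule, unknown to the tester: even numbers rising by 2.
        return all(x % 2 == 0 for x in t) and t[1] - t[0] == 2 and t[2] - t[1] == 2

    positive_tests = [(2, 4, 6), (1, 3, 5), (1, 2, 10)]  # all fit the guess
    negative_tests = [(6, 4, 2), (5, 5, 5)]              # deliberately violate it

    for t in positive_tests:
        print(t, "guess: yes | rule:", true_rule(t))
    # (1, 3, 5) and (1, 2, 10) fit the guess but fail the rule:
    # these positive probes are what expose the too-broad hypothesis.

    for t in negative_tests:
        print(t, "guess: no  | rule:", true_rule(t))
    # Both get "no" from guess and rule alike, so they rule nothing out.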

Use a MECE structure but let your ideas be ICE

We’ve talked a few times about being mutually exclusive and collectively exhaustive (or MECE) in your thinking:

First, we talked about how MECE thinking is useful because it helps ensure that your approach has no overlaps (ME) and no gaps (CE). Then we looked at ways to be more MECE in your thinking by being CEME. And we also addressed the fundamental issue of MECE thinking in problem solving: your true intent is not to find ideas that are strictly mutually exclusive but rather independent, since mutual exclusivity requires that one idea preclude the others. So we introduced the idea of being ICE (independent and collectively exhaustive) instead. All of this can be confusing, so let’s see if we can clarify the whole thing.
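
To make the distinction concrete, here is a minimal sketch (my own illustration, not from the posts) of the two MECE properties applied to buckets of items. Sets of items can genuinely be mutually exclusive; competing explanations usually cannot, which is why ICE asks only for independence.

    from itertools import combinations

    universe = {"road", "track", "mountain", "cyclocross"}

    buckets = [
        {"road", "track"},
        {"mountain"},
        {"cyclocross"},
    ]

    # ME: no two buckets overlap.
    mutually_exclusive = all(a.isdisjoint(b) for a, b in combinations(buckets, 2))

    # CE: together, the buckets cover the whole universe.
    collectively_exhaustive = set().union(*buckets) == universe

    print("ME:", mutually_exclusive, "| CE:", collectively_exhaustive)  # True True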

Don’t trust your intuition

A few months ago, we talked about how intuition can be misleading. In brief, we all tend to trust our intuition more than we should. All of us. Yes, that includes you.

Not convinced?

Here is a little exercise to see how well you are doing (again, this is stolen from Bazerman and Neale, pp. 56–58):
