
In the right setting, trust experts

September 1, 2012

You might have heard that Lance Armstrong is facing some doping charges. Last week, a number of organizations, such as ESPN and the LA Times, ran polls asking whether people thought he had been doping. The results were fairly evenly split: in ESPN's poll, 53% thought he had doped. Yet ask a population that knows a lot more about the subject than the average ESPN viewer (say, by visiting a cycling forum, reading a blogger's evidence-based investigative report, or listening to a physiologist and doping expert) and the opinion is much more one-sided. Why is that?

Solving problems in groups is good…

If you’ve been around this site, you know that there are a few ideas central to our problem-solving approach. Two of those are engaging others and discarding experts.

So you might be surprised by the title of this post, about trusting experts. Allow me to explain.

I’m a big fan of solving problems in groups. That’s because, usually, groups are smarter than individuals. I’ve witnessed this many times with my M&Ms counting experiment.

The M&Ms counting experiment is simple… and it isn't mine. I've stolen it from Professor Jack Treynor (as described in Surowiecki's The Wisdom of Crowds). It goes like this: take a group of, say, 15 or more people. Show them a jar full of M&Ms. Ask each of them to estimate the number of M&Ms in it and to write that estimate down on their own piece of paper. Then gather the estimates and average them. Chances are, the group's average is closer to the actual number of M&Ms in the jar than any individual estimate. I've done it many times, and it works almost every time!
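To make the averaging concrete, here is a minimal simulation sketch (my own illustration, not part of the original experiment), assuming each person's guess is the true count distorted by independent random noise:

```python
import random

# Illustrative assumption: every guess is the true count scaled by random
# noise, so some people overestimate and some underestimate.
TRUE_COUNT = 850   # hypothetical number of M&Ms in the jar
GROUP_SIZE = 20

random.seed(1)
guesses = [TRUE_COUNT * random.uniform(0.5, 1.5) for _ in range(GROUP_SIZE)]

# The group's estimate is simply the average of all individual guesses.
group_estimate = sum(guesses) / len(guesses)
group_error = abs(group_estimate - TRUE_COUNT)

# How many individuals actually beat the group average?
better_individuals = sum(1 for g in guesses if abs(g - TRUE_COUNT) < group_error)

print(f"Group average: {group_estimate:.0f} (off by {group_error:.0f})")
print(f"Individuals closer than the group: {better_individuals} out of {GROUP_SIZE}")
```

Under these assumptions, the over- and underestimates largely cancel out, which is why the average usually lands closer to the truth than most, sometimes all, of the individual guesses.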

… but sometimes it’s better to rely on just a few experts

But what if this group includes a couple of M&M-counting experts: people who have spent the last 20 years looking at jars of M&Ms and estimating their contents? People who have taught themselves ways to estimate the number of M&Ms more accurately? Surely they can do better than the average group member; they might even do better than the entire group. So you'd want to give their estimates a higher weight than the others'.

That's not always easy to do. So, short of that, you need to recognize the situations in which it's OK to gather votes without weighting them and the situations in which you're better off assigning weights. M&Ms counting? It's pretty rare to have an expert in your group, so you're probably fine with no weights.
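As a sketch of what that weighting could look like (my own illustration, with made-up names, guesses, and weights), you could simply give the experts' estimates more influence in the average:

```python
# Hypothetical estimates: two "experts" get a larger weight than the rest.
estimates = {
    "expert_1": {"guess": 870,  "weight": 5.0},
    "expert_2": {"guess": 840,  "weight": 5.0},
    "member_a": {"guess": 600,  "weight": 1.0},
    "member_b": {"guess": 1200, "weight": 1.0},
    "member_c": {"guess": 950,  "weight": 1.0},
}

# Weighted average: each guess counts in proportion to its weight.
weighted_sum = sum(e["guess"] * e["weight"] for e in estimates.values())
total_weight = sum(e["weight"] for e in estimates.values())
weighted_estimate = weighted_sum / total_weight

# Plain average for comparison, where every voice counts equally.
unweighted_estimate = sum(e["guess"] for e in estimates.values()) / len(estimates)

print(f"Unweighted average: {unweighted_estimate:.0f}")
print(f"Weighted average:   {weighted_estimate:.0f}")
```

The hard part, of course, isn't the arithmetic; it's deciding who deserves the extra weight in the first place.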

But how about, say, finding out why an airliner crashed? It would be pretty silly to poll a bunch of people who know nothing about aeronautics or forensic engineering about what happened and decide that whatever the majority thinks is the actual cause. Here, you're better off asking just a few knowledgeable experts.

All that to say that involving groups in solving problems is good, but you need to think about how you're doing it: who you're involving and how you're weighing their opinions. Diagnosing a specialized problem (say, an airliner crash or the behavior of an athlete) is not a situation where an unweighted poll will do; there, use experts. Unless, of course, you're after a goal other than identifying the root cause of your problem: maybe asking your constituents is a way to involve them, or maybe it drives traffic to your website and increases your advertising revenue. Those are fine reasons. But let's not confuse them with diagnosing the problem.

As for Armstrong, the good people on cycling forums are pretty unanimous that he doped. USADA, which has been accusing him, is due to release its supporting evidence soon, so we'll know. Do me a favor: if it turns out that he was innocent after all, just ignore this post.
