Embrace the messiness of evidence

Oct 12, 2019

"The whole problem with the world is that fools and fanatics are always so certain of themselves, and wiser people so full of doubts.”

Bertrand Russell


I don’t know that I agree with Russell that this is the whole problem with the world, but I agree that it’s a critical one. Very few things are certain, yet we believe those who make things appear certain. That tendency to trust whoever speaks with confidence is dangerous, because the evidence shows that confidence and accuracy are essentially uncorrelated, even among experts (see Arkes and Kajdasz).


And that’s not just others: we all often speak with confidence, whether or not that confidence is warranted. As audiences, we take that confidence as a sign of expertise. “Well, Bob seemed pretty sure about this, and Bob wouldn’t say that if he didn’t have good reasons, whatever those may be, so Bob must be right.”


Trained scientists don’t speak with certainty


Trained scientists are a rare breed for various reasons. Not only do they have stylish white coats, they also predominantly speak with a nuance that carefully reflects their doubts and the limits of their understanding.


For instance, aside from pure mathematics and a handful of other disciplines, trained scientists do not talk about a body of evidence proving one thing or another. This is not just because causation is difficult to establish, but because “proving” itself is ambiguous: what does it mean to prove something?


In the legal world, there are various levels of proof (or evidentiary standards). For instance, the U.S. legal system includes preponderance of the evidence on the weak side, clear and convincing evidence somewhere in the middle, and beyond a reasonable doubt on the strong side. There are more levels in between, and even these aren’t fully clear. For example, the strongest standard requires that a juror be morally certain that a person is guilty. How do we quantify this? Do we say that we are 95 percent sure, 99 percent sure, 99.99 percent sure?


Scientists, therefore, stay away from proving and instead talk about evidence suggesting, supporting, or indicating. These more nuanced assertions reflect their less-than-absolute certainty in their claims.


We should treat the ability to make nuanced arguments as a sign of intellectual honesty


If you have an ideology, you formulate your answer before you get the evidence. That’s not ideal. Instead, you should keep an open mind and follow the evidence. This means that you’ll need to change your mind. For some fifty years, eating sugar was better for you than eating fat, until it wasn’t. As for climate change, if it were real, why did scientists first call it global warming?


An evolving body of evidence can make your head spin. That’s because evidence often is incomplete (no matter how much we have, it’s not enough to give us certainty), inconclusive (it’s consistent with more than one hypothesis), ambiguous (it’s unclear what the evidence is telling us), dissonant (some of it supports a given hypothesis while some of it opposes it), and of uncertain credibility (it’s unclear whether it can be trusted).


But this messiness of evidence is just a fact of life, one that we ought to accept. Instead of shooting for absolute certainty and holding a black-and-white view of the world, we’d be well advised to be more flexible. As Tim van Gelder phrases it, we ought to go from Booleanism to Bayesianism. Concretely, this means starting with whatever viewpoint we think is most likely true and updating our belief as we uncover new evidence. Instead of taking the messiness of evidence as a bug, we ought to treat it as a feature: it is telling us that we shouldn’t be too sure about what we think we know.
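
To make the move from Booleanism to Bayesianism concrete, here is a minimal sketch of a Bayesian update in Python. The prior of 0.70 and the likelihood values are made-up numbers chosen purely for illustration; the point is the mechanics of revising a belief as evidence arrives, not the specific figures.

```python
def bayesian_update(prior, likelihood_if_true, likelihood_if_false):
    """Return P(hypothesis | evidence) via Bayes' rule.

    prior: P(hypothesis) before seeing this piece of evidence
    likelihood_if_true: P(evidence | hypothesis)
    likelihood_if_false: P(evidence | not hypothesis)
    """
    numerator = likelihood_if_true * prior
    denominator = numerator + likelihood_if_false * (1 - prior)
    return numerator / denominator

# Start with whatever viewpoint we think is most likely true...
belief = 0.70  # illustrative prior, not a real estimate

# ...and revise it as each (hypothetical) piece of evidence comes in.
for lik_true, lik_false in [(0.8, 0.4), (0.3, 0.6), (0.9, 0.2)]:
    belief = bayesian_update(belief, lik_true, lik_false)
    print(f"Updated belief: {belief:.2f}")
```

Notice that the belief shifts with each piece of evidence but never snaps to exactly 0 or 1, which is precisely the contrast with a Boolean true/false view.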



The bottom line is that, as audiences, we shouldn’t be wary of people who qualify their claims. On the contrary, we should be wary of people who speak with certainty. And if we’re the ones making the claims, we ought to calibrate what we say to the evidence we have.


References


  • Arkes, H. R. and Kajdasz, J. (2011). “Intuitive theories of behavior.” In B. Fischhoff and C. Chauvin (eds.), Intelligence Analysis: Behavioral and Social Scientific Foundations. The National Academies Press: 143–168 (pp. 147–148).



  • Cook, John, et al. “Quantifying the consensus on anthropogenic global warming in the scientific literature.” May 15, 2013. Accessed October 6, 2019.


  • Kolata, Gina. “Eat Less Red Meat, Scientists Said. Now Some Believe That Was Bad Advice.” The New York Times, September 30, 2019.


  • O’Connor, Anahad. “How the Sugar Industry Shifted Blame to Fat.” The New York Times, September 12, 2016.


  • Parker-Pope, Tara and O’Connor, Anahad. “Scientist Who Discredited Meat Guidelines Didn’t Report Past Food Industry Ties.” The New York Times, October 4, 2019.


  • Tecuci, G., et al. (2014). “Computational approach and cognitive assistant for evidence-based reasoning in intelligence analysis.” International Journal of Intelligent Defence Support Systems 5(2): 146–172.