Is Weak Evidence Better Than No Evidence?

In my post last week, I wrote that "weak evidence is still better than no evidence." The statement prompted some thoughtful comments from readers:

I find that weak evidence is often worse than no evidence. (Chris Harlan)

Is weak evidence a positive or a negative? Does weak evidence accomplish anything? My glass is half full on this point. (Bobbi Wilson)

Weak evidence often does something, but what that something is may be quite destructive, from the invasion of nations to needless surgeries. A lot of damage can be done by a few misplaced assumptions based on something that appears to be there, but isn't. (Chris Harlan)

The kind of situation I had in mind was one in which the evidence points to one answer over another, just not very strongly.

For example, suppose you have two identical-looking bags full of candy. You've filled one bag with 45 caramels and 55 red candies. You've filled the other with 55 caramels and 45 green candies. But you've forgotten which bag is which, so you decide to sample a candy at random from one of the bags to try to figure it out.

If you happen to draw a red candy or a green candy, you're all set: only one bag contains that color, so you know exactly which bag is which.

But what if you draw a caramel? Following normative rules of statistical inference (in this case, Bayes' Rule), you should update your beliefs in light of the new evidence (i.e., the sampled caramel) and conclude that you're slightly more likely to have drawn from the bag with 55 caramels and 45 green candies. If you want to get quantitative, the probability that you picked that bag is 55 percent, just above the 50-50 odds you started with.
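To make the arithmetic concrete, here is a minimal sketch of that Bayesian update in Python (the bag contents come from the example above; the function name and structure are just for illustration):

```python
# A minimal sketch of the Bayesian update described above.
# Bag A: 45 caramels, 55 red candies. Bag B: 55 caramels, 45 green candies.

def posterior_bag_b(prior_b, p_caramel_given_b, p_caramel_given_a):
    """P(bag B | drew a caramel), by Bayes' Rule."""
    prior_a = 1 - prior_b
    numerator = p_caramel_given_b * prior_b
    return numerator / (numerator + p_caramel_given_a * prior_a)

# Before drawing, the two bags are equally likely (prior = 0.5).
print(posterior_bag_b(0.5, 0.55, 0.45))  # 0.55 -- just above chance
```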

So in this case, it seems that weak evidence is better than no evidence — you should shift your beliefs only slightly, but doing so makes them more likely to accurately describe the world.

Once human psychology enters the mix, however, weak evidence can have surprising effects. It isn't necessarily the case that weak evidence is taken to be stronger than it should be. In fact, weak evidence for one position sometimes leads people to favor the opposite position more strongly than they should.

Consider two examples from recent research in psychology. The first, called the Faint Praise Effect, is likely familiar: In a letter of reference for a potential employee, you learn only that he has great taste in ties. When you ask how a friend's blind date went, she notes only that her companion was punctual.

Even though good taste in ties and punctuality are positive characteristics — ones that should increase your estimation of the person under consideration — these statements can have the opposite effect, leading you to conclude that the person isn't all that competent or appealing after all.

A 2013 paper by Adam Harris, Adam Corner and Ulrike Hahn finds that people's judgments follow this pattern, but it also argues that the pattern isn't irrational. If you have reason to believe that someone is knowledgeable about another person's competence or appeal, and that providing this information is relevant and appropriate in a given context, then failing to receive a positive endorsement (and instead learning something irrelevant or trivial) suggests a less-than-favorable assessment along the dimensions that matter most.

Now let's turn to the Weak Evidence Effect.

It's October of 2010 and you're asked how likely it is that Republicans will win control of the House of Representatives in the upcoming mid-term elections. You receive the following piece of (weak) evidence supporting a Republican takeover:

Ryan Frazier, a Republican candidate in Colorado's hotly contested 7th District House race, won the endorsement of The Denver Post, Colorado's largest newspaper.

Will this make you more or less likely to believe that the Republicans will win control of the House?

In a 2011 paper by Philip Fernbach, Adam Darlow and Steven Sloman, participants were asked questions like these, either with or without receiving the weak piece of evidence. Although an independent group of participants agreed that the weak evidence supported the possibility of a Republican win, those in the study who received the weak evidence thought a Republican win less likely (a rating of 56.3 out of 100) than those who didn't receive it (a rating of 64.8 out of 100).

Based on the findings from this study and others, the researchers suggest that highlighting a single factor (e.g., a newspaper endorsement) that only weakly predicts some outcome (e.g., a Republican takeover) leads people to focus too much on that factor at the expense of all of the others that could contribute to the outcome in question. By failing to take alternative factors into account, participants who receive weak evidence underestimate the probability of the outcome being evaluated.
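One way to see how that neglect could deflate an estimate: if several independent factors could each bring about the outcome, the outcome fails only when every factor fails. The toy calculation below (my own illustration with made-up numbers, not the model from the paper) shows that judging from the single highlighted weak factor alone yields a far lower probability than combining it with the unmentioned alternatives:

```python
# A toy illustration (invented numbers, not Fernbach et al.'s model):
# if several independent factors could each produce the outcome, the
# outcome fails only when all of them fail.

def outcome_probability(factor_probs):
    p_all_fail = 1.0
    for p in factor_probs:
        p_all_fail *= 1 - p
    return 1 - p_all_fail

highlighted = 0.10                 # one weak factor, e.g. an endorsement
alternatives = [0.30, 0.25, 0.20]  # other, unmentioned routes to the outcome

print(outcome_probability([highlighted] + alternatives))  # ~0.62
print(outcome_probability([highlighted]))                 # 0.10
```

If attention collapses onto the highlighted factor, the estimate collapses with it, a pattern consistent with the lower ratings the researchers observed.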

The Faint Praise Effect and the Weak Evidence Effect are two examples among many. They reveal that sometimes weak evidence is worse than no evidence, at least when it comes to human judgments and the complex conditions under which they typically occur.


You can keep up with more of what Tania Lombrozo is thinking on Twitter: @TaniaLombrozo

Copyright 2021 NPR. To see more, visit https://www.npr.org.

Tania Lombrozo is a contributor to the NPR blog 13.7: Cosmos & Culture. She is a professor of psychology at the University of California, Berkeley, as well as an affiliate of the Department of Philosophy and a member of the Institute for Cognitive and Brain Sciences. Lombrozo directs the Concepts and Cognition Lab, where she and her students study aspects of human cognition at the intersection of philosophy and psychology, including the drive to explain and its relationship to understanding, various aspects of causal and moral reasoning and all kinds of learning.