
Explain This: The Illusion Of Political Understanding

Should the United States impose unilateral sanctions on Iran for its nuclear program? Should we raise the retirement age for Social Security? Should we institute a national flat tax? How about implementing merit-based pay for teachers? Or establishing a cap-and-trade system for carbon emissions?

Plenty of people have strong opinions about complex policy issues like these. But few people have the detailed knowledge of policy or economics that a solid understanding of the issues seems to require. Where do these opinions come from, if not from careful analysis and deep understanding?

A variety of uncharitable answers come to mind. Perhaps people just adopt the attitudes of their local community or favorite pundits. Perhaps people believe what they want to believe. Or perhaps people think they do understand the issues, at least well enough to support their own opinions.

A recent paper by psychologist Phil Fernbach of the Leeds School of Business at the University of Colorado and his collaborators, published this May in Psychological Science, provides some evidence for this final option: people overestimate how well they understand the mechanics of complex policies, and this sense of understanding helps bolster politically extreme positions.

The striking implication, for which the researchers find support, is that getting people to appreciate their own ignorance can be enough to rein in strong opinions.

Fernbach pointed me to a video that amusingly illustrates the "illusion of political understanding" he documents in his paper. In the video, people have opinions about the fiscal cliff, and plenty are worried about it, despite having no idea what it is.

The video also illustrates the technique that Fernbach and colleagues used to get people to appreciate their own ignorance: asking them to explain an issue.

Here's how the study worked. People completed an online survey in which they first rated their agreement with several policies, such as sanctions on Iran and a cap-and-trade system for carbon emissions. They were then asked to estimate how well they felt they understood each policy and received an unexpected request: for two of the policies, they were told to "describe all the details" they knew about the impact of instituting that policy, "going from the first step to the last, and providing the causal connection between the steps."

In other words, people were asked to explain the nitty-gritty mechanics of how the policy would play out, an exercise that led many to lower their subsequent estimates of how well they actually understood the policy.

Thus humbled, people became more moderate in their agreement or disagreement with the policy. More surprisingly, explaining affected behavior as well: in a follow-up study, people who had explained how various policies would work were less likely to donate money to an organization that supported the position they had originally endorsed.

These findings are remarkable given that people's opinions often become more extreme, not less extreme, when people are given an opportunity to reflect on them.

For example, a classic study in social psychology found that presenting people with evidence that challenged their views on capital punishment made them more likely to endorse their original position, not less. Similarly, Fernbach and colleagues found that asking people to "write down all the reasons you have for your position" did little to moderate beliefs; only those who explained the mechanics of a policy adopted less extreme positions.

Why was explanation so effective? In a New York Times piece discussing this work, Fernbach and co-author Steven Sloman suggest that explanation acts as "a kind of revelatory trigger mechanism" that forces people to confront their lack of understanding. When you think you understand, probe further. Ask yourself "how?" and "why?" Ask others the same.

These findings have important implications for political discourse. In the same New York Times piece, Sloman and Fernbach offer the following lessons:

We voters need to be more mindful that issues are complicated and challenge ourselves to break down the policy proposals on both sides into their component parts. We have to then imagine how these ideas would work in the real world — and then make a choice: to either moderate our positions on policies we don't really understand, as research suggests we will, or try to improve our understanding. Either way, discourse would then be based on information, not illusion.


You can keep up with more of what Tania Lombrozo is thinking on Twitter: @TaniaLombrozo


Tania Lombrozo is a contributor to the NPR blog 13.7: Cosmos & Culture. She is a professor of psychology at the University of California, Berkeley, as well as an affiliate of the Department of Philosophy and a member of the Institute for Cognitive and Brain Sciences. Lombrozo directs the Concepts and Cognition Lab, where she and her students study aspects of human cognition at the intersection of philosophy and psychology, including the drive to explain and its relationship to understanding, various aspects of causal and moral reasoning and all kinds of learning.