It’s very easy to assume that people who don’t make smart decisions on risks are – not to put too fine a point on it – stupid (“smart” here usually meaning “the decision I think is right”). But as many researchers in the decision analysis field will attest, it’s more complicated than that. This was highlighted in a paper in Nature Climate Change this week by Dan Kahan and colleagues, where they presented evidence that greater polarization on climate change is found amongst people who display a higher comprehension of science. Although the paper focuses specifically on climate change, the findings are directly relevant to communicating and engaging on risk more generally.
Dan very kindly has allowed us to cross-post a piece he wrote on the paper on the Cultural Cognition Project blog. In addition, the Knight Science Journalism Tracker has a good roundup of media coverage of the paper this week – which has been extensive.
Our study on the effects of science literacy and numeracy on climate change risk perceptions is now out in Nature Climate Change. We find that individuals who display high comprehension of science (i.e., those who score higher in science literacy and numeracy) are in fact more culturally polarized than those who display low science comprehension.
I’ve commented before on how these data relate to the popular surmise that seeming public ambivalence toward evidence on climate change reflects the predominance of what Kahneman (in his outstanding book Thinking, Fast and Slow, among other places) calls “system 1” reasoning (emotional, unconscious, error-prone) on the part of members of the public.
Our findings don’t fit that popular hypothesis. On the contrary, they show that individuals disposed to use system 2 reasoning (conscious, reflective, deductive; a disposition measured by the numeracy scale) are even more culturally divided than those disposed to use system 1.
The interesting thing is that Kahneman himself recognized just last week that system 2 as well as system 1 might be implicated in climate change conflict.
In his Sackler Lecture (strongly recommended viewing) at the National Academy of Sciences’ Science of Science Communication Colloquium (say that three times fast), Kahneman explicitly commented on the connection between his theory of dual process reasoning and cultural cognition.
He recognized that one would expect, consistent with system 1, that ordinary members of the public would fit their perceptions of climate change risk to emotional resonances, which themselves might vary systematically across persons with diverse values.
At the same time, however, Kahneman argued against assuming system 2 would sort this disagreement out. Often “system 2 is just the spokesperson for system 1,” he said. In other words, people are likely to recruit their systematic, “slow” reasoning skills when necessary to reach the conclusion they prefer, rather than relying only on “fast” heuristic ones.
The point of the study, in fact, was to pit two plausible alternative hypotheses about cultural cognition and dual process reasoning against one another.
One attributes the influence of cultural values on risk perception to system 1, viewing cultural cognition as essentially a heuristic substitute for the ability to comprehend complicated scientific evidence. Our findings (including the absence of any overall connection between science literacy and climate change concern) undermine that view.
The other hypothesis views cultural cognition as a species of motivated reasoning that is as likely to shape system 2 as system 1. Our finding of increased polarization among the most science comprehending members of our sample lends support to this position.
In the paper, we suggest that the alliance between cultural cognition and system 2 is actually perfectly rational at an individual level. Ordinary members of the public have a much bigger stake in forming views that match those of their peers on controversial issues than they do in getting the science right on climate change: making a mistake on the science has zero impact on the risks they face (nothing they do as individual voters or consumers matters enough to make a difference), but falling out of step with their peers can result in their being shunned by people whose emotional and material support they covet.
So everyone tries to fit the evidence to positions that predominate in his or her group. And those who know a lot of science and are good at technical reasoning do an even better job.
The result is a tragedy of the risk perception commons – and it occurs whether people reason “fast” or “slow.”
Still, once we have determined through systematic thought and actual evidence that system 1 alone is not to blame, we can then turn to identifying (again, through empirical testing; creative guessing is good only for hypotheses) what sorts of communication strategies might enable culturally diverse citizens to use their reasoning in a manner that benefits them all.