For years, the U.S. National Academy of Sciences (NAS) has released expert consensus reports confirming the reality of global warming and the safety of disposing of nuclear waste deep underground. Yet intense debate persists over these and many other issues on which the NAS has reported clear scientific conclusions. The arguments do not revolve around criticizing scientists and their evidence, so the main problem is not a lack of faith in the scientific method. Rather, people on both sides of these debates believe that the science supports their side. This phenomenon may be due to “cultural cognition.” Researchers at Yale’s Cultural Cognition Project (CCP) have been studying cultural cognition across a variety of topics, ranging from nanotechnology to gay and lesbian parenting to adjudicatory fact-finding. They aim to improve our understanding of how cultural values influence the perception of risk and the potent effect this phenomenon can have on public policy.
What is Cultural Cognition?
“Cultural cognition refers to the tendency of people to fit their perceptions of risk and related facts to their group commitments,” says Dan Kahan, the Elizabeth K. Dollard Professor of Law at Yale Law School and a CCP researcher. Kahan explains that people tend to accept behavior they and their peers deem honorable and good for society while rejecting behavior they deem dishonorable. Researchers at the CCP measure people’s “worldviews” along two dimensions: hierarchy-egalitarianism and communitarianism-individualism. This framework derives from the work of anthropologist Mary Douglas, the originator of the “cultural theory of risk.” The theory postulates that people’s perceptions of risk reflect and reinforce the combinations of values defined by the intersection of these two worldview dimensions. For example, hierarchical-individualists tend to be skeptical of environmental risk and have fewer qualms about nuclear power; they value markets and commerce and oppose restricting those activities. Egalitarian-communitarians, on the other hand, are ambivalent about markets and commerce, which they blame for social inequity, so they are more likely to accept claims of environmental risk.
The CCP identified two main psychological processes that create cultural polarization over scientific information. The first is culturally biased information searching: people prefer to look for scientific information that supports, rather than opposes, their cultural predispositions. The CCP study of nanotechnology risk perceptions in Nature Nanotechnology concluded that biased searching explains why public familiarity with nanotechnology correlates with positive views of its benefits relative to its risks. The small proportion of people aware of nanotechnology disproportionately consists of pro-technology, hierarchical-individualists who seek out information that portrays nanotechnology positively.
The second mechanism is culturally biased assimilation. When considering information from any source, people selectively credit or discredit in accordance with their cultural predispositions, becoming more polarized as they learn more. Ideally, people would evaluate all sorts of new information and update their beliefs appropriately. Under these circumstances, the increased dissemination of sound information would lead to consensus on the best available scientific evidence. However, because of cultural cognition, this does not actually happen. Instead, people with opposing values start with opposing beliefs about controversial issues. They search for and interpret information in biased, opposing patterns, ultimately resulting in persistent polarization on risk issues.
Why does scientific consensus fail to resolve cultural polarization? In fact, public disagreement in the face of scientific consensus is rare. “The number of findings that people believe and find completely unremarkable exceeds by orders of magnitude the number in which they end up deeply polarized,” Kahan remarks. A particularly illustrative example is the contrast between the vaccines for H1N1 and for HPV. Few doubt the need for and benefit of the flu vaccine and most other vaccines, yet people remain divided over the HPV vaccine. The project seeks to understand, anticipate, and treat such “pathologies,” as Kahan calls them.
A recent CCP study published in the Journal of Risk Research tackles this issue by examining perceptions of scientific consensus on three issues: anthropogenic climate change, the safety of deep geological nuclear waste storage, and the effect of permitting the concealed carrying of handguns on crime rates. These topics were chosen because expert consensus reports were available and previous work had already established that people hold polarized cultural predispositions toward them. The NAS concluded that climate change is real and human-caused and that deep geological storage is safe, while deeming the evidence on concealed-carry laws inconclusive.
In the study, after their cultural values were measured, subjects were presented with a picture of a scientist and a mock CV and asked to determine whether the scientist was an “expert” on a particular risk issue. The scientists’ credentials were kept constant: all were professors at major research institutions, with Ph.D.s from prestigious universities and membership in the National Academy of Sciences. Half the subjects were shown an excerpt from the scientist’s writing defending the “low risk” position on one of the three issues, while the other half were shown excerpts defending the “high risk” position.
The researchers saw a significant shift in response based on the relationship between the cultural predispositions of the subjects and the portrayed positions of the scientists. For example, when the scientists held the “low risk” position on climate change, hierarchical-individualists were highly likely, and egalitarian-communitarians highly unlikely, to rate the scientist an expert. The same occurred with assessments on the other issues.
This result reflects biased assimilation: subjects selectively credited or discredited the scientists’ expertise to reinforce their own cultural predispositions. If the same phenomenon occurs outside the lab, we can expect people to form opposing impressions of what “most scientists” believe on various issues. The same study also found that both hierarchical-individualists and egalitarian-communitarians tended to believe that scientific consensus was on their side on the issues studied. However, when compared with the NAS expert consensus reports, neither group was more likely to be correct about what “most expert scientists believe” across the three issues.
Future of Scientific Communication
Now that we have a better idea of the mechanisms behind cultural cognition, the next step is to solve the initial problem by ending debates that should not be happening in the first place.
Kahan says that prevention is key because issues are not born with meaning. Rather, they acquire significance when we present them. Issues can become too firmly associated with different groups, thus catalyzing polarization. The controversy over HPV erupted because a message associating the vaccine with casual sex targeted only young girls. Kahan believes that the campaign should have focused on all children, especially given that boys are also at risk, and that the message should have been worded so as not to threaten any cultural values.
Another study Kahan conducted suggests that public communication should use culturally diverse faces because extreme polarization occurs when people are given a message by someone who looks like them. Presentation of either the same message by someone of a different group or an opposite message by someone of the same group reduced the degree of polarization.
“People are generally very good at figuring out whom to trust,” Kahan says. “There is just something about these issues that are making people confused.” The underlying problem of cultural cognition needs attention so that we can deal with potentially polarizing issues before they harden. Kahan suggests that there should be a science of science communication, and that programs communicating scientific findings should have an agency cognizant of this issue.
We have already heavily invested in science research, so it only makes sense to invest in effective methods of communicating it. Scientific discoveries are ignored if people are polarized by their presentation. Given our understanding of cultural cognition, we need to work towards creating an environment in which people will not simply cling to their biases, but reach a consensus based on strong scientific evidence.
About the Author
Nancy Huynh is a sophomore Molecular, Cellular, and Developmental Biology major in Silliman. She wishes people could come to a consensus on scientific consensus already.
The author would like to thank Professor Kahan for taking the time to so thoroughly and enthusiastically explain his research as well as for his sense of humor.