Cultural Cognition Project Study Examines Why “Scientific Consensus” Fails to Create Public Consensus
Despite the accumulation and widespread reporting of scientific evidence, the public remains deeply divided on climate change and other issues on which there is consensus or near consensus among scientists. Why?
A study of a representative sample of 1,500 members of the American public, conducted by researchers affiliated with the Cultural Cognition Project at Yale Law School and the Center for Applied Social Research at the University of Oklahoma, suggests a novel answer: people with different cultural values tend to form opposing impressions about what most scientists believe on controversial issues. The study was published online September 13 in the Journal of Risk Research.
Funded by the National Science Foundation and the Oscar M. Ruebhausen Fund at Yale Law School, the study featured an experiment aimed at determining how members of the public form impressions of “expert scientific consensus.” In the experiment, subjects were shown a photograph of a fictional author of a book on climate change, gun control, or nuclear power, along with his CV, which showed that he had received a Ph.D. in a pertinent field from a major university, was on the faculty at another, and was a member of the National Academy of Sciences. Subjects were then asked whether they perceived the individual to be a “knowledgeable and trustworthy” expert.
The answer, the researchers found, depended very heavily on whether the position the author took matched the subjects’ own cultural predispositions on the issue in question.
“We know from previous research,” said Dan Kahan, Elizabeth K. Dollard Professor of Law at Yale Law School and one of the researchers, “that people with individualistic values, who have a strong attachment to commerce and industry, tend to be skeptical of claimed environmental risks.”
Accordingly, 86% of persons with those values agreed that the author was an expert when he was depicted as concluding that scientific evidence on climate change is inconclusive, but only 23% saw him as an expert when he was depicted as concluding that scientific evidence clearly establishes the existence of climate change, the human contribution to it, and the serious risks it poses. Among individuals with egalitarian values, Kahan said, who are inclined to credit environmental risks, the pattern was reversed.
The fit between a university scientist’s conclusions and subjects’ predispositions also strongly influenced subjects’ perceptions of whether a scientist was an “expert” on the other issues. Subjects with egalitarian values, who are predisposed to believe that nuclear power is dangerous, were substantially more likely to deem a university scientist in that field an “expert” when he was depicted as reaching that same conclusion.
In addition to the experiment, the study reported survey results showing that the American public is culturally divided in its perceptions of what most scientists believe about climate change and nuclear power, as well as about the risks of laws permitting citizens to carry concealed handguns, a matter that a recent National Academy of Sciences “expert consensus” report concluded cannot confidently be resolved on the basis of existing evidence.
“No cultural group is more likely than the other to be ‘getting it right’ across issues,” said Hank Jenkins-Smith, another study researcher and Associate Director of the Center for Applied Social Research at the University of Oklahoma. “National Academy of Sciences ‘expert consensus reports’ have consistently concluded both that climate change is a real and serious hazard—the position individualists tend to dispute—and that deep geologic isolation is a feasible and safe option for nuclear waste storage—the view egalitarians disagree with.”
“The problem isn't simply that one side ‘believes’ science and another side ‘distrusts’ it,” said Kahan. “It's as if people are doing their own intuitive surveys of scientific opinion—keeping track of what ‘experts’ they encounter believe about controversial issues—but only reliably counting as ‘experts’ those scientists who take positions those people already are inclined to believe.”
In that situation, said Kahan, “even people who agree scientific consensus should guide policy on climate change and other issues will end up divided along cultural lines, because they will form opposing perceptions of what scientific consensus is.”
“This problem won't be fixed by simply trying to increase trust in scientists or awareness of what scientists believe,” said Donald Braman, a law professor at George Washington University and the third study researcher. “To make sure people form unbiased perceptions of what scientists are discovering, it is necessary to use communication strategies that reduce the likelihood that citizens of diverse values will find scientific findings threatening to their cultural commitments.”
Citation: Cultural Cognition of Scientific Consensus, Journal of Risk Research, DOI: 10.1080/13669877.2010.511246.
The Cultural Cognition Project is an interdisciplinary group of scholars interested in studying how cultural values shape public risk perceptions and related policy beliefs. For more information, visit www.culturalcognition.net.
The Center for Applied Social Research, located at the University of Oklahoma, is an organization focused on applying cutting-edge concepts in the social sciences to complex real-world issues. For more information, visit http://casr.ou.edu.