People who trust science are more likely to believe and disseminate false claims that contain scientific references than are people who don’t trust science, a study finds.
Reminding people of the value of critical evaluation reduces belief in false claims, but reminding them of the value of trusting science does not.
“We conclude that trust in science, although desirable in many ways, makes people vulnerable to pseudoscience,” the researchers write. “These findings have implications for science broadly and the application of psychological science to curbing misinformation during the COVID-19 pandemic.”
“People are susceptible to being deceived by the trappings of science,” says coauthor Dolores Albarracín, professor at the University of Pennsylvania. She says, for example, that COVID-19 vaccines have been the target of false claims that they contain pollutants or other dangerous ingredients. “It’s deception but it’s pretending to be scientific. So people who are taught to trust science and normally do trust science can be fooled as well.”
Albarracín, a social psychologist and director of the Science of Science Communication Division of the Annenberg Public Policy Center of the University of Pennsylvania, says, “What we need are people who also can be critical of information. A critical mindset can make you less gullible and make you less likely to believe in conspiracy theories.”
The study, which Albarracín and colleagues conducted when she was in her former position at the University of Illinois at Urbana-Champaign, appears in the Journal of Experimental Social Psychology.
The ‘critical evaluation’ mindset
For the study, the researchers conducted four preregistered experiments with online participants. They created two fictitious stories—one about a virus created as a bioweapon, mirroring claims about the novel coronavirus that causes COVID-19, and the other about genetically modified organisms (GMOs) causing tumors, an unsubstantiated conspiracy theory.
The invented stories contained either references to scientific concepts and scientists who claimed to have done research on the topic, or descriptions attributed to people identified as activists. Participants in each experiment, ranging from 382 to 605 people, were randomly assigned to read either the scientific or the non-scientific version of the stories.
The researchers found that among people with little trust in science, the presence of scientific content in a story had no significant effect. But people with higher levels of trust in science were more likely to believe the stories with scientific content—and more likely to disseminate them.
In the fourth experiment, participants were prompted to have either a “trust in science” or a “critical evaluation” mindset. Those primed to have a critical mindset were less likely to believe the stories, whether or not the stories used seemingly scientific references. “The critical mindset makes you less gullible, regardless of the information type,” Albarracín says.
“People need to understand how science operates and how science arrives at its conclusions,” Albarracín adds. “People can be taught what sources of information to trust and how to validate that information. It’s not just a case of trusting science, but having the ability to be more critical and understand how to double-check what information is really about.”
Trust in science with ‘healthy skepticism’
The study’s lead author, postdoctoral researcher Thomas C. O’Brien of the University of Illinois at Urbana-Champaign, adds, “Although trust in science has important societal benefits, it is not a panacea that will protect people against misinformation. Spreaders of misinformation commonly reference science. Science communication cannot simply urge people to trust anything that references science, and instead should encourage people to learn about scientific methods and ways to critically engage with issues that involve scientific content.”
The researchers conclude: “Although cynicism of science could have disastrous impacts, our results suggest that advocacy for trusting science must go beyond scientific labels, to focus on specific issues, critical evaluation, and the presence of consensus among several scientists…
“Fostering trust in the ‘healthy skepticism’ inherent to the scientific process may also be a critical element of protecting against misinformation… Empowering people with knowledge about the scientific validation process and the motivation to be critical and curious may give audiences the resources they need to dismiss fringe but dangerous pseudoscience.”
The National Science Foundation, the National Institute on Drug Abuse, and the National Institutes of Health supported the work.