Bridging the gaps in what people think they know

Strong, misinformed views on science can’t be corrected with simple facts. Not when people are ‘confident’ in what they think they know.

Climate scientists and policy advocates often lament the disdain and denial they encounter in science communication efforts meant to engage the public. It may come as no surprise that, according to British researchers, people hold strong negative attitudes about science, in spite of relevant evidence, because of overconfidence in what they think they know.

“Those whose self-assessed understanding exceeds their factual knowledge are more prone to negative appraisals of science,” say the authors of a study published this week in the journal PLOS Biology.

The connection between the two suggests that it's not so much the unknown, but rather fear, disgust, or distrust of what people think they know, that may drive them to hold their positions on climate change, vaccine safety and efficacy, or the benefits and risks of genetically modified foods. Nor can the discrepancy be attributed entirely to Dunning–Kruger-type effects, which suggest that those who are least competent also lack the ability to recognize their limitations.

The British scientists, including experts from the University of Oxford, the University of Aberdeen, the University of Bath and the Genetics Society in London, based their study on surveys of more than 2,000 people across the United Kingdom.

The researchers asked questions about how much a person believed they understood about a topic or finding, in this case genetic science. Participants self-rated their understanding: those who were either strongly supportive or strongly anti-science on an issue had high levels of trust in their own understanding, while those who were more neutral on the science rated themselves as less knowledgeable about it.

“As one tends to higher degrees of subjective understanding, so attitudinal positions become more extreme,” the authors note. “This supports the hypothesis that people with more extreme attitudes are more confident that they understand the science.”

The researchers say it makes sense that someone with a strong opinion needs to believe they have command of the correct facts to support it. Well-informed people who genuinely understand the science accept its findings for much the same reason.

“Strong attitudes, both for and against, are underpinned by strong self-confidence in knowledge about science,” says Laurence Hurst, director of the Milner Centre for Evolution at the University of Bath.

But “correcting” strongly held yet misinformed positions isn’t as simple as providing facts, and that’s a critical issue for science communicators seeking to change public opinion on climate change and other divisive scientific topics.

“Traditionally, it was thought that what mattered most for scientific literacy was increasing scientific knowledge. Therefore, science communication focused on passing information from scientists to the public,” says Alison Woollard of the University of Oxford.

“These results, however, suggest that this approach may not be successful and may in some cases backfire. Working to address the discrepancies between what people know and what they believe they know may be a better strategy.”

The researchers acknowledge that religious and political views help shape opinions about science, and influence whether people trust science news or dismiss it as hype, conspiracy, or otherwise invalid.

“Confronting negative attitudes towards science held by some people will likely involve deconstructing what they think they know about science and replacing it with more accurate understanding,” says study co-author Anne Ferguson-Smith, president of the Genetics Society.

“This is quite challenging.”

This story first appeared on Sustainability Times.

© 2023 Sustainability Times.

This article is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.