List member Stephen Wood, an applied scientist at The Nature Conservancy, sent this response to yesterday’s email (thanks to Stephen for allowing me to quote him):
Your post touches on an issue we struggle with [in] a science group I lead: there’s lots of excitement about [what we’re working on], but many of the practices that people are excited about aren’t substantiated by much evidence. As scientists, we’re then left in a space of saying “no, no, no” almost by definition, because people doing this work are innovators and innovation/experimentation is usually out in front of peer-reviewed science (whether that innovation ends up being substantiated or not in the long run). But we don’t want to stifle excitement in our area of work and remove ourselves from the table by developing a reputation as nay-sayers; we want to support action, just with credible evidence.
Help: how do we stand up for the credibility of evidence and the scientific process without just nay-saying?
I suggested a couple of resources to start with:
- Cook and Lewandowsky’s The Debunking Handbook is the foundational primer;
- This HBR article on debunking myths about vaccines provides a good overview of the pioneering research on best debunking practices;
- Nyhan and Reifler (2017) present evidence that graphical representations of scientific information are more effective than textual representations in reducing misperceptions.
I’d also add Peter and Koch (2016), who found that getting people to articulate their opinions of new information as they read it was useful in defeating the backfire effect.
The backfire effect (in which debunking a myth actually reinforces it) has itself been walked back a bit recently. For instance, Ecker, Hogan and Lewandowsky (2017) found that retracting misinformation was more effective when the misinformation was explicitly repeated. Wood and Porter (2016) were unable to replicate some foundational research on the effect.
However, Nyhan and Reifler (2015) found that, while telling people that “the flu vaccine will not give you the flu” significantly reduced their belief in the opposite, it also significantly reduced their intent to get vaccinated. The literature on a backfire effect when discussing vaccination remains strong, and Lewandowsky has not backed away from The Debunking Handbook’s recommendations.
The way to think about it, Wood, Porter and Nyhan suggested on a 2018 episode of the “You Are Not So Smart” podcast, is that even when we succeed in changing people’s factual beliefs, we’re not necessarily changing their opinions. As Nyhan put it:
“People often focus on changing factual beliefs with the assumption that it will have consequences for the opinions people hold, or the policy preferences that they have, but we know from lots of social science research…that people can change their factual beliefs and it may not have an effect on their opinions at all.”
“The fundamental misconception here is that people use facts to form opinions and in practice that’s not how we tend to do it as human beings. Often we are marshaling facts to defend a particular opinion that we hold and we may be willing to discard a particular factual belief without actually revising the opinion that we’re using it to justify.”
Hat tip: Work-Learning Research
This analysis fits the scenario Stephen mentions, I think: people in the field managing soil for carbon sequestration, but with practices that haven’t yet been scientifically proven to be effective — and might never be. In these cases, science often serves as buzzkill, as one of my old friends puts it. Practitioners have passionate beliefs, and persuading them is going to be very difficult. It’s also a situation in which researchers can very easily slip into the role of outsider, lecturing the community they’re trying to communicate with — and thus lose the trust that’s fundamental to that communication.
Scientists and researchers are often up against what Sobo et al. (2016) called “Pinterest thinking” in their audiences — the cherry-picked curation of facts, anecdotes, opinion and ideology that comprises knowledge and reflects identity. We all use “Pinterest thinking” in various arenas of our lives — but science seems particularly ill-equipped to combat it because it’s built to be a privileged narrative, not one among many narratives.
I’d still recommend following the main advice in The Debunking Handbook: staying positive, staying on message and keeping those messages short and memorable, not denigrating your audience’s beliefs, and depicting the evidence graphically where possible. And I’d add two more: staying part of the community you’re communicating with, and accepting that, despite your best efforts, you might not succeed.
There are a number of science communicators on this list. I wonder what they think.