How researchers get heard

Debunking the Backfire Effect

A new report out from the UK fact-checking org Full Fact says that the backfire effect (that is, when “a factual claim reinforces someone’s ideological beliefs, telling them that the claim is wrong can actually make them believe the claim more strongly rather than less”) is “rare and not the norm.”

Full Fact goes on to say that “the most recent studies now suggest that generally debunks (attempts to correct factual errors) can make beliefs in specific claims more accurate.”

Since I referenced the backfire effect in two recent posts (here and here), I wanted to share this report with you and talk about some implications for research communications.

The report has been widely promoted as a “debunking of the backfire effect.”

I think its conclusions are far more limited, and that people communicating about research still need to be wary of trying to fill information deficits with willy-nilly debunking, especially when their audience’s receptivity to science or research is highly polarized.

For the report, Full Fact looked at two of the studies that established the backfire effect and five more recent studies that haven’t found any evidence for it.

Some important caveats:

  • The report’s conclusions are limited to debunking of political misperceptions.
  • Its authors admit that, “beyond experimental settings, we know that both in the UK and worldwide there are specific inaccurate claims which are still believed by many, despite widespread statements to the contrary.” For example: the supposed link between vaccines and autism.
  • And even when debunking specific misperceptions can change beliefs about those claims, the debunking seems to have less impact on underlying attitudes that inform the misperception.

For instance, a forthcoming study in the journal Political Behavior by Nyhan et al. shows that while debunking Donald Trump’s 2016 claim that US violent crime had increased was effective among a group of his supporters, the supporters who saw the debunk were also less likely to view the underlying statistics as accurate, “especially when the experiment said they had been questioned by a Trump staffer.”

Bottom line: While the evidence for a backfire effect in political debunking now stands on much shakier ground, the research still doesn’t tell us how to correct misperceptions or myths effectively, much less how to change the attitudes that underlie them.

To that end, the Full Fact report quotes a Psychological Science article by Chan et al. (2017): “Mounting evidence suggests that the process of correcting misinformation is complex and remains incompletely understood.”

And as if to drive home that complexity, the study offers a very poorly written set of recommendations that ask communicators to [paraphrasing here a bit through ellipsis]:

  1. “Communicate in ways that reduce thoughts in support of the misinformation”;
  2. “Create conditions that facilitate scrutiny and counterarguing of misinformation”; and
  3. “Correct misinformation with new detailed information…” instead of “labeling the misinformation as wrong.”

Got that? Don’t make them think of the myth, but somehow provoke them to scrutinize it without labeling it as wrong.

If you’re confused, come sit by me.

If you’re not, go on Twitter and be loud about how it’s safe again to tell people they’re wrong.