How researchers get heard
Abstract lines

Nefarious? Or Just Science?

Conservatives are biologically and neurologically different from liberals. Science says so.

If you follow politics at all in the United States, you’ll have heard that claim, and perhaps even read about some of the individual studies supporting it. Social or political conservatives, these studies have found, are more reactive to threats, more easily disgusted, more dogmatic and more receptive to authoritarian structures and leaders. (In other words, deficient in just about every quality that supports democracy.)

That phrase “these studies have found” is the rhetorical trap in which we — the public, the media, science communicators, even scientists — get stuck over and over.

The writer Jesse Singal reviewed the literature behind and about these findings two years ago for New York magazine and found them less than “found.” Now Singal reports in the British Psychological Society’s BPS Research Digest on a new Journal of Politics study that casts serious doubt not only on the claim that conservatives are more threat- or disgust-sensitive than liberals, but also on the very method political scientists have used to gauge that reactivity. (The study calls that method — measuring subjects’ skin conductance, also known as electrodermal activity or EDA — “very unreliable” for political science research.)

Besides testing the reliability of EDA, the researchers attempted to replicate a number of studies that had found conservative subjects more reactive to images deemed threatening (e.g., spiders, guns), and they reassessed the quality of those studies. They couldn’t find consistency among the EDA-measured responses to threatening or disgusting stimuli. They did find that conservative American subjects had heightened EDA responses to negative stimuli compared with liberal American subjects — but they didn’t find that difference among Danish participants. Socially conservative subjects in both Denmark and the US did, however, more frequently self-report feeling reactions to the threatening images. Singal concludes:

It appears that the tools used to measure fear-sensitivity in many previous studies might not be measuring that at all. This, of course, does not mean that there are no arousal-based differences between liberals and conservatives; it could be there are some good-sized and robust ones, for all we know. But the mixed results of the replication attempt suggest that even this weaker claim is far from certain. They also suggest that, while measures of automatic (or implicit) responses are very ‘in’ right now, they aren’t necessarily a better or more meaningful tool than simply asking people how they feel.

This is how science should work. Findings coalesce into models and theories; conceptual beachheads are claimed; replication of the findings leads to firming up of positions or reassessments and reconstructions or entrenchment among two or more factions. As a new paper in Nature Physics argues, failure to reproduce research doesn’t always signal a reproducibility crisis — it’s often an important part of scientific progress.

But it’s not that simple, because science is a social act.

As Singal points out in his New York piece, surveys typically find that researchers in the social sciences and psychology in the United States identify as liberal rather than conservative by a ratio of 10-to-1 or greater. And these biases, of course, bleed into experimental design and the framing of results — as Singal puts it:

That’s how blind spots creep in — that’s how you keep gauging study subjects’ “sensitivity to threat” by asking them about crime or terrorism, but rarely about climate change or right-wing police violence, and then “discover” that conservatives are more sensitive to threat. “This sort of ‘soft bias’ can be really hard to spot if most or all researchers have the same ideological outlook because it is built into people’s ideologically guided beliefs about reality,” said Yoel Inbar, a psychology researcher at the University of Toronto and a co-author of a key paper that revealed the ideological tilt within social and personality psychology. “Worrying about the threats your side cares about seems entirely well-founded and reasonable, worrying about those the other side cares about demands an explanation.”

Singal is careful about how he talks about this potential bias and how it might feed into the conservative/liberal reactivity literature. He takes pains not to assert that the Journal of Politics study (or any of the other studies) has “proven” that conservatives aren’t more reactive than liberals. The best we can say right now, he says, is that serious questions have been raised, both about the strength of those original findings and about one of the methods used to obtain them.

But that’s an ideal reading, a scientist’s reading. The rest of us aren’t scientists. And in a polarized atmosphere, it shouldn’t surprise us that, when these studies fail replication, some of us are going to see that failure as exposing a special interest, rather than “the way science works.”

Science eventually shakes out its errors (most of the time). But if we don’t also have strong, loud voices from science proactively evaluating the quality of the literature on these questions (without waiting for reporters to ask for that evaluation), science risks looking nefarious when its ordinary processes seem to expose bias or worse in papers that caught our eye.

We can’t count on science alone to do that work for us. That’s what public scholarship is for.