How researchers get heard

Three Overlooked Elements of Effective Public Expertise

I keep coming back to a point Gemma Derrick alluded to during my recent podcast interview with her on “grimpact”: The idea of “science communication” contains a structural incoherency, in that science traditionally offers (a) caveated “yes, but” answers instead of (b) the definitive “yes or no” answers most non-scientists crave (and, let’s face it, need).

That inability to make a market, if you will, between what science offers and what people desire explains in part why we see so many overframed papers that can’t support the headlines they generate. Scientists want to meet people’s desires, but individual papers are rarely designed to give us definitive yes or no answers.

I also think that incoherency or friction is why we need more researchers as public experts — translators who can summarize the state of the research and apply their evidence-based expertise to problems we care about, in ways that get closer to “yes or no.”

Friend of the list Jon Fisher offers the following perspective (reprinted with permission):

I heard a really good interview this spring with a couple of virologists fielding similar “should I do X?” questions, and they did a really good job of answering using a format of “That kind of activity is [high | medium | low | very low] risk [for these reasons]. I personally [am | am not] doing that [while conditions are X], and I would encourage others to [do the same].” So they communicate the level of risk (nonbinary), explain their personal choice and why they made it, and make a recommendation. The thing I liked is that they didn’t go to the scientists’ beloved “it depends” — they still provided the binary answer, but with the details that let the person asking understand how to interpret it.

In one case they disagreed: the question was about whether vaccinated people w/ unvaccinated teenagers should hang out indoors w/ a few vaccinated friends. The first person said that was very low risk so they would be comfortable doing it. The second person agreed it was low risk, but then disagreed by saying that we were so close to having vaccination available for teenagers, it made sense to wait a few more months to eliminate that low risk entirely.

Ask most researchers how they empower others and they will probably say through information: “You didn’t know this thing before I told you, and now that I’ve told you about this finding you can adjust your thinking and behavior.” (“To act more rationally” is the unspoken coda of that sentence.)

Provision of exceptional, evidence-based information: That’s expertise, isn’t it?

But the virologists Jon cites offer a different kind of empowerment — empowerment through transparency about their knowledge process, personal disclosure and respect for choice.

They’re transparent about the criteria they use to make their risk assessments. They’re transparent about how they’re applying those assessments in their own lives. And they understand that decisions around risk are ultimately not one-size-fits-all but personal choices made at the intersection of risk perception and risk tolerance.

This package, I would argue, exemplifies effective public expertise today. It avoids, as Jon says, the dreaded conventional scientific response of “it depends.” It gives the binary answers we crave and need. It shows us how experts put those answers into practice themselves, demonstrating that they aren’t hypocrites or just in it for the funding. And it gives us enough information to disagree with them, should our risk tolerance (or our criteria for evidence) be different.

Offer answers without caveat. Show us how you arrived at them and how you use them. Invite us to do the same.

This seems like a fertile path forward.