Wharton professor Ethan Mollick writes in a Twitter thread that “The secret heart of academia is…Wikipedia.”
Mollick cites a new SSRN paper titled “Science is Shaped by Wikipedia: Evidence from a Randomized Control Trial,” which added new scientific content to a Wikipedia article on chemistry and tracked how the information spread. The authors (Neil C. Thompson of MIT and Douglas Hanley of the University of Pittsburgh) write:
In the months after uploading it, an average new Wikipedia article in Chemistry is read tens of thousands of times and causes changes to hundreds of related scientific journal articles. Patterns in these changes suggest that Wikipedia articles are used as review articles, summarizing an area of science and highlighting the research contributions to it. Consistent with this reference article view, we find causal evidence that when scientific articles are added as references to Wikipedia, those articles accrue more academic citations.
For Mollick, this fits a larger pattern — review articles, he tweets,
are extremely influential on the direction of scientific research…and while Wikipedia articles are generally less influential, there are more of them, they are more up-to-date, and they are free…Academics should probably be writing more reviews. Review papers help legitimate and consolidate fields of knowledge…and they also steal citation counts. Cites to references in reviews drop, as people cite the review instead.
The thread’s reply tug-of-war fascinates me. Some academics cheer Wikipedia as a resource; others are horrified (or report their colleagues are horrified) at the prospect of relying on Wikipedia:
All of these miss the point: the Thompson and Hanley paper shows how much scientists already rely on Wikipedia entries as review articles (AKA, rely on them to understand the shape of many fields and the most important papers in those fields), and it underscores the need for what Charles Darwin called such “general and popular treatises,” which he thought “are almost as important for the progress of science as the original work.”
For instance, Thompson and Hanley estimate that dissemination through Wikipedia is about 120x more cost-effective at generating citations for a paper than other dissemination techniques traditionally funded by grants:
This is driven almost entirely by the extraordinary reach of Wikipedia, and thus, from a public policy perspective, funding the creation of content for accessible public repositories of science like Wikipedia is compelling. We thus encourage governments, organizations, and publically-minded individuals to incorporate the creation of such articles into their activities and applaud those who are already advocating it.
I have long argued that one of the essential forms of research-driven public expertise is the knowledge repository — a resource that contains the expert curation of the state of knowledge on questions or issues of public concern (AKA, the “what we know now” approach), and that can also situate any single new piece of research relative to that state of knowledge (AKA, does this represent an advance in our knowledge or not?). On questions from COVID to nutrition to stimulus packages, we’ve sorely missed these public resources — or they’ve proven to be politicized and unreliable. Outside of headline science, science doesn’t know how to provision its informational public goods in productive ways.
The incentives and cultural norms of science (as the tweets above show) are usually misaligned with creating these repositories. Why support creating more review articles if they’re just going to steal my paper’s citations?
But if the secret heart of academe is already Wikipedia, maybe we just need to expose that secret heart more — especially the 120x more cost-effective dissemination part.