
The Metascience Movement Needs to be More Self-Critical

[Cartoon about researchers who do research on research meeting]

Metascience – the study of science and its improvement – isn’t simply a field of science. There’s a movement, too. What’s more, it’s a movement fueled in part by moral force – a drive to change culture and conduct based on strong beliefs about the right way to do scientific work, and about why so much of what non-converts or the uninitiated do is wrong.

Movements like that enter inherently dangerous territory. As playwright Arthur Miller wrote in his autobiography, “Nothing is as visionary or as blinding as moral indignation”. That kind of indignation always threatens your ability to see when you’re in the wrong. The rise of charismatic visionaries is pretty much guaranteed, too. And scientists can be dazzled by charisma as much as anyone else. What’s more, they can form echo chambers, and develop “us and them” thinking and loyalties. Conflicts of interest arise as well, as soon as funding, projects, and organizations enter the picture. It’s a potently risky mixture.

That makes issues like accountability and self-critique vital. But it also makes them harder. For example, “us and them” thinking and movement camaraderie lead some people to see criticism of fellow-travelers as “friendly fire” that’s out of line, so they keep their criticisms to themselves. Advocates can also be very quick to jump into defensive attack mode instead of reflecting when anyone criticizes them. The combination of deterring incisive analysis and not engaging deeply with challenging ideas is a shortcut to blinding yourself to errors in the movement and its theories.

I don’t think the metascience movement is doing well enough to avoid these movement risks. Although many scrupulously practice what they preach, day in, day out – there are so many great exemplars – I see the opposite far too often. For example, people who relentlessly excoriate others for uncorrected errors, while not ensuring correction notices are issued for their own papers.

We should expect, too, that contributions from metascientists to discussions about ways to improve science culture and practice would meet the scientific standards we expect of science in general. But it’s often not like that at all. I write a fair bit about journal peer review, and that’s a great example of how this doesn’t happen in practice. People develop strong convictions based on experiences or cherry-picked studies that support their beliefs. Scientists aren’t intrinsically immune to confirmation bias and its companions.

An almost textbook illustration of this in the metascience movement is the promotion of open badges at journals. For those who haven’t followed this particular cautionary tale, I first tackled it as a case study in bias in open science advocacy in a post in 2017. The practice was propelled by claims that simply offering authors badges for their papers at journals had a dramatic effect on open science practice – claims based on a 2016 study by advocates that was riddled with obvious bias, and on a highly skewed set of data that produced a dramatic and compelling graphic. That set off a lot of debate in open science/metascience circles, including a lot of discussion about how so many had overlooked such major scientific flaws.

At the end of 2019, I revisited the issue, and concluded not only had the dramatic claims not been confirmed by others, but signs were pointing in the opposite direction. Since then, the first randomized trial of the practice that I know of was published. The finding? Badges didn’t increase data sharing. But the hype rolls on.

[Cartoon: public service announcement about confirmation bias]

How can we do better than this as a movement? There are several levels to consider: how we do our own research, how research is done on the issues we believe in and on our proposals for change, and how we go about advocacy. We focus more on the first of those – how we do our own research – and less on the others, but they all matter. Just because we believe something is good in theory, and we’re idealistic about it, doesn’t mean our idea will have the effects we anticipate. Success of even the best-laid plans isn’t guaranteed – and anything powerful enough to have a positive impact can also have unintended effects. What’s more, once you change the environment, or other things change, new problems will emerge: each solution tends to cause new problems down the line. If we want to be sure we make things better, we have to test our ideas and be critical about our own, and each other’s, practices.

Self-scrutiny and self-discipline are key, too. We are, unfortunately, better at talking about biases and cataloging them than we are at controlling them in ourselves. As critical as cognitive de-biasing is, there’s far too little consideration of how to do it, or research into methods that might work. The metascience movement tends to focus more on research and analytical skills, which are vital, of course. But so are values, integrity, and cognitive skills, and we need to consider them more. All the technical skills in the world can be totally undone by unacknowledged conflicts of interest, too.

We need a broad approach to the risk of bias in metascience, one that encompasses our high risk of confirmation bias and ideological thinking, and our intellectual and financial conflicts of interest. We need to be self-critical, and to value independent evaluation – especially of our more beloved or conflicted ideas and practices, because that’s high-risk territory. We have to take our roles as critical peers seriously. We need to take our critics seriously and value them – even when they’re antagonistic. And we need to get very good at admitting when we’re wrong. When you’re on an idealistic mission based on correcting the errors of others’ ways, it’s easy to forget that truly solving an existing problem is harder than causing new ones.

~~~~

Series of posts on badges:

[Photo: badges]

This post is based on remarks prepared for the panel, “Bolstering accountability and self-skepticism in the metascience movement”, at Metascience 2021. I wasn’t able to participate because of a recent bereavement; the panel session is on YouTube.

The cartoons are my own (CC BY-NC-ND license). (More cartoons at Statistically Funny.)
