Scientific Uncertainty Isn’t a Justification for Getting Things Seriously Wrong
A recent article got me thinking about the ways scientific uncertainty enables people to lead themselves, and others, astray. That wasn’t what its authors were writing about: their subject was communicating scientific uncertainty as a hedge against loss of trust when there’s new evidence that changes thinking (Dries 2024). It was their opening paragraph that sent me down a different path:
“Early in the pandemic, government representatives, health authorities, and scientists doubted the efficacy of face masks in preventing the transmission of COVID-19. For instance, on February 24, 2020, the German Federal government declared: ‘There is no scientific evidence that wearing mouth and nose protection in public protects against infections with the novel coronavirus.’ However, as the pandemic progressed, new data (and possibly more masks) prompted the government and health officials to revise their statements and endorse the wearing of face masks.”
“New data” was often given as a justification for a U-turn on that one, as it so often is. It’s a handy fig-leaf: There’s always some kind of new data, even if an evidence base hasn’t really shifted. In this case, even when the evidence on using masks hadn’t changed, many reached for new data on how Covid was spread as the rationale for the about-face – as if it hadn’t been an error of judgment to make confident, critical assumptions about the workings of a brand new disease.
Changing your mind in response to new knowledge is a desirable trait, of course. However, I don’t think we should always accept it at face value. It’s too easy to take cover under “this is just how science works,” when in truth you were seriously wrong in the first place.
This pandemic gave us a lot of high-profile examples of problematic takes in areas of scientific uncertainty. Scientific uncertainty provides a lot of space for bias to flourish. One of the high-profile dynamics that got a lot of attention was contrarianism. When I wrote about that here, I touched on the issue of uncertainty, pointing to the tactic Freudenburg and colleagues called SCAMs – for “Scientific Certainty” Argumentation Methods. That’s using science’s exacting standards for declaring strong certainty to instil excessive doubt in bodies of evidence.
Freudenberg & co point out, “the framing of science as uncertain is at least as much a symbolic process as a scientific one.” While they focus on “merchants of doubt” – SCAMs to further business interests, doubt-mongering isn’t only the province of people with goods to sell. Debates about the effectiveness of health interventions often feature “a kind of weaponized doubt.” People then typically claim scientific standards unequivocally support their argument – as though a particular bar they have set is an absolute truth rather than their own value judgment, a choice from a range of approaches, full of complexity and nuance.
One of the reasons SCAM-ing succeeds so often is that it’s too easy to fall for a common fallacy: that because there isn’t an impressive amount of a particular kind of evidence of benefit, something doesn’t work. However, as the saying goes, absence of evidence is not evidence of absence of an effect. It could just be that studies that can reliably answer the question haven’t been done.
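To make that concrete, here’s a minimal simulation sketch – mine, not from anyone’s article – assuming a treatment with a modest real benefit that is only ever tested in small trials. All the numbers are illustrative assumptions, not data from any study:

```python
# A hypothetical sketch: an underpowered trial of a genuinely effective
# treatment usually comes back "negative". All numbers here are
# illustrative assumptions, not data from any real study.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

true_effect = 0.3   # assumed modest real benefit (standardized effect size)
n_per_arm = 40      # assumed small trial
n_sims = 10_000     # number of simulated trials

significant = 0
for _ in range(n_sims):
    control = rng.normal(0.0, 1.0, n_per_arm)
    treated = rng.normal(true_effect, 1.0, n_per_arm)
    _, p = stats.ttest_ind(treated, control)
    if p < 0.05:
        significant += 1

# With these assumptions, only about a quarter of trials "find" the real
# effect - the rest deliver an absence of evidence, not evidence of absence.
print(f"Share of trials reaching p < 0.05: {significant / n_sims:.0%}")
```

Each of those “negative” trials is an absence of evidence; it takes adequately powered studies – or pooling the small ones – to say anything meaningful about the absence of an effect.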
Another common problem is that even good studies that could help point us in the right direction, and reduce some of the uncertainty, are ignored. The pandemic gave us a powerful example of how wrong you can be when you don’t make an effort to see what’s already known about a question. The FDA issued an emergency use authorization for convalescent plasma in 2020, using a snapshot of the meagre evidence available in 2013 as a major basis for its very optimistic decision-making. However, between 2013 and 2020 there had been studies which together cast serious doubt on the potential of the intervention – including two trials from the NIH. (I dug into that in this post.)
Say you’ve got a good handle on the level of uncertainty, though. That brings us to the topic Charlotte Dries and colleagues were actually writing about in the article that triggered this post. They ran a trial addressing the question: does communicating scientific uncertainty serve as a hedge against loss of trust down the line, when new evidence contradicts the earlier conclusion?
People have very different views on whether you should project certainty or be very upfront about uncertainty. And the evidence on this is complicated too. I’m firmly in the camp of openness about uncertainty, and I try hard to practice it. It’s frustrating, using lots of “weasel words” – especially when others are being adamant and certain, inspiring people’s confidence in them and their unambiguous takes. Is there a cost to that appearance of having a clear answer, though, given that there will be times you have to backtrack?
We were often in this space in the conversations about Covid vaccines. The uncertainty didn’t stop people being very, very sure at key points. One was when European drug regulators started to take action as soon as there were reports of serious adverse events that seemed to be associated with the Oxford/AstraZeneca vaccine.
Here’s where you can see the position I took early on in that controversy. I argued that we needed to take the reports seriously and not try to dismiss them, as many were doing at the time in the interests of defending vaccination. If, I wrote, it did “turn out to be a vaccine reaction, even a vanishingly infrequent one, keeping mum won’t make the problem go away. Indeed, it could serve to worsen the effects of the fearmongering about vaccines that will surely grow from here.” We were risking loss of trust in the system, I wrote: “Addressing concerns about vaccines is a long game.” (If you didn’t follow this one at the time, those rare events did turn out to be vaccine reactions.)
Since I wrote that in March 2021, the Dries group did an experiment tackling the question of what happens when an authority expresses certainty and then has to back-pedal after later evidence – and the hypothetical scenario they tested was very similar to that real-life situation. They pointed to another similar trial, published in 2022, which they believe is the only other one testing this. I don’t know of any others either.
In both of these studies, participants were randomized to statements about a vaccine not being responsible for serious adverse events, either with or without openness about the uncertainty. Their trust in the authority was measured. Then the participants were given another statement, indicating that further evidence showed the vaccine was in fact the cause of the adverse events. In both trials, there was some loss of trust in the authority – but having been clear about uncertainties had what Dries called “a buffering effect”: less trust was lost by the groups that had been informed of the uncertainty in the first place.
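For readers who like to see a design spelled out, here’s a sketch of that two-stage set-up with entirely made-up numbers – the sample size, trust scale, and effect sizes are my hypothetical assumptions, not estimates from either study:

```python
# A hypothetical simulation of the two-arm design described above.
# All numbers are illustrative assumptions, not data from the real trials.
import numpy as np

rng = np.random.default_rng(0)
n_per_arm = 500  # assumed sample size per arm

# Stage 1: trust in the authority (assumed 1-7 scale) after the initial statement.
trust_certain = rng.normal(5.5, 1.0, n_per_arm)  # confident statement, no hedging
trust_hedged = rng.normal(5.2, 1.0, n_per_arm)   # uncertainty disclosed

# Stage 2: assumed drops in trust once the contradicting evidence arrives.
# The "buffering" hypothesis predicts a smaller drop in the hedged arm.
drop_certain = rng.normal(1.5, 0.8, n_per_arm)
drop_hedged = rng.normal(0.8, 0.8, n_per_arm)

print(f"Initial trust - certain arm: {trust_certain.mean():.2f}, "
      f"hedged arm: {trust_hedged.mean():.2f}")
print(f"Trust lost   - certain arm: {drop_certain.mean():.2f}, "
      f"hedged arm: {drop_hedged.mean():.2f}")
print(f"Buffering effect (difference in trust lost): "
      f"{drop_certain.mean() - drop_hedged.mean():.2f}")
```

Here the gap is baked in by assumption; in the real trials, whether any gap appeared at all was the finding – and in both, it did.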
If you’re really into this question, these are interesting studies. That said, although I’ve had an avid interest in the evidence on communication for decades now, I’m very critical of its limitations. One problem is the artificiality of the communication environment in studies like this; another is selection bias in who participates. For example, 44% of the participants in the Dries study had a graduate or postgraduate degree.
In both of these studies, people were exposed to discrete bits of information – and the conflicting information was given to them in quick succession. In real life, it’s a lot messier and more drawn out than that. It’s not even clear how many people hear both versions – although incidents like this are staple fodder for anti-vaccine literature, of course.
Still, this recent research bolsters the case for those of us who believe in transparency about uncertainty. Someone being prematurely very sure may convince others that they really know what they’re talking about. But they are at the mercy, then, of their biases. Pretending uncertainty is not there is a minefield.
You can keep up with my work at my newsletter, Living With Evidence. And I’m active on Mastodon: @hildabast@mastodon.online
~~~~
Disclosures: Shifting bodies of evidence was the focus of my PhD work. I wrote about masks in the Covid pandemic several times – at WIRED in April 2020, later that month on this blog, and recently in my newsletter (September 2023).
The cartoon is my own (CC BY-NC-ND license). (More cartoons at Statistically Funny.)