

A Cartoon Guide to Conflict of Interest Claims, Fair & Foul

 

Claims of conflicts of interest have been one of the – let’s call it interesting – parts of being in debates about vaccines in the last couple of years. Of course, you don’t enter such contested territory expecting only fair argument. But being exposed to so many people’s claims of others’ conflicts of interest in a short time can clarify your thoughts.

I’ve written a couple of reader’s guides to conflicts of interest (COI). In the latest, in 2016, I argued that the problem of financial COI is getting worse. We’re watching it in full bloom now with Covid-19, and it isn’t pretty. But going too far in the other direction – dismissing too much evidence and too many people because of potential COI – is just a different version of the same problem: we end up misled about what’s real, and what’s not.

Lisa Bero and Quinn Grundy wrote a paper in 2016 on non-financial disclosures in which they warned of going both too far, and not far enough, on COI. Bad disclosure practices not only don’t help, they argue, they can be counter-productive.

 

Cartoon about conflict of interest declaration

 

So what’s a fair and important potential COI? The first is a very direct financial one: research sponsored by medical product manufacturers on their own products, for example, has been shown to be biased in favor of those products. We have to take that into account, but it doesn’t mean that every piece of research conducted with industry involvement or sponsorship is biased, or that all unfavorable data is hidden, and it certainly doesn’t mean it’s all untrustworthy.

Then there’s the territory of “perceived conflict of interest” and potential non-financial interests. This is where things can really go off the rails. It is just so easy to weaponize COI claims and create a cloud of suspicion.

 

Cartoon of an audience member thinking "He would say that, wouldn't he?"

 

For research or other work to achieve the highest possible level of public trust, it doesn’t just have to be free of the influence of conflicts: it has to be seen to be. But many people will never accept a position that doesn’t accord with their own beliefs – or even anything ever done or said by a person they have a grudge against, or whom they envy or just plain dislike. And then if they dig enough, or get creative enough, they can find a personal COI to try to discredit the person’s research or opinions.

There’s a rich vein to mine here. Sandro Galea has published a typology of potential non-financial conflicts (with a rejoinder by Bero and Grundy). Galea includes career interest, clique-ish academic networks, and ideological conflicts of interest. I agree these can be serious conflicts that lead everyone astray. The next cartoon is from a post I wrote about ideological conflicts of interest while I was working on a vaccine issue:

 

Cartoon of a panel of 3 speakers - pro-industry, anti-industry, anti both of those

 

But the danger of going too far in the direction of dismissing research and views based on people’s interests is that it leads to the same kind of damage that ignoring financial conflicts does: a distorted and unreliable knowledge base.

Here is a triad of COI claim fouls that I encountered while writing and debating about the HPV vaccine – and I expect to see a lot of these in the coming debates about Covid-19 vaccines, too.

Remember a rule of this fight club, though: all of this only applies if the person or study is a threat to your position. If their work or opinions make them a standard-bearer for your position, then none of the following could have tainted them.

 

1. Tenuous links in likely inconsequential circumstances

 

This is where an event relatively distant from the person, or so inconsequential to them that they might even have forgotten it, is brandished as proof of hidden allegiances.

Often, the events couldn’t possibly have been enough incentive for someone to distort their research results – and they aren’t a reliable indicator of affinity for a particular point of view, either. For example, having once given a talk at a conference where multiple points of view were represented, and where a drug company was one of many sponsors of the conference – by virtue, say, of having a paid booth in a hall there – but not of that talk or speaker specifically.

 

2. Guilt by association

 

For example, a person has co-authored papers with someone who has direct conflicts. I saw this one stretch all the way to someone claiming a person was irretrievably conflicted because in some other part of the university, someone else was running a project sponsored by the manufacturer of the vaccine in question.

 

3. Joining dots

 

This is where someone who is somehow involved is cast as a conflicting influence because of a potential COI that isn’t even close enough to count as guilt by association. Here’s an example. It was an argument to discredit a study because of a peer reviewer who had originally planned to be a co-author in the early stages of the project, but had pulled out. The peer reviewer worked at the CDC. The CDC is a US government agency. The NIH is another US government agency. The NIH Office of Technology Transfer oversaw the licensing of a technology used by vaccine manufacturers. Ergo… you can’t trust the study. Whelp!

 

 

I think this starts to have common elements with conspiracist thinking. From a previous post:

Conspiracy theorizing involves investigating details and making arguments, using the jargon and other aspects of academic work. Theories are constructed out of many bits and pieces, with solid facts, unreliable information, and conjecture linked by motivated reasoning. (More about motivated reasoning here.)

When it comes to theories about COI by association, underlying them, it seems to me, is the false belief that scientists only ever associate and collaborate with people who share their own world views and beliefs – and that they never, ever, disagree with each other. Therefore, since birds of a feather flock together, they should all be tarred with the same brush. That thinking is as mangled as what I just did to those aphorisms.

Co-authors of a paper can be in full raging dispute about some of the paper’s contents – to the point, even, of becoming whistleblowers. Academics are often adamant opponents of their university’s actions and policies. People can maintain professional collegiality despite strong disagreements. Yes, some people and institutions can exert great influence on others: but plenty of people resist that influence.

There’s another rule of this fight club: every time someone omits an event, that’s sinister. Even if it’s fairly inconsequential or has only the most distant whiff of financial interest, it’s a cover-up. And that’s enough to discredit someone, to demand retraction of an article they wrote, etc etc.

 

 

But here’s the thing – it’s so easy to forget something that someone else might think matters. On one hand, it’s easy for me: since day 1 I have had a personal policy of never accepting direct personal funding from the medical products industry, and to make sure, I’ve always asked where the money is coming from before accepting travel to give a talk. I include a lot of disclosures on my blog posts, and think hard about them. I’ll still occasionally forget things, especially if they were a long time ago. I think the lengths I go to help, though, and they certainly demonstrate how serious my concerns about transparency and bias are. But nothing can shield you from the triad I described.

(Pro tip: whenever you prepare a disclosure statement, check all your previous ones on even vaguely related topics. That’s not just a memory aid: some critics will be searching through your previous disclosures to try to find something you didn’t declare this time.)

Bero and Grundy propose some questions for reflexivity when considering researchers’ potential interests, including this one:

How could who they are affect the design, conduct, or reporting of research?

They also stress that

The goal is to hold researchers accountable without discrediting scientific findings and claims as mere “personal biases” and to ensure fair representation rather than excluding certain individuals based on their personal characteristics, beliefs, or expertise.

The muddying of the waters around conflicts of interest serves to erode the political will to identify and manage conflicts of interest in scientific research, as they are portrayed as increasingly ubiquitous and unsolvable.

I agree. However, I also think that we have to interrogate biases and motivations in criticizing other people’s potential COIs, too. Extreme COI-hunting to discredit everyone and everything that doesn’t support a point of view has a “muddying of the waters” danger, too. The noise distracts from what’s often the bigger problem: the ideological or personal conflict of interest of the hunter.

 

~~~~

 

Cartoon of a man reading a meta-analysis with a meta conflict of interest

 

 

More posts on conflicts of interest here at Absolutely Maybe

 

 

Disclosures: Other than freelance writing for commercial media outlets and PLOS, since becoming involved in health and science in the 1980s, my employers have all been not-for-profit community or government agencies, or not-for-profit universities. I have never accepted funding from a manufacturer of a drug, device, or similar health product. More than 20 years ago, I received funding from a not-for-profit health insurer, and once from a private health insurers’ association for participation in a conference. Other than bank and retirement savings accounts, I have never owned stocks or had financial investments. I drafted the Cochrane Collaboration’s first conflict of interest policy. I quote Lisa Bero in this post: she is a personal friend.

[Update 5 May 2020] Thing-I-didn’t-think-of! David Colquhoun’s comment that he had been accused of COI because of the source of his grant funding (a “foul”) made me realize I should list from whom I have received research grants (or benefited from grants received by others): National Health and Medical Research Council (NHMRC – Australian equivalent of the National Institutes of Health (NIH) or the UK’s MRC); Consumers’ Health Forum of Australia (CHF); UK Cochrane Centre; Victorian Department of Health (for the Cochrane Consumers and Communication Review Group – I can’t remember what the Department was called back then); National Institute of Clinical Studies (NICS – an Australian government agency that no longer exists); intramural NIH; and waived PhD university fees under the Australian Government Research Training Program Scholarship.

 

The cartoon at the top of this post is my own (CC BY-NC-ND license). (More cartoons at Statistically Funny and on Tumblr.)

 
