PLOS Blogs: Absolutely Maybe

A Communication Research Unicorn! Decent Evidence You Can Change People’s Minds & Actions Via Social Media

[Cartoon: a woman tweeting]

The field of research on communication interventions is heavily littered with studies that are too small, too indirect, and too riddled with other problems. Which is frankly depressing. But every now and then a gem pops up, and I think the pandemic has delivered one – decent evidence about an actionable intervention!

The intervention was by a group of doctors and nurses in the US from diverse backgrounds. Their project was supported by the Poverty Action Lab at MIT – whose affiliates have conducted over 1,000 randomized evaluations in 91 countries.

The intervention was a plea by a doctor or nurse – or a group of them – to try to stop people spreading Covid-19 at Thanksgiving and Christmas in 2020. It was just a short script, nothing fancy – a sincerely delivered 20-second message, recorded on phones:

This Thanksgiving [or Christmas], the best way to show your love is to stay home. If you do visit, wear a mask at all times. I’m [Title/ NAME] from [INSTITUTION], and I’m urging you: don’t risk spreading COVID. Stay safe, stay home.

You can see examples of these DIY PSAs at the project page here (or here on Facebook), including with groups of doctors. One of them is from Fatima Cody Stanford.

The team had already done 2 randomized controlled trials of short video messages. This time, though, they went big: 820 counties in 13 states were randomized for a Facebook campaign using the videos as ads sent to millions of users – and setting the bar for success at whether mobility and new cases of Covid-19 were lower in the zip codes that got the ads than in those that didn’t. And they did it twice: once for Thanksgiving, then with even more ads at Christmas. (Although in the Christmas campaign, it was 767 counties – because of the extreme political polarization in the US in December, they removed very rural, heavily Republican-voting counties to avoid adverse events.)

Here’s how this cluster trial worked: they randomized counties into 2 groups – high- or low-intensity intervention. In the high-intensity counties, 75% of zip codes got the ads, versus 25% in the low-intensity counties. (The zip codes that didn’t get ads at all were controls.) All up, at least 1 ad was posted to over 35 million Facebook users – the population of the 13 states was about 120 million. And there were more than 10,000 zip codes covered by the study (with Covid-19 data identifiable for close to 8,000 of them from state health department websites).
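To make the design concrete, here’s a rough sketch of that two-stage assignment in Python. The county and zip code identifiers – and the assumption of 12 zip codes per county – are invented for illustration; it just mirrors the structure described above:

```python
import random

# A minimal sketch of the two-stage design as described in the post.
# County/zip identifiers and the count of zips per county are made up.
random.seed(0)
counties = {f"county_{i}": [f"zip_{i}_{j}" for j in range(12)] for i in range(820)}

# Stage 1: randomize counties to high- or low-intensity intervention.
county_ids = list(counties)
random.shuffle(county_ids)
high_intensity = set(county_ids[: len(county_ids) // 2])

# Stage 2: within each county, send ads to 75% of zip codes (high
# intensity) or 25% (low intensity); untreated zip codes are controls.
assignment = {}
for county, zips in counties.items():
    share = 0.75 if county in high_intensity else 0.25
    treated = set(random.sample(zips, round(share * len(zips))))
    for z in zips:
        assignment[z] = "ads" if z in treated else "control"

n_ads = sum(v == "ads" for v in assignment.values())
print(f"{n_ads} of {len(assignment)} zip codes assigned to receive ads")
```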

The mobility data was only available at the county level, not for zip codes – so the only mobility comparisons possible were high-intensity versus low-intensity counties. They used 2 metrics in the publicly available data from Facebook users who have opted to allow their mobile devices to be tracked – one for how much people moved around, and the other for the proportion of people who stayed within a very small geographic area all day. So this is a pretty rough measure.

Emily Breza, Fatima Cody Stanford and colleagues have just reported on the results (preprint; trial register entry; statistical analysis plan). On average, the ads were served up by Facebook 3 to 5 times per user exposed. According to Facebook data, each time an ad popped up, more than 1% of people watched at least 15 of the 20 seconds (and another 11% watched it for between 3 and 15 seconds). The engagement rate was over 12%, which is high. That’s a Facebook metric that combines clicks, short views, sharing, “liking”, or commenting, then divides them by the number of times the ad was displayed. So quite a lot of people in the areas studied could have been influenced by the videos (as well as some from outside, given there was some sharing).
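Since the engagement rate is just arithmetic, here’s a toy example of how a metric like that is computed. All the numbers below are invented – they’re not Facebook’s actual figures:

```python
# Toy engagement-rate calculation, as the post describes the metric:
# engagements (clicks, short views, shares, likes, comments) divided
# by the number of times the ad was displayed. Numbers are invented.
impressions = 100_000
engagements = {
    "clicks": 3_000,
    "short_views": 7_500,
    "shares": 600,
    "likes": 1_200,
    "comments": 200,
}
engagement_rate = sum(engagements.values()) / impressions
print(f"Engagement rate: {engagement_rate:.1%}")  # 12.5% with these numbers
```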

To have an impact on Covid rates, people had to either be less likely to travel or have people over to their house, and/or wear masks more. People in the counties with high-intensity exposure to the videos weren’t more likely to stay put on the actual day of the holiday than people in the low-intensity counties. But they did travel less in the 3 days prior – it amounted to a drop of about 1 percentage point in distance traveled. And there was no difference according to the political leanings of the county.

The researchers tested the rates of new cases of Covid-19 in a number of periods clear of the holidays, and the rates were similar between the zip codes peppered with the ads and those that weren’t. But in the 2 weeks that started 5 days after each of the holidays, there was a very small difference: a reduction of about 3.5% (−0.035; CI −0.062 to −0.007). The authors conclude the videos made a difference, and I think that’s plausible – and consistent with what we know about the ability of doctors and nurses to influence people’s decisions.
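To get a feel for what an estimate that size means at scale, here’s a back-of-the-envelope illustration. The baseline case count is invented – this just applies the reported point estimate and confidence bounds:

```python
# Rough illustration of a ~3.5% reduction in new Covid-19 cases
# (point estimate -0.035, CI -0.062 to -0.007, as reported).
# The baseline below is a made-up 2-week case count, for scale only.
point, ci_low, ci_high = -0.035, -0.062, -0.007
baseline_cases = 200_000

for label, effect in [("point estimate", point), ("lower CI bound", ci_low), ("upper CI bound", ci_high)]:
    averted = -effect * baseline_cases
    print(f"{label}: about {averted:,.0f} cases averted")
```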

It’s not dramatic, but it’s enough for me to be convinced it’s worth sharing these kinds of short personal videos – and it’s worth doctors and nurses making these kinds of pleas. Personal influence is critical in tipping many individuals into preventive action. In a recent poll in the US, about 20% of people who were reluctant to get the Covid-19 vaccine in January but changed their minds said it was a family member or friend who convinced them.

This may seem underwhelming. But this study really is something of a unicorn. As critical as this topic is, decent experiments with hard outcomes in actual communities are difficult in communication research, and so most research takes easier routes. It’s no wonder that we end up in situations like the debunking of debunking advice that happened a few years ago – when researchers back-pedalled after heavily promoting a backfire effect, the idea that repeating misinformation while trying to correct it would entrench it. There never had been good evidence for the original claim. (I wrote about that here.)

The US government just released a Surgeon General Advisory on Confronting Health Misinformation. It’s pretty good, but the recommendations don’t have strong evidence behind them. The paper has 62 references – and there are far more opinion pieces among them than rigorous experiments or systematic reviews of rigorous experiments. The evidence they’re relying on is great for describing the problems, but weak on proof of effective solutions.

For example, they recommend improving media literacy. Sounds sensible, but how? A few weeks ago a rapid evidence assessment on this question for the UK communications regulator was released. Experiments that have actual real-world outcomes, they concluded, were “rare” – evidence of “actual behaviour change” isn’t common at all, much less in groups of people representative of populations other than university students from rich countries (usually the US) and the people on Amazon’s Mechanical Turk. That’s a source of cheap online research participants that can be better than the students, but it’s still a proxy for real communities.

We’re stuck in this place where we understand the problems increasingly well – but we have a bunch of essentially vague solutions with pretty weak scientific support. A big thank you to Breza and Cody Stanford and all concerned for delivering us a solid brick to build on.

~~~~

The cartoons are my own (CC BY-NC-ND license). (More cartoons at Statistically Funny.)
