Russian trolls and malicious bots fuel discord over vaccines, study finds

It’s not just political elections that Russian internet trolls have been trying to influence with a divisive misinformation campaign. 

According to a study published last week in the American Journal of Public Health, Russian trolls — along with bots spreading malware and spam — have also been sowing discord by disseminating inaccurate and highly divisive messages about vaccines.

In fact, the study found that some of the same Russian trolls who interfered in the 2016 U.S. presidential election also used Twitter to polarize public opinion regarding the safety and effectiveness of vaccines.

Most, but not all, of the tweets posted by the trolls and bots were anti-vaccine. The strategy appeared to be one of “amplification” — to create impressions of false equivalence or consensus around an issue. 

“The vast majority of Americans believe vaccines are safe and effective, but looking at Twitter gives the impression that there is a lot of debate,” says David Broniatowski, the study’s lead author and an assistant professor of engineering and applied science at George Washington University, in a released statement. “It turns out that many anti-vaccine tweets come from accounts whose provenance is unclear. These might be bots, human users or ‘cyborgs’ — hacked accounts that are sometimes taken over by bots.”

“Although it’s impossible to know exactly how many tweets were generated by bots and trolls,” he adds, “our findings suggest that a significant portion of the online discourse about vaccines may be generated by malicious actors with a range of hidden agendas.”

Study details

For their study, Broniatowski and his co-authors examined thousands of vaccine-related tweets sent between July 2014 and September 2017.  They found that a significant proportion of them were promulgated by bots (accounts that promote automated content, usually marketing-related spam or links to malware) or trolls (individuals who misrepresent their identity to fuel discord). 

Furthermore, the researchers were able to link many of the troll accounts to the St. Petersburg-based Internet Research Agency (IRA), which special counsel Robert Mueller named last February in an indictment for alleged election meddling.

A deeper dive into the data revealed an interesting difference between the bots promoting unsolicited ads, malware and other malicious materials — the accounts the study’s authors refer to as “content polluters” — and the Russian trolls. 

Both tweeted messages about vaccination at significantly higher rates than the average Twitter user. The bots, however, spewed out primarily anti-vaccine content, while the trolls tended to post both pro- and anti-vaccine messages.  

The bots appeared to be using anti-vaccine content “as clickbait to drive up advertising revenue and expose users to malware,” the researchers write. The Russian trolls, however, had a different goal.

“These trolls seem to be using vaccination as a wedge issue, promoting discord in American society,” said Mark Dredze, the study’s senior author and an associate professor of computer science at Johns Hopkins, in a released statement. “However, by playing both sides, they erode public trust in vaccination, exposing us all to the risk of infectious diseases. Viruses don’t respect national boundaries.”

#VaccinateUS

The researchers also analyzed 253 tweets from the Russian troll accounts that used a specific hashtag, #VaccinateUS. They found that 43 percent of the messages with this hashtag were pro-vaccine, 38 percent were anti-vaccine and 19 percent were “neutral.” But no matter which side of the issue the tweets espoused, the messages were linked to divisive U.S. political issues.

“#VaccinateUS messages included several distinctive arguments that we did not observe in the general vaccine discourse,” the researchers write. “These included arguments related to racial/ethnic divisions, appeals to God, and arguments on the basis of animal welfare. These are divisive topics in US culture, which we did not see frequently discussed in other tweets related to vaccines.”

Here’s an example of a divisive anti-vaccine tweet from the Russian trolls: “Apparently only the elite get ‘clean’ #vaccines. And what do we, normal ppl, get?! #VaccinateUS.”

And here’s an example of a divisive pro-vaccine tweet from those trolls: “#VaccinateUS You can’t fix stupidity. Let them die from measles, and I’m for #vaccination!”

The study also found that the Russian troll accounts sometimes used two hashtags (#Vaxxed and #CDCWhistleblower) associated with Andrew Wakefield, the discredited former British physician who published a 1998 paper that essentially launched the anti-vaccine movement by fraudulently suggesting a link between vaccines and autism.

Don’t feed the trolls

As this study makes clear, a significant proportion of anti-vaccination messages on social media are organized “astroturf” — in other words, they don’t represent a genuine grassroots movement against vaccines. Many of those messages aim not to advance a real position on vaccines, but to get Americans riled up about the issue or to lure them toward spam and malware.

The study’s authors end their paper with a recommendation that public health officials “focus on combating the messages themselves while not feeding the trolls.” They suggest that officials point out to the public that many vaccine-related messages on social media come from sources of dubious credibility and may be designed to compromise readers’ computers and other devices.

“Content polluters seem to use anti-vaccine messages as bait to entice their followers to click on advertisements and links to malicious websites,” says Sandra Quinn, one of the study’s authors and a professor of public health at the University of Maryland, in a released statement.

“Ironically, content that promotes exposure to biological viruses may also promote exposure to computer viruses,” she adds.

FMI: You can read the study in full on the American Journal of Public Health’s website. Normally, the journal’s articles are behind a paywall, but its media rep told me they have had so much response to this study that they decided to make it available to everybody.
