A recent research project set out to study how social media and survey data can be used to better understand people’s decision-making around vaccines. It ended up unmasking some unexpected key players in the vaccination debate: Russian trolls.
These known Russian troll accounts were tied to the Internet Research Agency, a company backed by the Russian government that specializes in online influence operations.
“One of the things about them that was weird was that they tried to — or they seemed to try to — relate vaccines to issues in American discourse, like racial disparities or class disparities that are not traditionally associated with vaccination,” Broniatowski said.
For instance, “one of the tweets we saw said something like ‘Only the elite get clean vaccines,’ which on its own seemed strange,” he said. After all, anti-vaccine messages tend to characterize vaccines as risky for all people, regardless of class or socioeconomic status, the researchers wrote in the study.
The researchers were surprised to find Russian troll accounts tweeting about vaccines at all, and puzzling out why they would stoke the vaccine debate proved just as perplexing.
Why trolls tweet about vaccines
For the study, the researchers collected and analyzed nearly 1.8 million tweets from July 2014 through September 2017.
While examining those vaccine-related tweets, the researchers discovered many bot accounts, including “content polluters,” which are accounts that disseminate malware or unsolicited commercial content. The researchers also uncovered a wide range of hidden online agendas.
When it came to the Russian troll accounts, the researchers found 253 tweets containing the #VaccinateUS hashtag among their sample. Among those tweets with the hashtag, 43% were pro-vaccine, 38% were anti-vaccine, and the remaining 19% were neutral.
By posting a variety of anti-, pro- and neutral tweets and directly confronting vaccine skeptics, trolls and bots “legitimize” the vaccine debate, the researchers wrote in the study.
“This is consistent with a strategy of promoting discord across a range of controversial topics — a known tactic employed by Russian troll accounts. Such strategies may undermine the public health: normalizing these debates may lead the public to question long-standing scientific consensus regarding vaccine efficacy,” they wrote.
Overall, the researchers found that Russian trolls, sophisticated bots and “content polluters” tweeted about vaccination at significantly higher rates than average users.
The study remains limited, in that it’s difficult to determine with 100% accuracy who is behind a Twitter account, and “the Internet Research Agency is certainly not the only set of trolls out there,” Broniatowski said.
Additionally, it’s even more difficult to determine an account’s true intent. But the researchers and other experts have some ideas about why Russia might want to fuel America’s vaccine debate.
It may be a strategy to promote political discord, Broniatowski said, adding, “we cannot say that with 100% certainty, because we’re not inside their head.”
“The Internet Research Agency has been known to engage in certain behaviors. There’s the one everybody knows about, which is the election. They also tend to engage in other topics that promote discord in American society,” Broniatowski said.
Given that the agency has previously waded into hot-button online debates to promote discord, the new study suggests its intent in fueling vaccine debates could be the same.
Historically, the Russian government has not responded to CNN requests for comment regarding accusations of using social media to influence public opinion in the United States.
The brief use of the #VaccinateUS hashtag among troll accounts could have been an experiment, he said.
“I would call that an experiment that they abandoned,” he said of the hashtag.
Warren added that he was not surprised to learn about Russian trolls posting vaccine-related tweets.
“I don’t know if it would seem strange once you understand their goal, which is basically to divide both sides against the middle. They’re going to grab onto all of those social issues. So for example: black lives matter, all lives matter; immigrants are destroying America, immigrants are great for America,” Warren said.
“It’s basically the hot-button political issues of the day. They’re happy to grab onto whatever is salient,” he said. “I think that they want us focused on our own problems so that we don’t focus on them.
“If most of our energies are focused internally with divisions inside of the United States — or divisions between the United States and, say, Europe — that leaves a window open for Russia to expand its sphere of influence.”
Such efforts to spread divisive misinformation, including in the form of public health messaging, are nothing new.
Indeed, “[w]ith the end of the Cold War, former Soviet and East German intelligence officers confirmed their services’ sponsorship of the AIDS disinformation campaign,” according to the article.
Messages ‘that aren’t scientifically sound’
“We know that in Italy, the Five Star movement ran on an anti-vaccine platform. I do think it’s worth it to go look at the social media conversation in Italy to see if inauthentic accounts were capitalizing on those divisions or involved in that debate,” DiResta said.
In the meantime, however, she said that the new study on Russian trolls meddling in US online vaccine debates illustrates a growing distrust of science and of public health initiatives, such as those underlying vaccinations.
“Both real people and trolls are capitalizing on that mistrust to push conspiracy theories out to vulnerable audiences,” DiResta said.
“This isn’t just happening on Twitter. This is happening on Facebook, and this is happening on YouTube, where searching for vaccine information on social media returns a majority of anti-vaccine propaganda,” she said. “The social platforms have a responsibility to start investigating how this content is spreading and the impact these narratives are having on targeted audiences.”
“There are messages being put out there that aren’t scientifically sound,” Allem said.
“This has the potential to drown out scientifically sound messages from health care providers, and from the public health community in general, on the best way to make a health-related decision,” he said. “When people are looking at these messages, does it matter to them? Does it lead to an attitude change? Subsequently and ultimately, does it lead to a behavior change? Does a person who sees a thread on Twitter discussing the pros and cons about vaccination, does that cause hesitancy for a parent deciding to get their child vaccinated? These are the next sets of questions that will need to be answered.”
News credit: CNN