Mike McCormick

No Vaccine for Russian Disinformation

Updated: Apr 3, 2023



As Americans hunker down for a pandemic, conspiracy theories seem to spread faster than the virus: Covid-19 is caused by 5G networks. Vegetarians are immune to coronavirus. Bill Gates is promoting a vaccine in order to implant us with microchips. Muslims are selling food contaminated with coronavirus. Coronavirus escaped from a secret [CIA / Q-Anon / Chinese / Israeli / Canadian] weapons lab.


Where do these stories originate and how do they go viral? Many of them originate from the same folks in Russia who brought us the 2016 election disinformation campaign. Once again, they’re using social networks like Facebook and Twitter to spread false conspiracy theories. As before, their goal is to sow chaos, undermining American society and institutions.

The New York Times reported on extensive Russian influence operations spreading Covid-19 disinformation. The State Department uncovered thousands of fake social media accounts tied to Russian agencies promoting Covid-19 conspiracies. Russia Today (RT) aggressively spreads the stories on its YouTube channel.

Lea Gabrielle, head of the State Department’s Global Engagement Center, told a Senate hearing that “the entire ecosystem of Russian disinformation has been engaged.” Her analysts have found “Russian state proxy websites, official state media, as well as swarms of online false personas pushing out false narratives.”

Clemson University and FiveThirtyEight analyzed three million Russian troll tweets. Clemson professor Darren Linvill attributes the current disinformation campaign to “a cloud of Russian influencers,” including GRU intelligence operatives, RT staff, the Internet Research Agency (IRA), and oligarch Yevgeny Prigozhin, a Putin confidant who financed the IRA’s 2016 operations.

A Wired case study of the 5G Covid-19 conspiracy theory illustrates how disinformation spreads. The 5G rumor began with an obscure Belgian newspaper story quoting a local doctor’s claim that “5G is life-threatening.” His comments were picked up on Facebook by anti-5G campaigners in the Dutch-speaking world. Russia took notice, and RT began amplifying the 5G story on English-language social media, linking it to the emerging coronavirus threat. Celebrities including Woody Harrelson and Robert F. Kennedy, Jr. retweeted 5G stories, further amplifying them. The theory has now spread from cyberspace into the real world, where angry protesters are setting 5G antennas on fire.

The psychology behind this seemingly crazy behavior is actually simple. “The coronavirus has created the perfect environment for this message to spread,” says Josh Smith, a researcher at think tank Demos. “Like many conspiracy theories, the idea that 5G is to blame for the uncertain, frightening situation we find ourselves in is a comfort. It provides an explanation, and a scapegoat.”

Covid-19 conspiracy theories aren’t confined to fringe social media or websites. TechCrunch reports that AI company Yonder studied online conversations containing coronavirus misinformation and found that conspiracies which would normally remain in fringe groups are reaching the mainstream faster. “In the current infodemic, we’ve seen conspiracy theories and other forms of misinformation spread across the internet at an unprecedented velocity,” said Yonder’s Ryan Fox.

The World Health Organization (itself the target of propaganda and conspiracy theories) has declared this tidal wave of misinformation an infodemic that significantly amplifies the damage from coronavirus, causing people to take health risks, ignore social distancing guidelines, and refuse vaccination.

What can be done? While online disinformation isn’t a traditional cybersecurity issue, some of the methods we use to track and contain computer viruses can be applied here. Leading cybersecurity expert Bruce Schneier has outlined an eight-step kill chain for online influence operations like Russia’s, with countermeasures at each step that might short-circuit an infodemic.
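To make the kill-chain idea concrete, here is a minimal sketch (in Python) of how a defender might model an influence operation as ordered stages, each paired with indicators to watch for and countermeasures that could break the chain. The stage names, indicators, and countermeasures below are illustrative placeholders of mine, not the eight steps from Schneier’s paper.

```python
from dataclasses import dataclass, field

@dataclass
class Stage:
    """One step in an influence-operation kill chain (illustrative only)."""
    name: str
    indicators: list = field(default_factory=list)       # what defenders watch for
    countermeasures: list = field(default_factory=list)  # how to break the chain here

# Hypothetical kill chain: these stage names are placeholders for illustration,
# not the eight steps Schneier describes.
KILL_CHAIN = [
    Stage("create fake personas",
          indicators=["bulk account creation", "stock-photo avatars"],
          countermeasures=["stricter identity checks", "bot-account detection"]),
    Stage("seed false narrative",
          indicators=["coordinated first posts", "state-media pickup"],
          countermeasures=["fact-check labels", "downrank unverified claims"]),
    Stage("amplify at scale",
          indicators=["retweet bursts", "identical wording across accounts"],
          countermeasures=["rate limiting", "remove troll-farm networks"]),
]

def plan_response(observed_indicators: set) -> list:
    """Return countermeasures for every stage whose indicators were observed."""
    actions = []
    for stage in KILL_CHAIN:
        if observed_indicators & set(stage.indicators):
            actions.extend(stage.countermeasures)
    return actions

print(plan_response({"retweet bursts"}))
# -> ['rate limiting', 'remove troll-farm networks']
```

The logic mirrors malware defense: you don’t have to stop the operation at every stage, you only have to break the chain at one of them.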

Some of Schneier’s countermeasures are technological. For example, social media companies could do more to detect and delete bot and troll farm accounts, or tag fake news stories with warning labels. Facebook recently announced it will alert users exposed to Covid-19 misinformation. Users will receive a message notifying them that they have seen a since-deleted post and directing them to a list of Covid-19 myths debunked by the World Health Organization (WHO).
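As a rough illustration of the “detect and delete” step, here is a sketch that scores an account with a few simple heuristics: posting rate, duplicated content, and an empty profile. The features, weights, and thresholds are assumptions for illustration only; real platform detection systems are far more sophisticated.

```python
def bot_likelihood(account: dict) -> float:
    """Crude heuristic score in [0, 1]; higher means more bot-like.
    Features and weights are illustrative assumptions, not any
    platform's actual detection model."""
    score = 0.0

    # A very young account that already posts heavily is suspicious.
    posts_per_day = account["post_count"] / max(account["age_days"], 1)
    if posts_per_day > 50:
        score += 0.4

    # Many near-identical posts suggest scripted amplification.
    if account["duplicate_post_ratio"] > 0.5:
        score += 0.4

    # A default avatar with no followers is a weaker extra signal.
    if account["default_avatar"] and account["follower_count"] == 0:
        score += 0.2

    return min(score, 1.0)

# Example: a two-day-old account that has posted 400 near-identical messages.
suspect = {
    "age_days": 2,
    "post_count": 400,
    "duplicate_post_ratio": 0.8,
    "default_avatar": True,
    "follower_count": 0,
}
print(bot_likelihood(suspect))  # 1.0 -> queue for human review
```

A score like this wouldn’t delete anything on its own; it would flag the account for review, which is how platforms balance takedowns against the risk of silencing real users.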

Think of it as contact tracing for the infodemic.

"We want to connect people who may have interacted with harmful misinformation about the virus with the truth from authoritative sources in case they see or hear these claims again off of Facebook," Facebook VP Guy Rosen, wrote in a blog post. During March Facebook applied warning labels on about 40 million posts related to the pandemic, based on 4,000 articles reviewed by third-party fact checkers.

We should applaud Facebook’s efforts and encourage other social media companies to follow suit. We should pursue other countermeasures, both technical and social, at every step of the disinformation kill chain. We should demand Russia cease its dangerous behavior or face grave consequences. We can do all these things without undermining Americans’ First Amendment right to free speech.

This is a crucial test for America. Let’s not allow a dangerous cyber infodemic to exacerbate a real-world pandemic.

 

Michael McCormick is an information security consultant, researcher, and founder of Taproot Security.
