Governments Need Clear Plans to Respond to Disinformation, Experts Say

One of the trickiest parts of responding is addressing conspiracy theories that domestic social media users create and foreign actors then amplify, experts said.

Just one day after Facebook announced it had removed four disinformation networks, including one it says was linked to former Trump campaign official Roger Stone, experts say governments have yet to define concrete strategies for responding to disinformation campaigns.

“This is going to be an ongoing problem because democracies really broadly speaking haven't yet figured out what their threshold of response and deterrence is to influence operations,” Jacob Wallis, senior analyst at the Australian Strategic Policy Institute’s Cyber Policy Centre, said at an event hosted by Twitter and the Carnegie Endowment for International Peace’s Partnership for Countering Influence Operations. 

In two panel discussions Thursday, the first in a series, researchers and academics from all over the world gathered to discuss what they know about information operations on social media and what should be done about them. 

But as the panelists compared notes from research they conducted using data released by Twitter and tossed around ideas for the future, one sticking point stood out: finding strategies to combat disinformation is difficult, and doing so without a clear picture of governments’ aims is harder still.

Wallis said bad actors will continue to push the envelope until governments settle on what kind of response to disinformation they are looking for. Those responses could range from soft-power counter-messaging to more significant actions, like offensive cyber operations. 

“We’re still feeling our way a little, here,” Wallis said. 

Disinformation campaigns have already had a documented impact on the 2020 elections. Last month, The New York Times detailed how a made-in-America conspiracy theory about the failed Iowa caucus reporting app was picked up and amplified by Russian trolls. Last year, in testimony before the House Intelligence Committee, former Special Counsel Robert Mueller warned lawmakers that Russia was conducting its information operations “as we sit here.”

One of the difficulties in settling on a strategy to combat the spread of propaganda is that pinning down the source with a high degree of confidence remains elusive. In the case documented by the Times, the conspiracy theory did not originate with Russian trolls; Russia’s Internet Research Agency only amplified it.

Graham Brookie, an alum of the Obama administration’s National Security Council and now head of the Atlantic Council’s Digital Forensic Research Lab, called this an important shift in tactics from the blunt-force strategies used in the 2016 election cycle.

Brookie said the shift took the IRA toward more nuanced operations by “taking, basically, the most useful bits of disinformation that were created by domestic actors and pushing them a little bit further.”

The ramifications of this shift are as yet unclear, but it could mean that governments and regulators cannot target foreign actors without also taking a stand against fake news born on American soil. Twitter’s decision in May to place fact-check warnings on several of President Donald Trump’s tweets may be indicative of this state of play: rooting out disinformation at home may be a preventive measure for curbing the effectiveness of trolls trying to interfere in American politics.

Brookie added that disinformation campaigns aren’t viable without a pre-existing demand for fake news from domestic audiences. Ben Nimmo, another panelist and the head of investigations at research firm Graphika, underscored the idea that disinformation starts at home. 

This means any government action targeting disinformation perpetrated by foreign actors will be somewhat toothless until media literacy at home improves. If demand for the kind of disinformation that confirms existing biases dies down, information operations run by bad actors will be less effective.

But finding the appropriate mouthpiece to educate communities targeted by disinformation campaigns is not as straightforward as putting out a government public service announcement. If targeted communities don’t already trust the source of the information, they won’t take it seriously.

Renée DiResta, a researcher at the Stanford Internet Observatory, zeroed in on this problem while studying the social media strategies of ISIS. DiResta said person-to-person relationships are often most effective for educating people. But it’s not yet clear how that process can be scaled for the internet. 

“Nobody is going to be dissuaded from going and joining a terrorist organization because the State Department tweets at them,” she said.