The spread of disinformation and misinformation is a worldwide crisis, and governments around the globe are working hard to find ways to rein it in.
From the pandemic to climate change to foreign policy, people everywhere are being deceived and manipulated online by bad actors motivated by ideology, greed or other hostile aims, said former U.S. Representative Will Hurd and Damian Collins, a U.K. Member of Parliament and chair of Parliament’s Joint Committee on the Draft Online Safety Bill. Both participated in a Washington Post webcast released Jan. 11.
“What’s illegal offline should be regulated online,” Collins said. “We know there are bad actors, state sponsored, attempting to interfere with elections, [leaving] people in a position to not know what to believe. [The] Covid pandemic has demonstrated just how dangerous disinformation can be in the area of public health … Drinking bleach to cure Covid could cause someone real harm. This isn’t an example of free speech.”
“One of the ways I describe it is [the] increasing disagreement on facts, blurring of the lines between fact and fiction,” Hurd said, “in the volume of influence and complete degradation of trust in institutions, from the federal [government] all the way down to the local level … MITRE calls it ‘truth decay.’”
Both men are advocating for governments to step up and step in much more aggressively to address the issue, both in their own countries and internationally.
“One of the failures here is the failure to translate existing safeguards to protect society in the modern world,” Collins said. He cited U.S. election laws that prohibit foreign contributions to American political campaigns as one example, noting there is no enforcement mechanism to address foreign interests purchasing online political ads. “It’s the failure of regulation to keep up with technology.”
In the U.S., much of the discussion has focused on revising Section 230 of the Communications Decency Act, which carved out an exemption allowing internet companies to host third-party content without being considered “publishers” responsible for the accuracy of that information. As lawmakers have learned more about the workings of social media platforms like Facebook and TikTok, including the role algorithms play in pushing extreme content to users in order to drive greater engagement, a consensus is emerging that changing the platforms’ exemption status is critical.
“The far left and the far right sometimes have the same opinions about what to do about [Sec. 230], for completely opposite reasons,” Hurd said. The debate centers on “whether these platforms are indeed publishers, should they have to follow the same rules as other [publishers]? … If so, then I think you’d see behaviors change.”
He suggested one way to make such a revision work would be that “if you’re open to the public, if your account [is one] that anyone can see, you should have to follow the same rules and regulations that other publishers have to follow … Now if you’re closed, only people who have subscribed can see your account, then you’re not considered a publisher because you’re [only talking to your friends].”
Hurd said the U.S. government should look at how it fought online disinformation spread by Islamic extremists during the Obama administration. “We knew who they were targeting,” he said. “We were able to inoculate some of the community against those messages. That’s the type of model the government can use … Make sure people are able to distinguish fact from fiction when they read it on their devices.”
Hurd said state governments also can do more to fight disinformation on topics such as elections by educating citizens on the entire election process, including how voting machines are secured before and after Election Day and how mail-in ballots are processed, “so that when somebody on social media says something crazy you can say ‘that’s not how the process works.’”
Collins, the co-founder of the International Grand Committee on Disinformation, said partnership between nations on the issue is critical.
“What we’re seeing in technology regulation around the world is a leveling-up process,” he said, with “the U.K., with the online safety bill, the first to establish a [baseline].” Australia and other countries are considering similar measures, Collins said.
As countries move to try to get mis- and disinformation under control, Collins said it would be valuable for regulators in each country to be able to share their insights. “Tech companies themselves often can’t explain what they really need,” he said. “Standardizing the powers regulators have. Our best hope [for] this is better regulation, better [regulatory] bodies, and better enforcement needs to be taken … Companies will never do this on their own. We need to create proper structures.”