How government outreach can combat election disinformation


COMMENTARY | We need a solution in which messages are non-partisan, easily accessible and easy to understand.

I have taught technology and privacy law classes at the University of Illinois Chicago School of Law for over a decade. In recent years, the impact of technology and the law on elections has been an area of increasing interest for my students. I teach my students how to recognize disinformation and misinformation campaigns and to understand how nation-states and adversarial groups can manipulate the perceptions of targeted sections of the public and influence their votes.

In many cases, targeted voters can even be persuaded to vote against their own interests through the selective introduction of biased information into their social media feeds. I want my students to leave my class equipped to contribute to our society as educated and informed voters.

At the same time, I realize they are only a tiny portion of the population. Most voters are not proactively taught about the ways in which their opinions and beliefs are actively being manipulated.

In 2024, U.S. citizens will be charged with selecting the next leader of their country. They must be prepared to confront the widespread dissemination of false news, misleading memes and deepfake content that political and global adversaries will use to influence their votes. The emergence of compelling generative artificial intelligence tools, and the ease with which they can be used to create persuasive text, visuals and video at minimal cost, will accelerate and compound the impact of disinformation campaigns.

How can our societies resist these attempts to subvert the democratic process through the manipulation of our worldviews? Citizens, young and old, must learn new techniques to validate their news sources. They need to know how to look at different platforms with a skeptical eye and to reconcile what those platforms say with what government websites, other news outlets and other respected organizations are reporting. Consulting reliable sources like the Netherlands-based Defend Democracy, the University of Pennsylvania's FactCheck.org and the National Conference of State Legislatures will help distinguish fact from fiction. Citizens should act responsibly on social media platforms by confirming their facts before sharing, and by not impulsively liking or forwarding unsubstantiated content.

One of the biggest challenges is that most people trust their friends and families the most — but those friends and families may be targeted by the same disinformation campaigns. Education and awareness campaigns are needed to counter the onslaught of disinformation and misinformation intentionally designed to confuse voters. The good news is that there is a proven model for addressing manipulative messaging, one that has been tested by cybersecurity leaders in both government and the private sector. Any cybersecurity leader will tell you that we cannot prevent everyone in a company from making mistakes and clicking a link in a phishing email. But we can and do minimize the number of people these malicious messages victimize. How? By training them to recognize the tell-tale red flags, quarter after quarter and year after year.

We teach our people how to spot phishing emails, not to open messages that look suspicious, not to respond to them if opened, and what to do if they accidentally click a link. These lessons have reduced the damage attributed to phishing within many organizations, sometimes drastically. According to a 2022 survey by Proofpoint, 84% of U.S. companies reported that security awareness training reduced their phishing failure rates. We can use similar tactics to show people how bad actors try to manipulate them and their families, and give them tools to resist psychological manipulation. If comparable training could drive a similar reduction in the disinformation shared on social media, our societies would be better off for it.

Most security departments already maintain security awareness communications, periodic educational requirements and on-demand training. The delivery infrastructure for these programs can be repurposed to train people to recognize election disinformation techniques alongside malware delivery and fraud enticements. Governments can deliver this type of message via public service announcements on streaming media, broadcast TV and radio, billboards, signage in public buildings, social media posts and other means.

The U.S. government has traditionally sponsored PSAs and public safety awareness campaigns. Unfortunately, that approach would not be highly effective in this case, since a significant portion of the population does not trust the government to perform this task without bias.

We need a solution in which messages are non-partisan, easily accessible and easy to understand. This could be done by creating an independent entity responsible for vetting and countering misinformation and disinformation. This isn’t without precedent. For example, in cases where non-partisan investigations are required to determine whether to charge a political figure with a crime, the attorney general has the authority to appoint an independent special counsel to investigate and report back.

Similarly, Congress could appoint an election information ombudsman to independently vet election-related misinformation, without political bias. The ombudsman should be focused solely on assessing the accuracy of the information in question and should have no financial or political interest in the outcome of their reviews. This appointee could be subject to congressional approval but remain independent of congressional or political influence.

Congress can also help by quickly passing legislation aimed at educating U.S. citizens about election misinformation. For example, the Digital Citizenship and Media Literacy Act, introduced by Sens. Amy Klobuchar, D-Minn., and Michael Bennet, D-Colo., along with Rep. Elissa Slotkin, D-Mich., would strengthen voter literacy education by teaching Americans to identify online misinformation and disinformation. It calls for a grant program at the Department of Commerce to teach digital citizenship and media literacy skills. Klobuchar and Bennet have also introduced the Veterans Online Information and Cybersecurity Empowerment Act, which would create a grant program at the Department of Veterans Affairs to teach veterans digital and media literacy skills and how to identify disinformation and online scams. Even if funded by the government, this education will be most effective if delivered by a neutral party.

In an era in which technology evolves faster than people can adapt, all voters would benefit from knowing how to protect themselves from manipulation of their beliefs and how to ensure their votes are based on accurate information. While we should not expect an immediate end to voter misinformation, we should push forward sooner rather than later.