Inside America’s next information war

Washington is paring back its defenses against influence operations, even as adversaries supercharge them with AI. Much of the fight is shifting to the private sector.
Earlier this year, as North Korea began sending more soldiers to Russia to assist in its war against Ukraine, Maggie Feldman-Piltch turned to a group of adult content creators for their help.
The creators had noticed an uptick in subscribers from the DPRK who suddenly had access to a less restrictive internet environment than they were used to back home, including adult content recommended by their Russian counterparts.
Feldman-Piltch requested that the creators do things such as open a refrigerator on camera or casually mention needing to go to a doctor’s appointment while filming.
A short time later, a North Korean soldier was interviewed by Ukrainian media, where he talked about wanting to experience ordinary activities like going to a grocery store and pushing a shopping cart. Several of the creators recognized the soldier’s voice as one of their clients. It was mission success.
The DPRK is widely regarded as one of the most restrictive and oppressive nations on the planet.
“This is a group of people who probably haven’t seen a full-size refrigerator, let alone one filled with food,” Feldman-Piltch said. Black Iceberg Holdings, her company that helped steer this narrative effort, has been operating in stealth for more than a year. Nextgov/FCW is the first to report its existence.
Such an effort is known as an influence operation — a coordinated attempt to sway public opinion and decision-making. It’s a tactic that has existed for quite a while.
During World War II, William Joyce, known widely as Lord Haw-Haw, was a Nazi propagandist, delivering English-language broadcasts to the United Kingdom and other Allied nations to sow fear and dismay. In the United States, the Treasury Department’s Writers' War Board sought to counter those efforts with its own pro-American content.
Years later, consumer electronics and social media have made influence campaigns more covert and scalable. U.S. intelligence agencies concluded that Russia exploited those tools in an attempt to sway the 2016 election in Donald Trump’s favor. Trump and his allies have rejected that finding, branding it the “Russia hoax.” His administration, and now Director of National Intelligence Tulsi Gabbard, have advanced the claim that spy agencies and law enforcement were weaponized against him to discredit his campaign and undermine his presidency.
Since Trump’s return to the White House, the government has scaled back many of the nation’s offices used to track and counter influence operations targeting Americans at home, on the grounds that they were ultimately used as political tools to censor Americans’ free speech.
Those offices include the Foreign Malign Influence Center in the Office of the Director of National Intelligence, as well as the FBI’s Foreign Influence Task Force. They were built to identify and disrupt covert campaigns by foreign governments that spread disinformation, amplify divisions and undermine confidence in U.S. elections and institutions.
The pullback comes at a time when advanced AI tools allow content to be produced and disseminated faster than ever before. Black Iceberg is just one part of a growing digital ecosystem of private-sector firms working to tip the balance of views about the United States and democracy around the world. But foreign adversaries are doing the same.
China, notably, has engaged at least one private company that uses advanced AI tools to carry out influence schemes, building data profiles on American lawmakers and other major figures, a recent Vanderbilt University analysis found. And a year ago, U.S. assessments concluded that Russia conducted sustained efforts with companies to launch disinformation campaigns leading up to the 2024 election.
Now, new opportunities are presenting themselves for foreign rivals: A slew of major gubernatorial elections is around the corner, midterm elections are set for next year and a presidential election is coming in 2028.
U.S. offices that keep their eyes on these matters have been reduced and restructured, and the age of advanced AI tools is just beginning.
The U.S. is not doomed, experts in cyber and influence campaigns said, but it will now take a whole-of-nation approach — government, cybersecurity companies, creators, artists, brand experts and more — to combat what they see as a major upgrade in information warfare capabilities in the years to come.
“It’s one of the biggest challenges we have in the world right now,” said Dave DeWalt, CEO of cybersecurity venture capital firm NightDragon, whose portfolio companies include firms that track influence efforts. “Where do I go for the truth? How do I figure out the truth? What is AI-generated? What’s real?”
Individual companies, big or small, will have to contend with these issues and tap into diverse streams of data from social media platforms and other online sources to keep pace with these threats, he said. Viral trends fueled by social media influencers and betting platforms could be hijacked to covertly amplify false claims.
“You can see one example after another where … you have this hyperlocal influencer that can get magnified in scale with bots and networks, and suddenly you have a risk to your brand and a risk to your value. And there is no hack, not at least a digital hack,” he said. “This is a brand breach problem, it’s coming to a theater near you in a big way, and we’re just seeing the beginning of this.”
AI tools have indeed drastically increased the scalability, reach and quality of influence operations targeting the United States and its allies, said Deric Palmer, the former assistant special agent in charge for the cyber field office of the Army’s Criminal Investigation Division and now managing director of Asc3nd Tech’s open-source and human intelligence portfolio. While in service, Palmer developed the Digital Persona Protection Program that provides defense cyber measures to key personnel like the defense secretary and the Army’s chief of staff.
He often dealt with social media accounts that impersonate officials. Last year, his team of just three people identified and mitigated 242,000 of them, often relying on private-sector contractors to automate the search for and identification of the sham accounts.
“I wouldn’t have been able to do it on my own,” he said. “So I don’t think just one government entity can do it.”
Social media platforms largely rely on users to report fake accounts, which also made the job challenging, he added. Agencies like the FBI and Cybersecurity and Infrastructure Security Agency had worked with social platforms to facilitate takedowns of content deemed false. Those agencies have largely pulled back from coordinating content takedowns since around mid-2023, in part under the pressure of conservative lawsuits alleging their communications with social platforms amounted to censorship of free speech.
Palmer estimates that hundreds of millions of impersonation accounts are present on social media sites right now. Some could theoretically be engineered to influence populations and large-scale decision-making, depending on who owns and controls them. They’re easy to stand up with generative AI tools, which can also be used to craft more realistic-sounding materials for social feeds.
“The first thing between the private sector and the public sector and what leaders need to do is admit that there’s an information operation problem,” he said.
Inside the Office of the Director of National Intelligence, a recent overhaul has shifted the Foreign Malign Influence Center’s functions under the agency’s National Counterintelligence and Security Center and National Intelligence Council. An ODNI release argues that a previous version of FMIC was redundant and infringed on Americans’ constitutional rights when it coordinated with social media firms.
Not much else has been made public about the dissolution of the center, listed as part of a broader “ODNI 2.0” plan announced by Gabbard in August. It’s also not clear if ODNI has named an election threats executive responsible for leading the intelligence community on election security for major races like the 2026 midterms.
Asked about this, an ODNI official said in an email that, under the Biden administration, “the Election Threats Executive was replaced by the Foreign Malign Influence Center” and that “the function changed names from ETE to FMIC and then grew in size.”
ODNI opened the Foreign Malign Influence Center in 2022 and placed the election threats executive — created in 2019 under Trump’s first term — within that center. The executive is typically an individual assigned to coordinate spy agencies’ election threat responses and to oversee an experts group that analyzes intelligence on foreign interference efforts.
ODNI did not respond to a request for clarification on the status of the executive.
“Our country’s ability to counter foreign influence operations has never been stronger, and that is thanks to President Trump’s leadership and DNI Gabbard’s historic efforts to transform ODNI into the most actionable and efficient version,” ODNI Press Secretary Olivia Coleman said in a statement. “As we’ve said publicly on numerous occasions and as Congress has been told directly, core functions and expertise to ensure the safety, security and freedom of the American people have not been affected.”
Tracking information operations is a delicate balance, said one former U.S. official, who requested anonymity to speak freely. Intelligence analysts often struggle to balance weighing the potential harm of suspected propaganda against the principle that Americans should be free to see and judge information for themselves.
Reassessments of U.S. work on influence operations are normal and tend to swing back and forth in emphasis, the ex-official said. But regardless of those shifts, foreign adversaries will keep using information warfare as a daily tool — especially to create distrust at home — and those dynamics can’t be ignored.
It’s not just a U.S. problem: cyber tools are borderless, and so are the threats they enable.
A blog post from the Foundation for Defense of Democracies warned that Russia, China and Iran have already exploited global protests, resource supply chains and U.S. elections through coordinated disinformation and hack-and-leak campaigns.
“America is letting its adversaries win the information war. The latest blow to U.S. efforts to fight this war came in August, when the U.S. intelligence community learned of the reduction in size and reorganization of the Foreign Malign Influence Center,” it says. “This ill-advised action comes on the heels of similar reductions within the State Department, the Department of Homeland Security and the FBI.”
Solving today’s information operations problem “requires a meaningful place for creatives, for musicians [and] for people outside of government,” Feldman-Piltch said. “It requires the citizenry to be engaged and supportive of their own nation. The best way to keep that from happening is to divide them and degrade them.”
“So you could have all the people and all the money in the world in government focused on this, and it wouldn’t be enough, because that’s not how this works,” she added. “We have to tell the truth better than they lie.”