Government should set an example for the private sector in cybersecurity and resilience, Inglis says

Chris Inglis speaks at a Council on Foreign Relations event in April 2022. Drew Angerer/Getty Images

Former National Cyber Director Chris Inglis discussed why a lighter regulatory approach is crucial to keeping emerging technology adaptable, particularly for cybersecurity and AI.

A more flexible regulatory approach to the U.S. government's emerging technology policy will be key to securing the country's cyber infrastructure amid ongoing digital threats and new generative systems, alongside fostering deeper partnerships with the private sector, according to Chris Inglis, the former national cyber director.

Discussing the evolving relationship between the public and private sectors as both work to fend off increasingly severe cyberattacks, Inglis, the first to hold the national cyber director position, serving from June 2021 to February of this year, dispelled the myth that the two sectors are fighting similar but ultimately separate fights.

“They're actually trying to solve the same problem against the same adversaries, and each of them has unique capabilities, perspective and authorities,” he told Nextgov/FCW.

Inglis, who now advises the firms Paladin Capital, Securonix and Semperis, said that within the Office of the National Cyber Director, the goal was to make regulation of technological and cyberspace issues more coherent without stifling beneficial innovation or the rights of private corporations. Striking this balance between imposing legal mandates and fostering “self-enlightenment” at companies will be a matter of incentivization.

“There are companies who, under market forces, are kind of seeing the light, and they're beginning to invest in safety, resilient security,” he said. “But that will not take us far enough. The voluntary and market approach — or market-based approach — always lines up a little bit short because markets have a tendency to kind of wax and wane.”

The light-touch regulatory approach Inglis advocates is underpinned by having federal agencies lead the way in smarter digital infrastructure management.

“Whatever we tell others they should do to create resilience and a proper defense of their digital infrastructure, the government should do first,” he said. 

On the emerging regulatory concerns with artificial intelligence, Inglis likewise asserted that before federal mandates for how to use and deploy AI systems are codified, foundational principles — such as maintaining a human-centric approach to developing AI technologies — need to be understood. 

“We cannot afford to run the risk of regulating the thing that we know at the moment,” he said. “This generative AI is an exemplar of the nature of technology at this moment in time; it moves so fast and surprises us at every turn.”

Ensuring a human authority has a thorough understanding of what a given AI system is authorized to do is one pillar of Inglis's suggested principles. Keeping a human user involved in an AI technology's operation ensures a control mechanism remains in place that is capable of countering or overriding system processes.

“We need to make sure that we double down on the concept that we're not trying to create autonomous technology,” he said. “We're trying to create automated technology that supports and remains connected to the human being.”