An event hosted by the leading trade association for major tech vendors highlighted what has so far been an impasse between government and industry on cybersecurity policy.
National Cyber Director Chris Inglis said his office is reviewing legislation that would start the process of requiring providers of critical information and communications technology to make certain security features standard in their offerings.
“When you buy a car today, you don't have to independently negotiate for an air safety bag or a seatbelt or anti-lock brakes, it comes built in,” Inglis said. “We're going to do the same thing, I'm sure, in commercial infrastructure that has a security critical, a life critical, responsibility to play.”
Inglis spoke Monday at an event hosted by the Information Technology Industry Council, or ITI, as part of his effort to engage the private sector in a collaborative approach to cybersecurity.
As demonstrated through its establishment and resourcing of the Cybersecurity and Infrastructure Security Agency, the government has relied heavily on the idea that organizations would voluntarily take measures to improve the cybersecurity of their enterprises. But the interdependence of various critical infrastructure sectors—and the potential for cascading effects when foundational information and communications technology within the ecosystem is targeted—have pushed some agencies, and members of Congress, to consider asserting their regulatory authority.
In the United Kingdom, the dynamic has led financial-sector regulators to take a more active role in overseeing cloud service providers.
“We've determined that those things that provide critical services to the public, at some point, kind of benefit from not just the enlightened self interest of companies who want to deliver a safe product,” Inglis said. “At some point in every one of those [critical industries like automobile manufacturing] we have specified the remaining features which are not discretionary. Air safety bags, seatbelts are in cars largely because they are specified as mandatory components of those automobiles.”
Inglis acknowledged it would be a lot more difficult to determine how such mandates should be applied to commercial information and communications technology, because of the breadth of their use across industry. But, he said, his office is providing counsel on proposals that are starting to do just that.
“We're working our way through that at the moment. You can see that actually kind of then in the form of the various legislative and policy kind of recommendations that are coming at us,” he said, noting most of the policy measures are in the form of proposed rules seeking advice on what counts as “truly critical.”
“I think that we're going to find that there are some non-discretionary components we will, at the end of the day, do like we have done in other industries of consequence, and specify in the minimalist way that is required, those things that must be done,” he said.
Reacting to Inglis’ comments, ITI President and CEO Jason Oxman said that “makes good sense.” But a representative of a high-profile ITI member company disagreed.
“Can I just say I really hate analogies?” said Helen Patton, an advisory chief information security officer at Cisco, during an industry panel following Inglis’ conversation with Oxman.
The automobile analogy—invoking simple but effective measures like seatbelts—has long been used by advocates of regulation to improve cybersecurity, not just at the enterprise level, such as federal agencies and other critical infrastructure customers, but in the design phases that occur earlier in the supply chain. Patton, however, argued the analogy is a poor fit for an approach to cybersecurity that relies on each organization’s own assessment and acceptance of risk.
“I think the problem with every analogy like that is that every individual makes a choice, whether they're going to read a food label, or wear a seatbelt, or use their brakes, or whatever the analogy is,” Patton said. “The reality is when you're trying to run a security program within an organization, you have to take that organization's risk tolerance into account. So it's good to get information out in front of folks, but it's really up to them whether or not they choose to act on it or not … not every security recommendation from a federal agency or a best practice is going to be adopted by an organization because they’ve got better things to do with their time and resources.”
Inglis drove home his point by highlighting the plight of ransomware victims across the country, many of which were caught up in supply-chain attacks, such as an incident last summer involving Kaseya, which provides IT management software for enterprises.
“We need to make sure that we allocate the responsibility across all of those, as opposed to leaving it to that poor soul at the end of the whip chain who, because no one else has brought down the risk, is at that moment in time facing up against a ransomware threat that they never thought they'd have to prepare for, that they have no basis to respond to because the infrastructure they're using isn't inherently resilient and robust,” he said. “We need to do what we've done in other domains of interest, which is to figure out what we owe each other.”