Any discussion of cybersecurity should probably begin by asking how our risk differs from that of other nations. The short answer is that in many countries, the government owns and controls the telecommunications industry and related cyber infrastructure. This is true to such an extent that when you pick up the phone, use a cellphone or a wireless service, transmit a fax, access the Internet, or send and receive email overseas, you're likely sending the communication to that nation's internal security or intelligence service.
Our tradition of private enterprise and limited government could make us more vulnerable to cyber threats. It also limits the federal government's ability to influence key security aspects of our cyber and other critical infrastructure.
Our discovery of this reality was gradual and painful, and it predates many of our current concerns about cybersecurity. Recall that with the many new telecommunications companies and business models emerging in the 1980s and 1990s, our intelligence and law enforcement agencies, which had long-term relationships with "Ma Bell" and AT&T, had to introduce themselves to the new companies. This was sometimes frustrating.
For example, when presented with judicial warrants, many of these new companies simply didn't know what to do with them, or they lacked the technical capability to comply with the orders.
This had to be fixed; otherwise, the breakup of AT&T would have left the nation's law enforcement and intelligence agencies unable to use even the limited authorities they had. The answer to this conundrum, indirect and cumbersome as it was, came in 1994: the Communications Assistance for Law Enforcement Act. CALEA basically did three things. It stated that the new telecoms had to comply with warrants and other official requests for information by law enforcement and intelligence authorities.
It set aside millions of dollars for the new telecoms so that they could build the capabilities necessary to comply with warrants and other official requests. And finally, it allowed the new firms to contract out these key responsibilities to so-called trusted third-party providers (this part was added in 2007, when the law was expanded to include broadband carriers).
The CALEA approach has worked so well that we apparently are going to do it again for all our broadband and cell service providers, which suggests our government's ability to influence the security of much of our private cyber infrastructure might be limited to throwing money and regulations at the problem. What's more, this approach isn't likely to assure uniform implementation or even compliance.
The Human Factor
If we took everything we know, especially what we have learned the hard way about spies and espionage, then multiplied it by a million, billion or even a trillion (the newest measure of government spending) we might come close to the counterintelligence risk presented from cyber technology.
Any and every secret we have in digital format can go viral in a microsecond. The potential damage is truly catastrophic.
Perhaps the best example of this risk is the WikiLeaks dump of classified State Department diplomatic cables. All were reportedly copied by a lone Army private in a matter of seconds, and apparently without anyone the wiser that there had been a data breach. Unfortunately, we are probably going to see more and more of this kind of thing. In short, imagine the damage dedicated spies like the CIA's Aldrich Ames and the FBI's Robert Hanssen could have done as cyber spies.
Any meaningful cybersecurity framework absolutely must have a well-thought-out counterintelligence annex. This should apply whether a military, civilian or intelligence agency is responsible for a particular activity or mission.
This insider threat can't be emphasized too much: Probably the largest short-term risk to cybersecurity is human: people motivated by traditional reasons for espionage or sabotage who access and pass information, or who weaken, damage, infect or disable cyber systems. In the final analysis, truly effective cybersecurity depends on truly reliable people.
Another significant challenge for federal agencies is that members of Congress are schizophrenic about many intelligence activities: They want to know about them, but they also sometimes want to deny knowledge of those activities after the fact, especially when it becomes politically expedient.
Congressional notification policies for cyber have to address everyone's equities. Also, Congress is as territorial as the executive branch, in that every committee with even a small interest in cybersecurity has drafted--and will continue to draft--legislation that applies to the whole of government, whether it makes any sense to do it or not. Congress can't help itself.
For example, having first created the Homeland Security Department in 2002 and the Office of the Director of National Intelligence in 2004--both against the desire of then-president George W. Bush and much of the executive branch--Congress will continue to enable and obligate these new entities with critical cybersecurity roles, missions and responsibilities, many of which they simply will not be able to perform for some time. This in itself extends our period of vulnerability in basic cybersecurity, primarily because of the learning curve required, and that period will be a dangerous one for us, whether it's acknowledged or not.
Because the National Security Agency is part of the Defense Department and there are long-standing policy objections to NSA assuming broad cybersecurity roles and responsibilities, there are massive and very expensive efforts under way to duplicate NSA capabilities in other agencies, including law enforcement agencies. Such a transition, duplication or transfer (however it's described) creates both a period of danger and an engraved invitation for a cybersecurity disaster. Accordingly, this process should be closely monitored by an independent and outside red team of technical and legal experts. In other words, this is way too important to screw up, yet it's almost bound to happen unless the process is very closely managed.
All the traditional ghosts in the intelligence collection business (and even some new ones) will also walk in the realm of cybersecurity. For example, most collection management principles and issues apply to information obtained by cyber means, as does the management of collection assets, whether human or technical. And obviously, cybersecurity issues will become increasingly important in "covert actions" and "special activities" as defined by public law and executive order, as well as military "special operations," including "information operations" and "public diplomacy," however these activities are defined or managed.
We must learn from past efforts to manage new categories of information. For example, when we entered the nuclear age in the 1940s and 1950s, we created, by statute, so-called restricted data and other subcategories for sensitive atomic energy-related information. Looking back on this, it probably caused more trouble than it was worth. It has also figured prominently in the mismanagement and compromise of some extremely sensitive information from time to time. Lesson: Let's not do this again in cybersecurity.
Finally, we must establish policies and rules for embedded technical capabilities and other inadvertent collections. Why? Throughout my federal service, some of the scariest situations I encountered involved the following: An "operator" with a military mission to destroy or disable something learns that he can also control it and learn things from it (the "operator" becomes a "collector"); or a passive "collector" sees that he has "active" capabilities against the system he is targeting.
You get the idea here: If there ever was an activity that amplified both the probability and significance of these kinds of security anomalies, it is the various cyber missions, passive or active, offensive or defensive, sensitive or routine. We need to have comprehensive plans to deal with them, as well as organized and thoughtful ways to take advantage of them and protect ourselves from exploitation and sabotage.
Where to Start
We had little choice but to start where we did with the organizations and laws we have, but we need many new or revised plans, policies and rules to guide us. And clearly, some of these, especially the ones that could possibly involve or affect citizens, should be approved and promulgated by the attorney general.
We also may discover some advantages to having a hodge-podge of dispersed private cyber infrastructure, and we should think about this aspect of our risk when we contemplate structural changes. Perhaps most important, however, when we want to know where to start and what to do first, we need an accurate and objective self-assessment, one that identifies what our most serious vulnerabilities are and where they lie.
The key recommendation here is that we should enable operational test teams (assembled from the NSA, the new U.S. Cyber Command, Homeland Security and the FBI) to actively probe our cyber infrastructure, both public and private, especially our dot-gov and internal "secure" systems, as well as our Internet nodes and service providers. These activities should be done primarily to identify our vulnerabilities and mitigate the risk.
This recommendation should be done in conjunction with enabling legislation, primarily because these teams could acquire information about U.S. citizens during such operations. For the limited purpose of assessing our nation's cyber vulnerabilities and then taking corrective action, however, these activities would be legal and politically defensible. In fact, existing limitations on the retention of inadvertently collected information should be adequate.
Such testing of our critical systems is not a new idea, and should be expanded to help guarantee our basic cybersecurity and help us assure our privacy at the same time. This is because privacy is inextricably linked to the security of the cyber systems upon which we depend.
Another key recommendation is to give someone or some agency the authority to shut down or interrupt various critical cyber systems, including the Internet.
This is nothing new, at least conceptually, and could mirror similar authorities held by agencies or officials during national emergencies. Such authority should be subject to approval by the attorney general or the president and require reporting to congressional leaders. It would be extremely limited, and could apply to an enumerated list of cyber threats.
I hope at least some of these recommendations are under way. I know many are not, however, because they require political will as well as amendments to the legal and regulatory framework that governs national security activities and intelligence operations.
In addition, we need to remember that state and local laws generally have fewer constraints on these kinds of activities. In fact, state and local legal regimes generally do not distinguish between law enforcement and intelligence activities, nor are there significant distinctions between the various uses for information collected or otherwise obtained by state or local authorities. This is an advantage that should not be ignored in the overall management of cybersecurity.
In the final analysis, while many cybersecurity issues are truly new and emerging, many others should evoke familiar (and perhaps painful) lessons from the not too distant past.
Most important, let's honestly assess the gravity of the threats against us and test our critical cyber systems by putting them under closely managed stress. It's probably the only taste of cyberwarfare reality we can give ourselves before someone or some nation or organization with malevolent motives shuts us down and watches us squirm.
Daniel Gallington was a member of the Senior Executive Service and deputy counsel in the Office of Intelligence Policy and Review at the Justice Department. He also served as general counsel for the Senate Select Committee on Intelligence and deputy assistant secretary of Defense for territorial security.