Why Don’t We See More Automation in Federal Networks?


Automation would cut the time from detection of cyberattacks to remediation from months to seconds.

John Breeden II is an award-winning journalist and reviewer with over 20 years of experience covering technology and government. He is currently the CEO of the Tech Writers Bureau, a group that creates technological thought leadership content for organizations of all sizes. Twitter: @LabGuys

Over the past few months, I was fortunate enough to be asked to evaluate several cutting-edge technologies designed to make government networks more secure. Some of these were more advanced than others, and a few were hindered by newer technologies like cloud computing. But they all showed a great deal of promise for the federal government if deployed correctly.

One of the most interesting possibilities is creating an event-driven architecture to add automation to the federal defensive arsenal. Given that a single router can generate over 100,000 data points every few seconds, a network of any size quickly grows beyond the ability of even teams of humans to protect 100 percent effectively. There is simply too much data and not enough analysts.


Attackers know this, and use all that data as cover to remain undetected once they breach a network. That is why the latest Mandiant M-Trends 2016 report found attackers were present on victims’ networks a median of 146 days before the breach was discovered. The government is no exception to this rule.

Automation could be the answer, reducing the time from detection to remediation from months to seconds. The basic concept is simple enough. It uses the power of the network itself to counter threats, making it a machine versus machine affair. That’s not unlike the classic "WarGames" movie, where a young Matthew Broderick gets the WOPR computer to play itself in a game to teach it futility.

The concept of automation in cybersecurity can be broken down into three basic levels. The first and most basic level is human-driven automation: a human operator needs to do something, like check a series of network devices for compliance issues, so they run a script to do the heavy lifting. This cuts down on operator workload and helps with onerous chores like patch management, but it doesn’t improve breach response times.
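A level-one script might look something like the sketch below. The device names, the fetch_config() helper and the compliance rule are hypothetical placeholders for whatever an agency’s tooling actually provides; the key point is simply that a human decides when to run it.

```python
# Level one: human-driven automation. An operator runs this by hand; it checks
# a list of devices against a compliance rule and reports the results.
# Device names, fetch_config() and the rule are hypothetical placeholders.

DEVICES = ["router-01", "router-02", "switch-07"]
REQUIRED_LINE = "ip ssh version 2"   # example rule: SSHv2 only


def fetch_config(device: str) -> str:
    """Placeholder: in practice this would pull the running configuration
    over SSH or a management API. Here it returns a canned example."""
    return "hostname {}\nip ssh version 2\n".format(device)


def check_compliance(device: str) -> bool:
    return REQUIRED_LINE in fetch_config(device)


if __name__ == "__main__":
    for device in DEVICES:
        status = "OK" if check_compliance(device) else "NON-COMPLIANT"
        print(f"{device}: {status}")
```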

The second level, which makes the most sense for federal agencies, is event-driven automation. Here, humans “teach” computers their various processes: if a computer goes down, open a trouble ticket; if a virus is detected, wipe the system and restore the core operating system.

Humans set those event triggers and program which responses to take automatically. Then they can remove themselves from the loop, though they can also keep a hand in things, such as having the computer notify a supervisor about a particularly dangerous trigger.
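In sketch form, a level-two setup is little more than a human-authored mapping from triggers to pre-approved responses, with an escalation path for the dangerous ones. The event names, responses and escalation rule below are hypothetical illustrations, not any particular product’s API.

```python
# Level two: event-driven automation. Humans define the triggers and responses
# in advance; the dispatcher then acts on events at machine speed.
# Event names, responses and the escalation rule are all hypothetical.

def open_trouble_ticket(event):
    print(f"Ticket opened for {event['host']}: {event['type']}")

def wipe_and_reimage(event):
    print(f"Re-imaging {event['host']} from the gold image")

def notify_supervisor(event):
    print(f"ALERT: supervisor notified about {event['type']} on {event['host']}")

# Human-authored playbook: trigger -> list of automated responses.
PLAYBOOK = {
    "host_down":       [open_trouble_ticket],
    "virus_detected":  [wipe_and_reimage, open_trouble_ticket],
    "priv_escalation": [wipe_and_reimage, notify_supervisor],  # keep a human informed
}

def handle(event):
    # Unknown triggers fall back to human review rather than silent inaction.
    for response in PLAYBOOK.get(event["type"], [notify_supervisor]):
        response(event)

if __name__ == "__main__":
    handle({"type": "virus_detected", "host": "workstation-42"})
```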

Computers are never actually doing anything beyond what they are taught, but can respond to security events at machine speed, automating the remediation of many threats, especially low-level ones, and freeing up analysts to work on larger projects or trickier situations.

The final level is almost science fiction at this point, though there have been glimpses of what could one day be possible in things like IBM’s Watson and Google’s AlphaGo software. At that level, computers still respond to events, but also program their own triggers and responses, possibly making processes even more efficient than the original human-driven plan.

So why don’t we see more automation in federal networks, even at level two?

The answer is that getting there requires both hardware and software. The software is available, but you also need event-driven hardware in the network to take advantage of all of automation’s benefits. That hardware is ready too, though installing it piece by piece could be a slow process. In an event-driven network, devices should be built so they can interface with one another, opening the door to true automation. Specifically, they should all have the following (a rough sketch of how these pieces fit together follows the list):

  • Device Configuration in Structured Formats. Devices should be able to be programmed using a standard and common interface across the enterprise.
  • On-device APIs. Especially critical for feds, each device needs to be able to run scripts natively onboard. Agencies using a FIPS 140-2 device on a closed network will need this so automation scripts can run entirely on the device without breaking tight standards compliance.
  • Full Configuration Rollback. If an automation script is not working as intended, any device in an event-driven architecture needs to roll back easily to a previous configuration. This keeps experimenting with new scripts from becoming a liability.
  • Support for Industry-Standard Models. For ease of use, each device should support industry-standard configuration models such as those from the IETF or OpenConfig.
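Taken together, those four properties enable a pattern like the sketch below, which applies a structured, OpenConfig-style configuration through an on-device API and rolls back automatically if anything fails. The FakeDevice class and its method names are hypothetical stand-ins, not a real vendor SDK.

```python
# Sketch: structured configuration with full rollback on an automation-ready
# device. FakeDevice and its methods are hypothetical; the config payload is
# shaped like an OpenConfig interfaces model.

import json


class FakeDevice:
    """Stand-in for a device with an on-board configuration API."""

    def __init__(self):
        self.running = {"openconfig-interfaces:interfaces": {"interface": []}}

    def save_checkpoint(self):
        return json.dumps(self.running)            # full-config rollback point

    def load_config(self, config_json):
        self.candidate = json.loads(config_json)   # structured, model-based config

    def validate(self):
        return "openconfig-interfaces:interfaces" in self.candidate

    def commit(self):
        self.running = self.candidate

    def restore_checkpoint(self, checkpoint):
        self.running = json.loads(checkpoint)      # roll back to known-good state


def apply_with_rollback(device, config):
    """Apply a candidate config; restore the previous one if anything fails."""
    checkpoint = device.save_checkpoint()
    try:
        device.load_config(json.dumps(config))
        if not device.validate():
            raise RuntimeError("validation failed")
        device.commit()
    except Exception:
        device.restore_checkpoint(checkpoint)
        raise


if __name__ == "__main__":
    candidate = {
        "openconfig-interfaces:interfaces": {
            "interface": [{"name": "eth0", "config": {"enabled": True, "mtu": 9000}}]
        }
    }
    apply_with_rollback(FakeDevice(), candidate)
```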

Once the hardware is in place, and several companies do offer automation-ready gear, the triggers and responses can be programmed to help fight cybersecurity threats at machine speed. The computers can do everything an analyst does without getting tired, hungry or bored.

Beyond cybersecurity, having an event-driven architecture in place also opens up new efficiencies. Automation can, for example, be used in data centers to automatically provision software-defined networks based on customer needs, establish micro-segments or apply services through service chaining.
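As a rough illustration, the same trigger-and-response pattern from level two can drive provisioning instead of remediation. Every name in this sketch (the event, the tenant field, the service chain) is hypothetical and stands in for whatever an agency’s SDN controller actually exposes.

```python
# Beyond security: the trigger/response pattern applied to data-center
# provisioning. The event name, controller actions and service chain are
# hypothetical illustrations, not a real SDN API.

def provision_tenant(event):
    segment = f"segment-{event['tenant']}"
    print(f"Creating micro-segment {segment}")
    print("Attaching service chain: firewall -> IDS -> load-balancer")
    print(f"Publishing {segment} to the SDN controller")

PLAYBOOK = {"new_tenant": [provision_tenant]}

def handle(event):
    for response in PLAYBOOK.get(event["type"], []):
        response(event)

if __name__ == "__main__":
    handle({"type": "new_tenant", "tenant": "agency-hr"})
```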

There are some impressive capabilities in this field, but the first benefits of automation for most agencies will almost certainly come in cybersecurity. Especially now, with a critical shortage of analysts and the government not hiring anyone new, technologies like automation need to be deployed quickly before agencies get steamrolled by the next wave of advanced attacks.