Commentary: Head count isn’t the answer for cyber defense

It is reassuring that the Pentagon, the nation’s leading cyber defense agency, has announced plans to beef up its cyber protection capabilities. Officials plan to expand U.S. Cyber Command with three sets of teams focused on privately owned critical infrastructure, military operations and Defense Department networks. It is questionable, however, whether throwing head count at the problem will address what is really a big security data challenge.

Unfortunately, relying on manual processes to comb through mountains of logs is one of the main reasons critical issues are not being addressed in a timely fashion. According to Verizon’s 2012 Data Breach Investigations Report, 92 percent of breaches were discovered by a third party and not through internal resources.

The ultimate goal is to shorten the window attackers have to exploit a software or network configuration flaw. Big data sets can help put specific behavior into context, but there are some real technological challenges to overcome. A March 2012 report by technology research firm Gartner puts the magnitude of the problem in perspective. “The amount of data required for information security to effectively detect advanced attacks and, at the same time, support new business initiatives will grow rapidly over the next five years,” the report said. “The amount of data analyzed by enterprise information security organizations will double every year through 2016. By 2016, 40 percent of enterprises will actively analyze at least 10 terabytes of data for information security intelligence, up from less than 3 percent in 2011.”

A continuous monitoring approach to protecting data is recommended by the National Institute of Standards and Technology and has become a mandate in the government sector. But it only adds to the big security data conundrum: increasing the frequency of scans and reporting dramatically increases data volumes. This raises the question: How can the Pentagon and other organizations take advantage of big security data without having to hire a legion of new employees?

Security monitoring generates big data, but in its raw form that data is only a means to an end. Ultimately, information security decision-making should be based on prioritized, actionable insight derived from it. Big security data must be correlated and ranked by its criticality, or risk, to an organization. Without a risk-based approach to security, organizations can waste valuable information technology resources mitigating vulnerabilities that in reality pose little or no threat. Big security data also has to be filtered down to the information relevant to each stakeholder’s role and responsibilities; not everyone has the same needs and objectives when it comes to leveraging big data.
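To make the risk-based approach concrete, here is a minimal sketch (not any vendor’s actual product logic; the scoring formula, field names and thresholds are illustrative assumptions): each finding is scored as severity times asset criticality, low-risk findings are filtered out, and the remainder are grouped by the stakeholder role responsible for remediation.

```python
# Illustrative sketch of risk-based triage: score = severity x asset
# criticality, filter below a threshold, group by responsible role.
# The scale, threshold and field names are assumptions, not a standard.
from dataclasses import dataclass


@dataclass
class Finding:
    host: str
    severity: float         # e.g., CVSS base score, 0-10
    asset_criticality: int  # assumed scale: 1 (low) to 5 (mission-critical)
    owner_role: str         # team responsible for remediation

    @property
    def risk(self) -> float:
        return self.severity * self.asset_criticality


def prioritize(findings, threshold=20.0):
    """Return only high-risk findings, grouped by owner role and
    sorted by descending risk within each group."""
    grouped = {}
    for f in sorted(findings, key=lambda f: f.risk, reverse=True):
        if f.risk >= threshold:
            grouped.setdefault(f.owner_role, []).append(f)
    return grouped


findings = [
    Finding("db01", severity=9.8, asset_criticality=5, owner_role="dba"),
    Finding("kiosk7", severity=9.8, asset_criticality=1, owner_role="desktop"),
    Finding("web02", severity=6.5, asset_criticality=4, owner_role="webops"),
]
triaged = prioritize(findings)
# db01 (risk 49.0) and web02 (risk 26.0) surface; kiosk7 (risk 9.8) is
# filtered out despite its high severity, because the asset is low-criticality.
```

Note the design point the article makes: kiosk7 carries the same raw severity as db01, but a risk-based filter suppresses it so scarce IT resources go to the mission-critical asset first.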

To deal with big security data and achieve continuous monitoring, the Pentagon and others must use technology like information security risk management systems to automate manual, labor-intensive tasks. ISRM systems make threats and vulnerabilities visible and actionable, while prioritizing high-risk conditions and allowing organizations to address them before breaches occur.

Torsten George is vice president of worldwide marketing, products and support for Agiliance, an IT security risk management firm.
