Policy would require agencies to scan for network threats every 72 hours and begin patching holes


The original headline on this story was modified for clarification.

The Homeland Security Department later this month will present federal computer contractors and cloud suppliers with standards for finding and fixing cyber threats within 72 hours, DHS officials announced on Thursday.

The new approach aims to resolve what some cybersecurity specialists view as a flaw in the principle of automated “continuous monitoring” that the White House called for in 2010. Real-time tracking of potential network threats is intended to identify weaknesses faster and more economically than the old policy of manually reporting on computer inventories and incidents once a year. But spotting all the risks to personal computers and Internet connections in an organization does not by itself make data any safer, critics note. Fixing them quickly does.

Resolving identified weaknesses rapidly is the goal of the new procedures, and, according to some government advisers, agencies could eventually be required to adopt them. “We’re initiating the discussion and we are asking for comment,” DHS National Cybersecurity Division Director John Streufert told Nextgov on Thursday.

Homeland Security officials will describe the standards in-depth to industry officials June 25-26, Streufert said earlier in the day during a talk co-hosted by Government Executive Media Group, which includes Nextgov. He spearheaded the original continuous monitoring movement as the former chief information security officer for the State Department.

“Think continuous monitoring and mitigation,” said SANS Institute Research Director Alan Paller, who added that the term “continuous monitoring” has been misinterpreted in practice. “Knowing [of a weakness] and not fixing it is dangerous.” SANS, a computer security think tank, co-sponsored the event.

Streufert said the mechanics also will help shape rapid response in the cloud, where agencies do not always have physical control over their data. The standards are not a mandate, he said, but rather a template for how to tailor procedures to every sort of computing environment from federal data centers to corporate offices.

The formulation of the continuous fixing method follows the June 6 launch of the Federal Risk and Authorization Management Program for certifying the security of outsourced computer centers. FedRAMP is a process in which third-party auditors initially check that a cloud service meets a set of uniform, governmentwide security controls, and then offer the resulting accreditation documentation to all agencies for free reuse. Any department can then install the authorized service as long as the cloud vendor continuously monitors those controls for their federal customers.

“Every agency is still fully responsible for the security of their operating environment,” Dave McClure, an associate administrator at the General Services Administration, which manages FedRAMP, said during the talk. The procedures for automated, continuous reporting to agencies are still a work in progress, Streufert said.

McClure said the government, as of June 13, had received 22 applications from cloud companies for certification. FedRAMP is obligatory for all Web services that will have a limited or serious impact on government operations if disrupted. The Office of Management and Budget is responsible for enforcing the program, which McClure said he expects will happen each year when agencies file reports on compliance with federal computer security law. “I think they’ve got some leverage,” he said, without going into details.
