
Policy would require agencies to scan for network threats every 72 hours and begin patching holes



Later this month, the Homeland Security Department will present federal computer contractors and remote cloud suppliers with standards for finding and fixing cyber threats within 72 hours, DHS officials announced on Thursday.

The new approach aims to resolve what some cybersecurity specialists view as a flaw in the principle of automated “continuous monitoring” that the White House called for in 2010. Real-time tracking of potential network threats is intended to identify weaknesses faster and more economically than the old policy of manually reporting on computer inventories and incidents once a year. But spotting all the risks to an organization’s personal computers and Internet connections does not make data any safer, critics note. Fixing them quickly does.
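
To make the critics’ distinction concrete, here is a minimal Python sketch of a find-and-fix check under a 72-hour deadline. Everything in it is illustrative: the host names, CVE identifiers and findings feed are invented, and in practice a scanner would populate the records automatically.

    from datetime import datetime, timedelta, timezone

    REMEDIATION_WINDOW = timedelta(hours=72)  # the proposed find-and-fix deadline

    # Hypothetical findings feed: each record notes when a weakness was
    # first detected and whether it has since been patched.
    findings = [
        {"host": "web-01", "cve": "CVE-2012-0001",
         "detected": datetime(2012, 6, 20, 9, 0, tzinfo=timezone.utc),
         "patched": False},
        {"host": "db-02", "cve": "CVE-2012-0002",
         "detected": datetime(2012, 6, 23, 14, 0, tzinfo=timezone.utc),
         "patched": True},
    ]

    def overdue(records, now=None):
        """Return unpatched findings older than the 72-hour window.

        Monitoring alone only fills `records`; this check is the
        mitigation half -- flagging what was found but never fixed.
        """
        now = now or datetime.now(timezone.utc)
        return [r for r in records
                if not r["patched"] and now - r["detected"] > REMEDIATION_WINDOW]

    for r in overdue(findings):
        print(f"OVERDUE: {r['host']} {r['cve']} found {r['detected']:%Y-%m-%d}")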

Resolving identified weaknesses rapidly is the goal of the new procedures and, according to some government advisers, agencies could eventually be required to adopt them. “We’re initiating the discussion and we are asking for comment,” DHS National Cybersecurity Division Director John Streufert told Nextgov on Thursday.

Homeland Security officials will describe the standards in depth to industry officials June 25-26, Streufert said earlier in the day during a talk co-hosted by Government Executive Media Group, which includes Nextgov. He spearheaded the original continuous monitoring movement during his earlier tenure as the State Department’s chief information security officer.

“Think continuous monitoring and mitigation,” said SANS Institute Research Director Alan Paller, who added that the term “continuous monitoring” has been misinterpreted in practice. “Knowing [of a weakness] and not fixing it is dangerous.” SANS, a computer security think tank, co-sponsored the event.

Streufert said the mechanics also will help shape rapid response in the cloud, where agencies do not always have physical control over their data. The standards are not a mandate, he said, but rather a template for how to tailor procedures to every sort of computing environment from federal data centers to corporate offices.

The continuous-mitigation standards follow the June 6 launch of the Federal Risk and Authorization Management Program, or FedRAMP, which certifies the security of outsourced computer centers. FedRAMP is a process in which third-party auditors first check that a cloud service meets a set of uniform, governmentwide security controls, and then offer the resulting accreditation documentation to all agencies for free reuse. Any department can then adopt the authorized service, as long as the cloud vendor continuously monitors those controls for its federal customers.
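
FedRAMP’s “do once, use many times” model boils down to a shared registry of audit results that any agency can consult before adopting a service. The Python sketch below is a rough illustration of that flow under stated assumptions; the class, registry and service names are hypothetical, not part of the actual program.

    from dataclasses import dataclass

    @dataclass
    class AccreditationPackage:
        """One third-party assessment against the governmentwide baseline."""
        cloud_service: str
        auditor: str
        controls_met: bool            # passed the uniform security controls
        continuously_monitored: bool  # vendor keeps watching those controls

    # Certified once by a third-party auditor ...
    registry = {
        "ExampleCloud": AccreditationPackage(
            cloud_service="ExampleCloud",
            auditor="Example 3PAO",
            controls_met=True,
            continuously_monitored=True,
        ),
    }

    def agency_may_adopt(service: str) -> bool:
        """... reused for free by any agency, provided the vendor still
        continuously monitors the controls for its federal customers."""
        pkg = registry.get(service)
        return bool(pkg and pkg.controls_met and pkg.continuously_monitored)

    for agency in ("Agency A", "Agency B"):
        print(agency, "may adopt ExampleCloud:", agency_may_adopt("ExampleCloud"))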

“Every agency is still fully responsible for the security of their operating environment,” Dave McClure, an associate administrator at the General Services Administration, which manages FedRAMP, said during the talk. The procedures for automated, continuous reporting to agencies are still a work in progress, Streufert said.

McClure said the government, as of June 13, had received 22 applications from cloud companies for certification. FedRAMP is obligatory for all Web services that will have a limited or serious impact on government operations if disrupted. The Office of Management and Budget is responsible for enforcing the program, which McClure said he expects will happen each year when agencies file reports on compliance with federal computer security law. “I think they’ve got some leverage,” he said, without going into details.
