GAO raps State's computer security monitoring program

The State Department's pioneering efforts to automate the tracking of computer threats have failed to provide a holistic picture of its cybersecurity posture, federal auditors reported Monday afternoon.

In 2010, the department was at the forefront of a governmentwide shift to "continuous monitoring" of data security, Government Accountability Office auditors acknowledged. After State deployed technology to track network intrusions and other vulnerabilities, the Obama administration called for all agencies to measure compliance with protections mandated by the 2002 Federal Information Security Management Act. Previously, agency managers would submit paperwork annually to document that they were following the law.

But now a GAO report finds that State's continuous monitoring program "does not provide a complete view of the information security risks to the department."

The department developed a custom service, iPost, that was supposed to monitor information technology assets linking all of State's domestic and global offices -- networking and support that costs the department about $1.2 billion a year. iPost culls data on configuration settings, antivirus scans and other vulnerabilities for each departmental "site," which may be a particular facility or a specific computer. It then scores each weakness detected based on severity and generates a composite risk grade for the location -- "A" through "F." The grades and detailed scoring are intended to help system administrators identify the gravest problems and prioritize plans for fixing them.
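The scoring-and-grading approach described above can be sketched in a few lines of code. This is an illustrative sketch only: State has not published iPost's actual formula, so the severity weights, the per-host averaging, and the grade cutoffs below are all hypothetical stand-ins for whatever the department actually uses.

```python
# Hypothetical severity weights -- iPost's real point values are not public.
SEVERITY_POINTS = {"low": 1, "medium": 3, "high": 7, "critical": 10}

def score_site(findings, host_count):
    """Sum severity points for a site's detected weaknesses,
    averaged per host so large and small sites are comparable."""
    total = sum(SEVERITY_POINTS[f] for f in findings)
    return total / max(host_count, 1)

def letter_grade(avg_score):
    """Map an average risk score to a composite site grade, A (best) to F.
    The cutoffs here are invented for illustration."""
    for cutoff, grade in [(2, "A"), (4, "B"), (6, "C"), (8, "D")]:
        if avg_score < cutoff:
            return grade
    return "F"

# Example: a 10-host site with one low, one medium and two high findings.
avg = score_site(["medium", "high", "high", "low"], host_count=10)
print(avg, letter_grade(avg))
```

A composite grade like this lets administrators triage at a glance, which is exactly the prioritization role the article describes -- and also explains GAO's complaint: anything the scanners never see (routers, firewalls, physical security) simply contributes zero points to the grade.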

But the iPost service covers only computers that use Microsoft's Windows operating system, not other assets such as the roughly 5,000 routers and switches along State's network, non-Windows operating systems, firewalls, mainframes, databases and intrusion detection devices, GAO auditors said. Windows is on tens of thousands of workstations and servers at foreign and domestic locations.

GAO officials also noted that the 10 scoring elements iPost uses to measure protection levels do not include all the necessary safeguards. Missing from the surveillance list are physical and environmental protection, contingency planning and personnel security.

State officials told auditors during the review that they were considering expanding the program to encompass other devices and might add other controls in the future.

GAO officials discovered technical problems as well. For instance, iPost tools did not always scan computers when scheduled, or they created false positives that had to be analyzed and explained. One scanner vendor failed to update its technology to detect the latest, most common vulnerabilities. And tools manufactured by different suppliers produced disparate scores that staff then had to interpret and modify.

Department officials told auditors they are addressing each of these mechanical challenges. But State officials denied that GAO's findings suggest the continuous monitoring program offers an incomplete picture of threats.

"While achieving security of all areas affecting information security, as GAO suggests is admirable, it is impossible and impracticable," State Chief Financial Officer James Millette wrote in a June letter, in response to a draft report. "The Department of State's continuous monitoring risk scoring program was established and implemented with the intent of achieving appropriate, prioritized security, tailored to the unique threats and vulnerabilities facing the department."

State officials also rejected GAO's advice that iPost name the individuals responsible for monitoring the status of each site and repairing any flaws.

"The department fails to understand the necessity for individually naming those individuals responsible for monitoring the security state and ensuring resolution of security weaknesses of Windows hosts," Millette wrote.

Gregory C. Wilshusen, GAO's director of information security issues and the report's author, replied in the final report, "When we surveyed State officials, there was confusion at the department as to who was responsible for operational units and information provided to us on who was responsible was incorrect for several units."

The Army, which is testing a similar continuous monitoring approach, realized early in the process that specific people must be assigned to oversee each asset.

"One of the lessons learned from the pilot was the need to identify who, which Army organization, is responsible for ensuring the security of IT devices identified as not meeting specific compliance standards," Michael J. Jones, chief of the emerging technologies division within the Army's CIO/G6 Cyber Directorate, recently told Nextgov.