Bureaucratic delays and 'long chains' of reviewers leave the public with outdated information about government tech projects.
In the eight months since its debut, the improved Federal IT Dashboard, which rates U.S. agency technology projects and posts the results on the Internet for the world to see, has not won the hearts of all IT project managers.
Putting cost and performance data "on a public website was not warmly received by every agency," said Beth Ward, head of IT program and portfolio services at the Federal Aviation Administration. "My agency wasn't fond of it either."
Managers worry about who will see the ratings their projects receive -- green for "normal," yellow for "needs attention," or red for "significant concern" -- and the amount of money they are spending on information technology. Governmentwide, IT costs $80 billion a year.
Agencies worry most that low ratings will catch the eye of Congress, or investigators at the Government Accountability Office, several government technology officials said. But it turns out, Ward said, "Everybody's looking at this data," including IT companies, watchdog groups, and particularly the Office of Management and Budget, which operates the dashboard and has substantial influence over technology spending.
Poor dashboard ratings can lead to a TechStat, which is a detailed examination of a program followed by a face-to-face meeting between the agency chief information officer and the chairman of the agency's investment review board.
About 70 TechStats have been conducted so far, leading to the cancellation of at least four poorly performing programs, and the restructuring of 10 others, according to GAO. That's out of about 800 major government IT programs.
"We feel your pain," Ward told government IT program managers and vendors at a conference on capital planning and investment control. Ward is co-chairwoman of the IT Dashboard Transparency Working Group, which is looking for ways to improve the dashboard.
"Sometimes it causes fear and trepidation because the data is not always an accurate reflection" of a program's value, she said.
This isn't the first complaint about bad data on the dashboard. GAO reported last summer that some of the data regarding eight programs it examined was between two months and two years old. The dashboard is supposed to "represent near real-time performance information," GAO said.
An updated GAO evaluation of the dashboard is scheduled to be released April 15, the agency said.
The problem of outdated data hasn't been solved. A program manager should know by Friday if his program missed a major milestone on Monday, Ward said. But such data rarely, if ever, shows up on the dashboard that fast.
Some agencies have said that before they can deliver data to the dashboard, they must send it to contractors and have government contracting officers sign off on it, according to Ward. "That could take three or four months. Four-month-old data is too old to make decisions from," she noted.
Errors can stem from mistakes in old data that was compiled before the dashboard existed and has not been cleaned up, said Eva Desiderio, a senior IT analyst at the Agriculture Department.
Other data inaccuracies stem from agency manipulation, said Dominic Sale, an IT investment policy analyst at OMB.
Cost, performance and schedule data often go through "long chains [of reviewers at agencies], and by the time it gets to the other end, it may be how the agency wants to present the data, but it's not necessarily the truest data," he said.
Ward said her working group is searching for ways to better standardize data reporting. She asked large corporations, including Microsoft Corp. and Federal Express, how they assess the performance of their IT programs, "and they said they look at the data," she said.
The dashboard is designed to assess whether IT programs are "on track and doing what they're supposed to do," Sale said. For those that are not, the TechStat is a means to fix them or terminate them.
"We're not doing this to hurt you, to cause you pain," he told the conference audience. But the TechStat process is here to stay, and similar CyberStats and AcqStats are being introduced to assess cybersecurity and acquisition practices, he said.