
Treasury, Recovery Board Officials Embrace Efforts to Standardize Federal Data


It’s time for the government “to come into this century” and recast agency spending data so that it’s standardized and interoperable, a key Recovery Board official told an industry breakfast on Thursday.

Nancy DiPaolo, the independent board’s chief of congressional and intergovernmental affairs, said “it was so exciting” when the House on Nov. 18 nearly unanimously passed the Digital Accountability and Transparency Act, or DATA Act, which would codify the standardization movement, following approval of a similar bill by the Senate Homeland Security and Governmental Affairs Committee.

But neither the Recovery Board nor the Treasury Department—the two agencies spearheading the movement toward open and consistently formatted federal data—is waiting on the bill to begin embracing the effort, said Hudson Hollister, executive director of the Data Transparency Coalition, the industry and nonprofit group that sponsored the event.

Marcel Jemio, chief data architect of Treasury’s Fiscal Service and a veteran of the private sector, said, “The importance of data is like the new black. It used to be considered a back-end thing, as in let the IT guys do it. But there’s been a mind shift in how it’s thought of. We’re not just talking techies, but how data drives business.”

Getting the government’s data in sync with the business world would be a “wedge” in restoring the public’s fallen trust in government, Jemio added, in addition to boosting economic growth and improving management, though he acknowledged that openness makes government vulnerable. “I know what it takes to get to the C-word in this town, consensus, to work collaboratively,” he said. “The future is bright for standardization, and leaders must embrace the uncertainty.”

Marisa Schmader, a project support division director at Treasury, said her agency has embarked on a “research and development effort, a prototype in intelligent data,” which, like the familiar consumer barcode, can provide financial products with “context and information that makes it open and accessible in meaningful ways.” She said the current budget crunch is “an opportunity for standardization.”

Christina Ho, executive director for data transparency at Treasury’s Fiscal Service, described how Treasury has assumed more responsibility for cross-government leadership on data, such as running the USASpending website. “I don’t think we’re good at implementing things across the board,” she said, in part because “we can’t make it mandatory—we have to build a collaborative forum.”

The Recovery Board, since its founding in 2009, has developed “a unique ability to standardize and analyze large volumes of data,” said Executive Director Ross Bezark, noting that the board’s authority was recently extended for two more years following the emergency aid spending package for victims of Hurricane Sandy. “We show that government can provide data to the public as well as for use by experts and program managers,” a need that was discussed at least 50 years ago when President Lyndon Johnson’s team tackled issues around automated data processing, he said. “Introducing consistency will allow better transparency and oversight while saving money, and give government a better ability to prevent fraud or at least interrupt it sooner.”

DiPaolo added that the board has learned good lessons by collaborating with multi-agency inspectors general and with state governments, noting that governors pressed the federal government to track where spending actually goes rather than merely “chucking money and walking away.” “We created Recovery.gov, but 55 states and territories each created their own websites to provide accountability and give people a say in government,” she said.

Under sequestration’s pressure to do more with less, she added, “Federal agencies want to spend more on their missions rather than dumping money into old systems that require them to report tons of data and not know what to do with it. This is not a new reporting requirement.”

Passage of the DATA Act is not considered a sure thing, given some disagreements on the role of the Office of Management and Budget and the centralization of data. The Congressional Budget Office on Wednesday released a score of the Senate version of the DATA Act, saying implementation would cost $300 million from 2014 to 2018.

(Image via Dmitry Strizhakov/Shutterstock.com)
