Ninety percent of open data experts interviewed in a new report believe the standardization and publication of government data have improved over the last few years of the Obama administration.
The report, released jointly today by the Data Foundation, an open data research organization based in Washington, D.C., and consulting giant Grant Thornton, includes a history of U.S. open data efforts and detailed feedback from more than 40 data transparency experts within and outside government.
The report credits the Obama administration’s “strong interest” in open data—including issuing the Open Government Directive in 2009, followed four years later by an executive order to make open data machine readable—but its assessment is tempered by challenges in standardizing data.
Data.gov, for example, boasts some 200,000 data sets from agencies as diverse as NASA, the U.S. Census Bureau and the National Oceanic and Atmospheric Administration. But it’s one thing to put the data out there for the public to consume or to potentially spark new industries; it’s another to standardize those data sets.
“The rate at which open data has been adopted by government at all levels exceeds substantially the rate at which the practices and standards relating to open data have matured,” said Waldo Jaquith, an open data expert at 18F, in the report. “There’s no central data set of government data repositories and their inventories, and no standard way to locate them.”
Jaquith’s comments outline what appears to be a large hurdle in the race to open the floodgates of government data to the masses. The Digital Accountability and Transparency Act—enacted in 2014—directs agencies to report standardized versions of their financial, budget, grant and contract information to the Treasury Department and the Office of Management and Budget. Those two agencies must then publish a single open data set detailing the executive branch’s spending.
The open data experts interviewed in the report expect the DATA Act to have large ramifications for federal management, but it won’t be a cure-all for standardization issues at large. For one, implementing the law has been difficult: Lawmakers and the Government Accountability Office have questioned whether agencies will meet the May 2017 deadline.
“Currently, open data practices are appallingly crude,” Jaquith said. “A list of 100 core types of government data would find that for more than 90, there exists no standard schema. But even generating that list would be a feat, because there’s nothing approaching agreement on what the 100 core types of government data are.”
In any case, the report makes clear the importance open data could play in the near future. A government built on pen and paper won’t cut it when citizens expect access to government services and information the same way they order products or interact with their banks.
“Investing in civic technology and committing to government transparency are key to a modern democracy,” said Rep. Seth Moulton, D-Mass. “In 2016, citizens expect and deserve access to information about the laws that impact them on a daily basis.”
And as Beth Blauer, founder of the Johns Hopkins University Center for Government Excellence, noted, government agencies themselves have much to gain internally from better data sharing.
“Government agencies are themselves the prime users of government data,” Blauer stated. “Data should be recognized as a strategic asset, and therefore a data strategy connected to real goals and outcomes is critical.”