
Government data goes public

Caitlin Fairchild

When the real estate sites Trulia and Zillow both made initial public offerings this summer, it was more than just a victory for the two companies, federal Chief Information Officer Steven VanRoekel said Tuesday.

It was also a victory for the government data that underlies those two sites, he said during a keynote presentation at the National Institute of Standards and Technology’s cloud computing and big data forum and workshop.

Much of the excitement about so-called big data in government has focused on ways federal agencies can use powerful new data analysis tools to gather information about national security threats or to spot fraud in Medicare and contract spending.

VanRoekel’s office has also been working, though, on the role government-gathered data -- aggregated into machine-readable streams on the website and elsewhere -- can play in the emerging information economy.

As Trulia, Zillow and other real estate sites grow, for example, they could offer home buyers troves of government-originated information that usually either is unavailable to consumers or would take a lot of digging to find, he said.

“We have government data that will tell you: Is there an organic farm within 25 miles of here? What’s the crime situation from a state and local perspective? What’s the environmental quality of this area? What’s the quality of the medical care? What’s the quality of the schools? What’s the quality of the infrastructure near you? What are the broadband speeds available to you?” he said.

More established companies are also increasingly using government data streams. Google, for instance, now displays data from the Food and Drug Administration and the National Library of Medicine in a pullout box when people search drug names, federal Chief Technology Officer Todd Park pointed out earlier this month.

Government real estate, health, energy and environmental data could prove as valuable to industry and entrepreneurs in the future as the government’s weather satellite and Global Positioning System data did in the past, VanRoekel said. Both of those spawned multibillion-dollar industries.

Tuesday’s event focused largely on gaps in the government’s ability to take advantage of big data. Agencies need more commercially built but government-vetted tools to sort through and analyze unstructured data, panelists said. They also need more young workers with advanced degrees in data analysis, they said.

The government invested $200 million in big data research in March to make up for what it identified as gaps in commercial support. 
