Big data could remake science -- and government

Big data has the power to change scientific research from a hypothesis-driven field to one that’s data-driven, Farnam Jahanian, chief of the National Science Foundation’s Computer and Information Science and Engineering Directorate, said Wednesday.

Reaching that point, however, will require upfront investment from government and the private sector to build infrastructure for data analysis and new collaboration tools, Jahanian said. He was speaking at a big data briefing for congressional staff hosted by the industry group TechAmerica.

The term big data refers generally to the mass of new information created by the Internet and by scientific instruments such as the Hubble Space Telescope and the Large Hadron Collider. The emerging field of big data analysis aims to sort through that massive volume of data -- whether it’s social media posts, video clips, satellite feeds or the reactions of accelerated particles -- to gather intelligence and spot new patterns.

Federal officials announced in March that the government will invest $200 million in research grants and infrastructure building for big data.

The investment was prompted by a June 2011 report from the President's Council of Advisors on Science and Technology, which found a gap in the private sector's investment in basic research and development for big data.

The research firm Gartner predicted in December 2011 that through 2015, 85 percent of Fortune 500 firms will be unable to exploit big data for competitive advantage.

Big data analytics also has the potential to improve government efficiency, panelists at the TechAmerica event said.

The Centers for Medicare and Medicaid Services, for example, could pull data from insurance reports, hospital forms and anonymized data from electronic medical records to get a much better understanding of which medications and procedures are most effective, said Caron Kogan, a strategic planning director at Lockheed Martin Corp.

In addition, the Defense Department could gather better data on the expected life cycles of its equipment so that it could replace parts before they fail, drastically cutting its supply chain costs, she said.

“[Some of] these are old concepts,” Kogan said, “but now you have more data so predictability is increased. You’re not working with a sample size, you’re working with all this data to assess which parts might fail.”
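The shift Kogan describes -- computing failure rates over every maintenance record rather than estimating them from a sample -- can be sketched in a few lines. This is an illustrative example only; the part names, record format and replacement threshold are assumptions, not anything reported in the article:

```python
from collections import defaultdict

# Hypothetical maintenance records: (part_id, hours_in_service, failed)
records = [
    ("pump-a", 1200, False),
    ("pump-a", 900, True),
    ("pump-a", 1100, True),
    ("valve-b", 3000, False),
    ("valve-b", 2800, False),
    ("valve-b", 2500, True),
]

def failure_rates(records):
    """Compute each part's observed failure rate over the full
    record set, instead of extrapolating from a sample."""
    totals = defaultdict(int)
    failures = defaultdict(int)
    for part, _hours, failed in records:
        totals[part] += 1
        if failed:
            failures[part] += 1
    return {part: failures[part] / totals[part] for part in totals}

def parts_to_replace(records, threshold=0.5):
    """Flag parts whose observed failure rate meets the threshold,
    marking them as candidates for preemptive replacement."""
    return sorted(p for p, r in failure_rates(records).items() if r >= threshold)

print(parts_to_replace(records))  # pump-a failed in 2 of 3 records
```

With complete data, the failure rate is a direct tally rather than a statistical estimate, which is the sense in which "predictability is increased."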

Big data also has the potential to reveal new patterns or opportunities for efficiency that officials may never have imagined, said Bill Perlowitz, chief technology officer of Wyle's Science, Technology and Engineering Group.

“In hypothetical science, you propose a hypothesis, you go out and gather data and you see if your hypothesis is supported,” Perlowitz said. “That limits your exploration to what you can imagine. It also limits the number of relationships you can explore because the human mind can only go so far.

“The shift with data-driven science and big data,” he said, “is that first we collect the data and then we see what it tells us. We don’t have a pretense that we understand what those relationships are, or what information we may find.”

(Image via kentoh)
