Complex, unstructured information can lead to unexpected efficiencies, experts say.
Big data has the power to change scientific research from a hypothesis-driven field to one that’s data-driven, Farnam Jahanian, chief of the National Science Foundation’s Computer and Information Science and Engineering Directorate, said Wednesday.
Reaching that point, however, will require upfront investment from government and the private sector to build infrastructure for data analysis and new collaboration tools, Jahanian said. He was speaking at a big data briefing for congressional staff hosted by the industry group TechAmerica.
The term "big data" refers generally to the mass of new information created by the Internet and by scientific tools such as the Hubble Space Telescope and the Large Hadron Collider. The emerging field of big data analysis is aimed at sorting through the massive volume of that data -- whether it's social media posts, video clips, satellite feeds or the collisions of accelerated particles -- to gather intelligence and spot new patterns.
Federal officials announced in March that the government will invest $200 million in research grants and infrastructure building for big data.
The investment was spawned by a June 2011 report from the President's Council of Advisors on Science and Technology, which found a gap in the private sector's investment in basic research and development for big data.
The research firm Gartner predicted in December 2011 that, by 2015, 85 percent of Fortune 500 firms will still be unprepared to leverage big data for a competitive advantage.
Big data analytics also has the potential to improve government efficiency, panelists at the TechAmerica event said.
The Centers for Medicare and Medicaid Services, for example, could pull data from insurance reports, hospital forms and anonymized data from electronic medical records to get a much better understanding of which medications and procedures are most effective, said Caron Kogan, a strategic planning director at Lockheed Martin Corp.
In addition, the Defense Department could gather better data on the expected life cycles of its equipment, allowing it to replace parts before they fail and sharply cutting its supply chain costs.
“[Some of] these are old concepts,” Kogan said, “but now you have more data so predictability is increased. You’re not working with a sample size, you’re working with all this data to assess which parts might fail.”
Big data also has the potential to point out new patterns or opportunities for efficiency that officials may never have imagined, said Bill Perlowitz, chief technology officer of Wyle's science, technology and engineering group.
“In [hypothesis-driven] science, you propose a hypothesis, you go out and gather data and you see if your hypothesis is supported,” Perlowitz said. “That limits your exploration to what you can imagine. It also limits the number of relationships you can explore because the human mind can only go so far.
“The shift with data-driven science and big data,” he said, “is that first we collect the data and then we see what it tells us. We don’t have a pretense that we understand what those relationships are, or what information we may find.”