Big data trends taking shape

Government should follow the private sector's lead in making use of big data but focus on improving performance and efficiency "with less effort and pain," GSA's Dave McClure told a Washington audience.

"Big data" is a relatively new term to describe an old problem: Turning increasingly high mountains of data into useful information.

Speakers at a conference hosted by the AFCEA Bethesda Chapter on March 28 sought to assess the current state of the technology and policy being used to corral big data, reports Rutrell Yasin at GCN.com.

“The challenge is, what do we do with all the data that we are using? How do we sort it, analyze it and get value within the business owner’s context?” said Dave McClure, associate administrator with the General Services Administration's Office of Citizen Services and Innovative Technologies.


In another keynote at the same event, Dan Vesset, program vice president of business analytics with IDC, defined big data as "a new generation of technology and architecture designed to economically extract value from very large volumes of a wide variety of data by enabling high-velocity capture and/or analysis."

Big data comprises four layers, Vesset said: infrastructure, the servers on which applications run; data organization and management, the software that processes and prepares all types of data for analysis; analytics and discovery tools; and decision support software.
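To make that layering concrete, here is a minimal, purely illustrative Python sketch that maps each of Vesset's four layers onto one stage of a toy pipeline. Every name in it (RAW_EVENTS, organize, analyze, decide) is hypothetical, invented for this example, and not drawn from any real product or API.

```python
# Illustrative sketch of the four big-data layers as a toy pipeline.
# All names here are hypothetical, for illustration only.

from collections import Counter

# Layer 1: infrastructure -- stands in for the servers and storage where
# raw data lives. Here it is just an in-memory list of raw log lines.
RAW_EVENTS = [
    "user=alice action=search",
    "user=bob action=search",
    "user=alice action=purchase",
]

# Layer 2: data organization and management -- software that processes
# and prepares raw data for analysis (parsing each line into a record).
def organize(raw_events):
    return [dict(pair.split("=", 1) for pair in line.split())
            for line in raw_events]

# Layer 3: analytics and discovery -- derive a simple metric from the
# prepared records (a count of each action type).
def analyze(records):
    return Counter(r["action"] for r in records)

# Layer 4: decision support -- turn the metric into a recommendation
# a business owner can act on.
def decide(action_counts):
    top_action, count = action_counts.most_common(1)[0]
    return f"Most common action is '{top_action}' ({count}x); prioritize it."

if __name__ == "__main__":
    print(decide(analyze(organize(RAW_EVENTS))))
```

In a real deployment each layer would be a separate system (clusters, data-preparation software, analytics tools, dashboards), but the separation of concerns is the same.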

Read Yasin's full report on the event at GCN.com.