Invest in big data, industry tells government


Agencies should appoint a single official to explore how new methods of data mining could enhance their mission and save costs, perhaps following the Federal Communications Commission’s model of a chief data officer, an industry group recommended Wednesday.

Officials also should consider appointing a federal chief data officer to assess the use of big data governmentwide, according to the report from the TechAmerica Foundation.

In addition, the data officer should examine barriers to big data analysis in government, such as overly complex regulations related to data privacy, the report’s authors said. This process should be aimed at developing uniform and understandable privacy regulations across government agencies, not at lowering overall privacy protections, members of TechAmerica’s Federal Big Data Commission said when they introduced the report.

The commissioners also urged the Office of Management and Budget to create a formal career track focused on data analysis for information technology managers.

Big data analysis refers broadly to new computer systems that can collect, analyze and spot patterns in much larger collections of data than machines could process a decade ago. These systems usually link multiple machines inside a computer cloud and coordinate them to work as a single device tackling a single problem.

Government agencies have been using big data analytics, mostly aimed at spotting common fraud patterns, for several years to avoid issuing Medicare and tax payments to fraudsters. Technology leaders have speculated the government ultimately will rely heavily on big data to better target grant funds and deliver services.

Among the greatest barriers to wider adoption of big data in government are low data quality, such as spreadsheets riddled with typos and inconsistent naming conventions; poor information sharing between federal agencies; and too few employees capable of effectively analyzing vast amounts of data from disparate sources, commissioners said.

TechAmerica’s report also urged the government to partner with universities to encourage more students to develop data analysis skills and to organize federal internships for new data scientists to lure them into public service.

Once big data analysis systems are up and running, commissioners said, agencies are likely to recoup their upfront investments by limiting fraud and making programs more efficient.

“Waste, fraud and abuse is a big challenge,” said Steve Mills, senior vice president at IBM and co-chairman of the commission. “With the size of the federal government, you’re talking about hundreds and hundreds of millions of dollars annually across a whole range of activities. That can be retrieved and put to better use for those who are more deserving.”
