How Should Agencies Address the Big Data Opportunity?


Even with aging infrastructure and tight budgets, agencies can take advantage of the growing amount of information available.

Francessca Vasquez is group vice president of Oracle Cloud & Infrastructure Solutions.

The promise and potential of big data have been discussed and analyzed ad nauseam. More information promises improvements in nearly everything we do, from medical research to agriculture. Capturing those gains, however, requires smart analytics and advanced processing to surface valuable insights from the sea of available information.

Big data is of particular importance for government, as the insights it yields can significantly improve citizens' everyday lives, from increasing the efficiency of public transportation to better managing Social Security payments.


At the same time, the federal government operates under very strict data retention rules, maintaining huge repositories of records for decades, many of them, surprisingly, still in paper form. Combined with the growing volume of information gathered through the internet and connected sensors, this means agencies need a way to capture, analyze and store a wide variety of data efficiently and securely.

Administration’s Focus on Big Data

Realizing the potential of big data, the Obama administration launched the Big Data Research and Development Initiative in 2012 to develop big data technologies, demonstrate their applications, and train the next generation of data scientists. The initiative was updated in May 2016 with the release of a strategic plan developed with 15 federal agencies.

A key strategy of the new plan is to “build and enhance research cyberinfrastructure that enables big data innovation in support of agency missions.” With the vast amounts of data available, the federal government realizes it needs to focus on developing an “exascale computing system that integrates hardware and software capability,” combined with continued investment in public and private cloud infrastructure.

Such calls for the radical modernization of government IT are understandable but need to be squared with the realities on the ground. In 2015, the federal government spent more than 75 percent of its total IT budget, nearly $61 billion, on maintaining legacy technology. According to the Government Accountability Office, many agencies still run obsolete hardware and software first deployed more than 30 years ago. These systems were neither meant nor designed for the age of big data.

How Government Should Modernize to Benefit from Big Data

Given aging infrastructure, tight budgets and the need for security, the following steps will allow agencies to make the most of the growing amount of information available:

1. Pursue a Converged Infrastructure Approach

Embracing converged infrastructure should be government agencies' first priority in the modernization effort. Such an approach tightly integrates hardware, software and cloud capabilities in a single package, bringing down the costs of integration and management.

This lets agencies leverage existing technology investments, ensuring historic data is preserved and can be analyzed alongside newer data types.
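As a rough illustration, the Python sketch below shows how records digitized from a legacy system might be mapped onto a common schema with newer sensor data so both can be analyzed together. The file, column and field names are all hypothetical:

```python
import json

import pandas as pd

# Hypothetical inputs: a digitized legacy ledger (CSV) and newer
# sensor readings (JSON lines). File and column names are illustrative.
legacy = pd.read_csv("legacy_ledger.csv", parse_dates=["recorded_on"])
with open("sensor_feed.jsonl") as fh:
    sensors = pd.DataFrame(json.loads(line) for line in fh)
sensors["recorded_on"] = pd.to_datetime(sensors["timestamp"], unit="s")

# Align both sources on a common schema so historic and new records
# can be analyzed side by side.
combined = pd.concat(
    [
        legacy[["facility_id", "recorded_on", "value"]],
        sensors[["facility_id", "recorded_on", "value"]],
    ],
    ignore_index=True,
)
print(combined.groupby("facility_id")["value"].describe())
```

The point is not the specific tooling but the design choice: keeping historic records in a form that modern analytics can query alongside new feeds.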

According to Gartner, “the market for hyper-converged integrated systems will grow 79 percent to reach almost $2 billion in 2016, propelling it toward mainstream use in the next five years.” Gartner defines hyper-converged integrated systems as a platform offering shared compute and storage resources, based on software-defined storage, software-defined compute, commodity hardware and a unified management interface.

2. Invest in Data Analytics Tools

The 3 Vs of big data—volume (amount of data), variety (types of data) and velocity (speed of data coming in)—require a new set of analytics tools. While there are many solutions on the market, agencies should look for those that can integrate with existing infrastructure to leverage historic information.
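To make the velocity point concrete, here is a minimal Python sketch of the windowed, incremental style of processing such tools rely on: events are aggregated into fixed time buckets as they arrive rather than loaded into memory all at once. The event fields and types are invented for illustration:

```python
from collections import defaultdict
from datetime import datetime, timezone

def rolling_counts(events, window_seconds=60):
    """Count events per (time window, event type), one event at a time."""
    counts = defaultdict(int)
    for event in events:                      # consume the stream incrementally
        ts = event.get("ts", 0)               # epoch seconds
        kind = event.get("type", "unknown")   # tolerate varied schemas
        window = int(ts // window_seconds) * window_seconds
        counts[(window, kind)] += 1
    return counts

# Invented sample events standing in for a live feed.
stream = [
    {"ts": 1700000000, "type": "bus_gps"},
    {"ts": 1700000030, "type": "fare_swipe"},
    {"ts": 1700000090, "type": "bus_gps"},
]
for (window, kind), n in sorted(rolling_counts(stream).items()):
    label = datetime.fromtimestamp(window, tz=timezone.utc).isoformat()
    print(label, kind, n)
```

Production tools add distribution, fault tolerance and handling of late-arriving data, but the incremental pattern is the same.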

Equally important, security should be front and center given the spate of attacks on government systems and the critical importance of privacy for government data.
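As one small example of building security in rather than bolting it on, the following Python sketch uses the third-party cryptography library to encrypt a record before it is persisted. The key handling here is deliberately simplified; in practice, keys would live in a hardware security module or key management service:

```python
from cryptography.fernet import Fernet  # third-party: pip install cryptography

# Illustrative only: encrypt a record before writing it to shared
# storage, so data at rest stays protected even if the store is exposed.
key = Fernet.generate_key()        # in practice, keep keys in an HSM/KMS
cipher = Fernet(key)

record = b'{"ssn_last4": "1234", "benefit": "retirement"}'
token = cipher.encrypt(record)     # ciphertext is safe to persist

assert cipher.decrypt(token) == record
```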

3. Train and Recruit the Right Talent

The shortage of data analytics talent has been widely reported; starting salaries for data scientists now top $200,000 a year. Government agencies face a tough battle with the private sector for such talent and should emphasize the potential for social impact as a recruiting draw.

Further, agencies should leverage existing employees, who bring years of experience with government processes, by training them on new tools and approaches to data analytics rather than relying solely on external recruiting.

Embracing the New Reality

It is evident the age of data is upon us. The Obama administration opened the door for federal agencies to embrace big data and improve citizen services and efficiency of operations.

Agencies have an opportunity to better serve citizens and improve efficiency with big data, but they face the challenge of managing rapidly growing volumes of data with infrastructure and applications never designed for it. That challenge is compounded by tight budgets, public scrutiny and security threats that show no signs of subsiding.

To get the most out of big data, with a clear path to the future, agencies should consider a converged infrastructure approach that leverages current investments through an integrated software, hardware and cloud stack. They must also prioritize the right analytics tools and talent to tackle big data.