Are Agencies Making the Grade in Data Center Consolidation?


Four years into the governmentwide effort to consolidate data centers, most agencies say the number of data centers they operate has stayed flat or grown, a survey finds.

A survey released today suggests a majority of federal agencies aren’t making the cut when it comes to the Federal Data Center Consolidation Initiative, largely because of the government’s penchant for duplicating data.

The MeriTalk survey polled 150 federal IT decision-makers and found just over half would grade their agency’s adherence to the FDCCI a “C” or below, with only 6 percent doling out an “A.”

Respondents reported 40 percent of federal data assets are stored “four or more times,” and as datasets grow, those redundancies demand ever more storage and raise data governance issues as well. The survey indicates one in four agencies use between 50 and 88 percent of their storage capacity to hold duplicative or nonprimary data, and that’s driving up storage costs.

The survey estimates 27 percent of the average agency’s storage budget went to nonprimary data last year, the federal equivalent of $2.7 billion. That share is projected to climb to 31 percent in fiscal 2015, costing $3.1 billion, and to total $16.5 billion over the next 10 years.
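The survey doesn’t show its arithmetic, but the percentages and dollar figures imply a total federal storage budget of roughly $10 billion a year. Here is a minimal back-of-the-envelope check in Python; the $10 billion total is an inference from the survey’s numbers, not a figure the survey states:

```python
# A quick consistency check of the survey's budget figures.
# Assumption: the dollar amounts are shares of one total annual
# federal storage budget, which the survey does not state outright.

nonprimary_fy2014 = 2.7e9   # $2.7B, reported as 27% of storage spend
nonprimary_fy2015 = 3.1e9   # $3.1B, reported as 31% of storage spend

implied_total_fy2014 = nonprimary_fy2014 / 0.27
implied_total_fy2015 = nonprimary_fy2015 / 0.31

print(f"Implied storage budget, FY2014: ${implied_total_fy2014 / 1e9:.1f}B")
print(f"Implied storage budget, FY2015: ${implied_total_fy2015 / 1e9:.1f}B")
# Both work out to roughly $10B a year, so the reported percentages
# and dollar figures are at least internally consistent.
```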

The survey suggests the biggest inroads against data copies lie in better management plans. For example, only one in three respondents said their agencies vary the number of copies of a dataset based on its significance or the likelihood it will be used again, a practice that is more commonplace in the profit-driven private sector.
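To make that idea concrete, varying copy counts by value might look something like the toy tiering policy below. The tier names and copy counts are purely illustrative, not drawn from the survey:

```python
# Illustrative only: a toy retention policy that varies the number of
# stored copies by a dataset's significance and likelihood of reuse.
# The tiers and counts are hypothetical, not from the MeriTalk survey.

COPY_POLICY = {
    "mission-critical": 3,   # e.g. systems of record
    "active":           2,   # datasets likely to be reused soon
    "archival":         1,   # rarely accessed; one protected copy
}

def copies_required(tier: str) -> int:
    """Return how many copies a dataset in this tier should keep."""
    return COPY_POLICY.get(tier, 1)  # default to a single copy

print(copies_required("mission-critical"))  # 3
print(copies_required("archival"))          # 1
```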

“We’ve seen the dramatic impact of a more holistic approach to copy data management in the private sector for years now,” said Ash Ashutosh, founder and CEO of Actifio.

Ashutosh pointed to copy data virtualization as a potential solution to some of the government’s data duplication, suggesting it could help the government make good on the FDCCI.

“Frankly, I’m not surprised by the magnitude of the potential savings at the federal level, or that this has now come to light as a significant barrier to FDCCI,” he said. “Copy data virtualization is today where server virtualization was 10 years ago. We’re thrilled it’s now been identified as a strategy that can dramatically accelerate the process of data center consolidation, and get FDCCI back on track.”
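In broad strokes, copy data virtualization keeps one physical “golden” copy of a dataset and hands out lightweight virtual views for backup, test and analytics use instead of full duplicates. The sketch below illustrates the concept only; the class and method names are invented, not Actifio’s or any vendor’s API:

```python
# A toy illustration of copy data virtualization: one physical master
# copy, with consumers receiving lightweight virtual views instead of
# full duplicates. Names are invented for illustration.

class GoldenCopy:
    def __init__(self, data: bytes):
        self.data = data           # the single physical copy
        self.virtual_views = 0     # bookkeeping only

    def virtual_view(self) -> memoryview:
        """Hand out a zero-copy view rather than duplicating the data."""
        self.virtual_views += 1
        return memoryview(self.data)

dataset = GoldenCopy(b"agency records " * 1_000_000)
backup = dataset.virtual_view()    # no second physical copy created
test_env = dataset.virtual_view()  # ditto
print(dataset.virtual_views, len(backup) == len(test_env))  # 2 True
```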

Despite the FDCCI, which launched in 2010, 72 percent of respondents reported the number of data centers in their agency either stayed the same or increased over the past four years. The Government Accountability Office has been clear that the Office of Management and Budget needs to strengthen its FDCCI oversight, but four years into the initiative, the government has a hard time even tabulating how many data centers it possesses.

(Image via Oleksiy Mark/Shutterstock.com)