Data center consolidation is about more than real estate

Consolidating federal data centers should be about more than just renting out floor space, Bureau of Alcohol, Tobacco, Firearms and Explosives Chief Technology Officer Noah Nason told a group of federal information technology executives and vendors Tuesday.

Agencies that have excess data center capacity, like ATF, could provide hardware, operating systems and even software as a service to agencies seeking such capacity, Nason said during a panel discussion sponsored by MeriTalk, a government IT industry group, and Juniper Networks, an IT vendor.

Software as a service is a delivery model that allows agencies to pay for software based on how much they use.

Agencies able to provide that higher level of service will be more likely to see significant cost savings, which they can plow back into other operations, Nason said. Before that can happen, though, data center managers have to figure out what services they're equipped to provide and how best to market them.

ATF has two data centers, Nason said, one of them federally owned and the other managed by a private vendor.

By implementing more efficient practices and relying on server virtualization, ATF managers have opened up about 40 percent of the floor space in the agency's federally owned center, he said, but they've had a tough time figuring out how best to market that space to other federal customers.

"We don't have the training to do that; we don't have the processes to do that and we don't have the experience," he said.

Federal IT officials plan to shut down or consolidate about 800 of the government's 2,100 data centers by 2015 -- a process expected to save the government $3 billion over five years. Officials anticipate saving an additional $5 billion annually by moving federal data to more nimble cloud computing services.

Some cost savings from data center consolidation are difficult to fully realize without a significant amount of work on the front end, Nason said.

For instance, compact servers require more air conditioning to stay cool, which means consolidating data centers doesn't automatically lead to a proportionate decrease in energy costs, he said.

Agencies can increase savings from consolidation by adopting green IT practices, he said, such as pumping air conditioning directly into server racks, so less cool air dissipates in empty parts of the data center.

The greatest roadblock to data center consolidation is a fear by some agencies that other agencies won't keep their data secure, Nason said.

That fear should be allayed in part by FedRAMP, a plan to create standard governmentwide requirements for cloud-based services so vendors have to be certified only once, said Ron Ross, with the National Institute of Standards and Technology.

A draft version of FedRAMP ran into criticism from vendors who complained it didn't pay enough heed to the diversity of security and computing requirements across agencies. A revised version of the program should be out within the next few months, Ross said during Tuesday's panel discussion.

Nason recommended providing incentives for IT officials to find innovative ways to consolidate federal data, such as offering 30 extra vacation days that officials could carry from job to job, along with additional days to distribute among subordinates who worked on the project. Nason has pitched that idea to the Office of Personnel Management, he said, but so far it hasn't taken off.

Tuesday's event, titled "Fabric of the Future: Supporting the Federal Data Center Revolution," followed a survey conducted by Juniper and MeriTalk in July. That survey found only 10 percent of federal IT managers are confident the government will meet its data center consolidation goals.
