To meet new mandates to consolidate and make their data centers more efficient, federal agencies will need to start planning now to re-architect their data centers. They will also likely need to adopt virtualization of servers, networks and storage, more so than they have in the past.
On Aug. 1, the Office of Management and Budget (OMB) officially released the Data Center Optimization Initiative (DCOI), which supersedes the Federal Data Center Consolidation Initiative (FDCCI) that began in 2010. The DCOI lays out several metrics agencies need to meet for tiered data centers (i.e., large data center facilities), all of which are designed to boost the energy efficiency of data centers and encourage virtualization.
The question now is how agencies will actually meet these goals, which carry a deadline of Sept. 30, 2018. And which technologies will they use to meet them?
Christian Heiter, the chief technology officer of engineering at Hitachi Data Systems Federal, says that agency IT leaders and their staff need to immediately benchmark their data centers’ existing performance, and then start planning for how they will achieve the DCOI’s goals.
Meeting Data Center Optimization Goals
To comply with DCOI, agencies will have to meet five metrics for tiered data centers:
- Install energy-metering tools in all tiered data centers to measure power consumption.
- Maintain a Power Usage Effectiveness (PUE) score, the ratio of total facility energy to the energy used by IT equipment alone, of less than 1.5, and preferably less than 1.2.
- House at least four virtual servers per physical server.
- Use at least 80 percent of a tiered data center’s floor space.
- Achieve a server utilization rate of at least 65 percent.
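The five metrics above can be checked mechanically once an agency has benchmarking data in hand. The sketch below is purely illustrative, not an official OMB tool; the field names and sample values are hypothetical, and the thresholds are the ones listed in the article.

```python
# Illustrative sketch: evaluating one tiered data center's snapshot
# against the DCOI metrics. All field names and sample figures are
# hypothetical; only the thresholds come from the DCOI targets above.

def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT equipment power."""
    return total_facility_kw / it_equipment_kw

def dcoi_compliance(dc: dict) -> dict:
    """Return a pass/fail result for each DCOI tiered data center metric."""
    return {
        "energy_metering_installed": dc["has_energy_metering"],
        "pue_under_1_5": pue(dc["total_facility_kw"], dc["it_equipment_kw"]) < 1.5,
        "virtualization_ratio_4x": dc["virtual_servers"] / dc["physical_servers"] >= 4,
        "floor_space_80pct": dc["used_sq_ft"] / dc["total_sq_ft"] >= 0.80,
        "server_utilization_65pct": dc["avg_server_utilization"] >= 0.65,
    }

# Hypothetical snapshot for one facility.
example = {
    "has_energy_metering": True,
    "total_facility_kw": 1200.0,   # power drawn by the whole facility
    "it_equipment_kw": 900.0,      # power drawn by IT equipment alone
    "virtual_servers": 480,
    "physical_servers": 100,
    "used_sq_ft": 8500,
    "total_sq_ft": 10000,
    "avg_server_utilization": 0.55,
}

print(dcoi_compliance(example))
```

In this made-up example the facility's PUE of 1.33 and 4.8-to-1 virtualization ratio pass, but its 55 percent server utilization falls short of the 65 percent target, which is exactly the kind of gap the benchmarking exercise is meant to surface.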
Heiter says that he thinks agencies can meet the goals, especially the latter ones. Agency IT leaders should conduct benchmarking exercises to find out how much energy their data centers are using, and where the data is stored, he says.
That will help them know what kind of servers, storage and network components they need, he says. Then, IT leaders can start planning for how to architect their data centers to support specific departments, programs and missions within their agencies. “You understand where you want to go,” Heiter says.
With that framework in hand, agencies can start thinking about budgeting cycles and how to use IT budgets to, say, improve power distribution units in certain data centers. "You can start working with vendors to say we need to have a certain level of efficiency in here," Heiter says, in the context of larger data center architecture goals.
“If that forethought is being put into the plan, then I think we are going to have tremendous success in meeting the goals for the DCOI, in my opinion,” Heiter adds.
The FDCCI’s push for agencies to embrace server virtualization was a “really great start,” Heiter says, but before DCOI came about one set of IT staff members would design an agency’s data center and then another would manage it.
“DCOI says, you are all working together toward a common goal,” which will push agencies to conduct more planning up front, Heiter says. “You all have a responsibility toward a proper design.”
Sensors, Analytics May Increase Efficiency
The first DCOI goal is mostly about putting instrumentation into data centers, which is important because some agencies may not know how much energy their data centers are using, Heiter says.
“I love the fact that the new DCOI initiative is embracing that, even for older data centers, because it would be forcing us to do the benchmarking,” he says. “You need the data to be able to make those decisions.”
As agencies start benchmarking their data centers they will discover where “hotspots” are — and which servers are generating the most heat. By using connected Internet of Things sensors and taking measurements of power supplies and other hardware, agencies will learn which servers and other pieces of equipment need more cooling than others.
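The hotspot analysis described above amounts to comparing each rack's sensor readings against the rest of the room. The snippet below is a hedged sketch of that idea; the rack names, temperatures and the 3-degree margin are all hypothetical, not drawn from any agency's data.

```python
# Hypothetical hotspot detection from rack-mounted temperature sensors.
# Rack IDs, readings and the margin are illustrative only.

from statistics import mean

readings = {
    # rack id -> recent inlet temperatures (degrees Celsius)
    "rack-A1": [22.0, 22.5, 23.1],
    "rack-A2": [27.8, 28.4, 29.0],   # e.g. a densely virtualized host
    "rack-B1": [21.5, 21.9, 22.2],
}

def hotspots(readings: dict, margin: float = 3.0) -> list:
    """Racks whose average temperature exceeds the room average by `margin`."""
    rack_avgs = {rack: mean(temps) for rack, temps in readings.items()}
    room_avg = mean(rack_avgs.values())
    return sorted(r for r, t in rack_avgs.items() if t > room_avg + margin)

print(hotspots(readings))  # the hot rack stands out from the room average
```

A real deployment would pull these readings continuously from IoT sensors and feed the flagged racks into cooling decisions, but the comparison logic is no more complicated than this.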
Heiter says agencies can work with solutions from Microsoft, VMware and KVM to better understand where requests for virtual machines or servers are headed, so that those areas of the data center can be cooled proactively and over time. Yet he says that will require work from many software, virtualization and storage vendors to become a reality. "If we all start campaigning for this, maybe it might actually happen," he says.
All of those sensors will generate large amounts of data that can be analyzed and used to inform decisions on things like power and cooling, Heiter says. However, since most of that data will be unstructured, agencies need to put it in the right places and take advantage of analytics tools to make sense of it. “We don’t know in the beginning what is going to be important or not. A lot of learning will go into creating an adaptive set of analytics that can help us make better and better decisions as time goes on,” he says.
Virtualization of More Than Just Servers
Virtualization will play “a very strong role” in helping agencies achieve the DCOI’s goals, Heiter says, since it directs agencies to “the use of virtualization to enable pooling of storage, network and computer resources, and dynamic allocation on demand.”
That will push agencies to embrace software-defined data centers and networking so that they can have more control of networks, storage and servers. Additionally, agencies will be able to have multiple missions and programs working on the same data center systems without having to duplicate that infrastructure in each data center. “That’s where I think DCOI is going to help, because virtualization will take care of much of the heavy lifting for the various programs and agencies,” Heiter says.
As agencies work to optimize their data centers, Heiter says, they may determine that having four virtual servers for every physical server is not practical in every location, and that it might be "astronomically expensive" to do so.
“We won’t know until we actually do this,” he says, adding, “It has to come down to a cost-benefit analysis.”
This content is made possible by FedTech. The editorial staff of Nextgov was not involved in its preparation.