Technology, and the infrastructure that underpins it, is in a period of rapid flux, and federal data centers, many saddled with archaic architecture and legacy systems, are scrambling to keep up. As these trends emerge, agencies and organizations need to adapt their infrastructure to capture the business benefits, often by consolidating existing facilities.
To help federal government agencies focus their resources on bringing data centers up to modern standards of performance, security, and infrastructure, the Office of Management and Budget issued the Data Center Optimization Initiative. According to recent reporting, only about half of agencies were on track to meet the original deadline at the end of fiscal 2018. Agencies not already positioned to meet the extended deadline of 2020 need to start optimizing their data centers now to satisfy this mandate.
Given the complexity of this process and the timeline to deliver improved performance, how can federal agencies that own and manage their own infrastructure maintain a level of data center performance that is aligned with business goals during consolidation? The answer lies in simulation.
It is commonly understood that data centers live a transitory life of replacements and upgrades; before long, they bear little resemblance to their original form. It is the job of data center managers to optimize the performance of these assets throughout every stage of their lifecycle, and a few vital steps and processes can both extend a site's longevity and sharpen the data center's ability to perform mission-critical tasks.
One of the most common techniques used to achieve this is consolidation, the process of amalgamating multiple data center locations under one roof. This enables organizations to revamp their hardware platforms with a view to condensing and optimizing those running vital software. When OMB issued the DCOI, there were over 11,000 federal facilities labeled as data centers. The initiative requires the closure of at least 25 percent of the tiered data centers and 60 percent of the non-tiered data centers. As with organizations everywhere, federal agencies are increasingly investing in big data, expanding the need for collection and storage capabilities, as well as the capacity to process and analyze that data to derive actionable insights.
With this push and pull between legacy systems and the need to consolidate and modernize data centers, it is even more vital that federal data center operators maintain operational efficiency during these periods of transition. Equally, with so much at stake, the importance of these projects running both on time and within budget cannot be overstated. The good news is that, by factoring in the right considerations and using the latest innovations in the field of simulation, schedule and budget overruns can be avoided.
The Art of Consolidation
In addition to compliance with DCOI, the benefits of data center consolidation are ample. Not only can agencies see significant savings on the operational and logistical side, but there are also substantial knock-on benefits for security compliance and energy efficiency. Moreover, with recent innovations and improvements in connectivity, many agencies are looking to migrate their IT estate into shared services with other agencies, or to cloud and colocation providers. Consolidation represents an attractive way to reduce local data storage to compliance-critical data.
Naturally, this is a highly nuanced and complicated process that raises a multitude of issues and questions. While the list of important factors can seem endless, the most common involve understanding which sites to close, which hardware can be retired, and which must remain operational. It can often feel like using pieces from three different jigsaw puzzles to create one new layout with a common infrastructure.
When it comes to consolidation, there is frequently a divide between expectation and reality, with agency leaders often quick to request tight turnaround times and keep overall disruption to IT performance to a minimum. However, the time pressures imposed by these projects can lead to the capacity planning process being cut short, which in turn limits the effectiveness and overall performance of the finished project.
There is little room for live trial and error when it comes to the management of data centers, due to the risk of failure to mission-critical processes. This is particularly true when undertaking a consolidation project. With multiple moving parts and huge expenditure on the line, operators are under an immense amount of pressure to improve overall efficiency and meet their key performance indicators. This means that agencies must get it right the first time.
It is through this difficult task that the benefits of engineering simulation come into play. Industries across the engineering space regularly utilize simulation to improve performance and meet business goals, and this should be no different when planning consolidation projects.
For example, engineers faced a similar problem when developing smartphones: consolidating the phone, camera, credit card and web browser into a single device. They could not have amalgamated these functions into one device without extensive simulation and testing before even building the initial prototype. This bold, simulation-driven approach to testing has propelled the entire mobile device industry forward. Similar benefits can be seen when implementing change throughout the lifecycle of a data center.
Through the use of the virtual facility, a digital clone of the actual data center, data center managers can safely experiment and predict the impact of potential changes to the data center environment, without affecting the performance of the actual data center. The latest innovations in this field incorporate 3D modeling, power system simulation and computational fluid dynamics to develop a holistic overview of data center performance.
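To make the idea concrete, the what-if logic behind such tools can be illustrated with a toy capacity check. The sketch below is purely hypothetical (all names, racks and budget figures are invented for illustration); real virtual-facility software couples 3D models, power simulation and CFD, which a few lines of arithmetic cannot replicate.

```python
# Toy what-if model for a consolidation move (illustrative only; not a
# substitute for full power-system and CFD simulation).
from dataclasses import dataclass

@dataclass
class Rack:
    name: str
    power_kw: float   # projected IT load
    heat_kw: float    # heat rejected (roughly equal to IT power draw)

@dataclass
class Hall:
    power_budget_kw: float    # available UPS/PDU capacity
    cooling_budget_kw: float  # available cooling capacity

def what_if(hall: Hall, racks: list) -> dict:
    """Check a proposed layout against the hall's power and cooling budgets."""
    power = sum(r.power_kw for r in racks)
    heat = sum(r.heat_kw for r in racks)
    return {
        "power_kw": power,
        "power_ok": power <= hall.power_budget_kw,
        "cooling_kw": heat,
        "cooling_ok": heat <= hall.cooling_budget_kw,
    }

# Trial move: migrate racks from two closing sites into one target hall.
hall = Hall(power_budget_kw=120.0, cooling_budget_kw=110.0)
incoming = [Rack("site-A-01", 40.0, 40.0),
            Rack("site-A-02", 35.0, 35.0),
            Rack("site-B-01", 50.0, 50.0)]
report = what_if(hall, incoming)
print(report)  # 125 kW against a 120 kW budget: this plan needs revision
```

The value of the approach is that a failing plan is caught on paper, before any hardware is moved; a real virtual facility answers the same question with far higher fidelity.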
Simulation has become a pivotal tool in helping organizations execute business-defining transitions with minimal risk of failure. Through effective use of this technology, federal data center managers are able to optimize the delivery of consolidation projects, with reduced risk and on a timescale that meets the mandated deadline. It is past time for federal agencies to get serious about leveraging advanced technology to create more efficient and agile data centers.
Akhil Docca is the director of Marketing at Future Facilities.