Commentary: Get your priorities straight before consolidating data centers

In the current economic climate, it seems everyone is being asked to do more with less. Nowhere is this truer than in the public sector, where budgets continue to be slashed. Although state and local government agencies aren’t subject to the federal data center consolidation initiative, many are expected to carry out similar consolidation projects in an effort to reduce waste.

The problem with such initiatives is that they focus on cost-cutting with little regard for how (or how well) IT is able to do its job, which is to provide applications and services that deliver tangible benefits to individuals and the organization as a whole.

Many IT departments have spent years building multiple data centers for the very purpose of improving the delivery of applications and services to users, and providing for disaster recovery. Shutting down secondary data centers and centralizing applications can present significant challenges for IT staff. While there are substantial benefits to centralizing and virtualizing servers, applications and storage, hastily consolidating centers without the proper focus can actually backfire, costing organizations far more money than they expect to save.

What’s more, applications can take a significant performance hit when they are moved to new, virtualized environments. This can lead to a stream of remedial efforts—deploying specialized solutions and appliances; custom-coding applications; endlessly tweaking the network—all in hopes of improving performance. One-off solutions deployed to fix these unanticipated problems only introduce more complexity in the infrastructure. Ultimately, agencies can end up with kludgy solutions that are difficult and expensive to manage and maintain.

If that’s not enough, data center consolidation often puts applications and data at risk because security can become a trade-off for performance and availability. As startling as that seems, it often comes down to IT officials making a pragmatic choice because the cost of non-compliance with security mandates is sometimes less severe than failure to deliver applications.

So, what are agencies to do? How do IT officials comply with such mandates without compromising security and the quality of the services they deliver?

Rethinking the Approach

Although the data center consolidation initiative highlights cost savings as a key benefit, IT departments can’t afford to take a myopic, purely cost-driven approach. And while best-practices advice is available from many sources, much of it focuses on issues such as accurately estimating time and skills required, ensuring open communication, obtaining expert assistance, implementing rigorous testing and defining criteria for success.

These are all critical factors to consider, of course. But managers must first step back and take a holistic view that focuses on the desired outcome—to successfully (and safely) connect users to the mission-critical applications they need to do their jobs and ensure constituents have ready access to the information they need. That goal must remain constant, no matter the size or scale of the consolidation. After all, what value is there in saving money through consolidation if, in the end, the IT staff isn’t able to meet the needs of its customers?

Since the network is the mechanism by which applications are delivered to users, it’s critical that IT departments make application delivery networking (ADN) a central consideration in their consolidation plans. If managers hope to fully realize the benefits of virtualized, consolidated environments, this refocused approach is essential.

ADN is a vendor-neutral approach to computing that encompasses application acceleration and availability as well as network and application security. The goal of ADN is to ensure that applications are always fast, secure, and available, regardless of where they’re hosted or the type of device on which they’re delivered. ADN solutions are implemented at the “middle layer” between the physical network (switches) and the applications themselves. This gives the IT staff a great deal of flexibility to deliver reusable, integrated services for resources, applications, and clients.

At a minimum, an ADN solution includes an application delivery controller that uses various optimization techniques (such as proxying, caching, SSL offloading, Transmission Control Protocol optimization, and rate shaping) to intelligently manage and route traffic to the best-performing server, either within a single data center or across multiple data centers. Application delivery controllers are often used in conjunction with wide-area network optimization solutions in branch offices or secondary data centers to enhance network performance and reduce the amount of data transferred across the network.
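To make the routing idea concrete, here is a minimal sketch of "best-performing server" selection, the core decision an application delivery controller makes. The class, backend hostnames, and smoothing parameter are all hypothetical; production controllers also weigh health checks, connection counts, and data center proximity.

```python
class LeastLatencyBalancer:
    """Toy illustration: route each request to the backend with the
    lowest observed latency, tracked as an exponentially weighted
    moving average. Backend names here are made up for the example."""

    def __init__(self, backends):
        # Running latency estimate (seconds) per backend, starting at zero.
        self.latency = {b: 0.0 for b in backends}

    def choose(self):
        # Pick the backend with the lowest average latency so far.
        return min(self.latency, key=self.latency.get)

    def record(self, backend, seconds, alpha=0.3):
        # Blend the newest measurement into the running average.
        self.latency[backend] = (1 - alpha) * self.latency[backend] + alpha * seconds


balancer = LeastLatencyBalancer(["dc1.example.gov", "dc2.example.gov"])
balancer.record("dc1.example.gov", 0.120)  # primary data center, slower today
balancer.record("dc2.example.gov", 0.045)  # secondary data center, faster
print(balancer.choose())  # prints the faster backend: dc2.example.gov
```

The same selection logic works whether the candidate servers sit in one consolidated data center or are spread across several, which is why the controller, not the application, is the natural place for this decision.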

As the number of mobile users and devices grows, and cyber attacks become more sophisticated and frequent, it's critical that IT departments provide adequate protection for their networks, applications, data and users. Many ADN solutions now offer integrated Web application firewall capabilities and access policy management. It's far more efficient and cost-effective to centrally manage security policies than to spend resources and effort individually securing each potential vulnerability.
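The advantage of central policy management is that one rule set is enforced in front of every application, rather than security logic being re-implemented per app. A minimal sketch of that idea follows; the paths, role names, and default-deny behavior are illustrative assumptions, not any vendor's actual policy format.

```python
# Hypothetical centralized access policy: one table consulted for every
# request, regardless of which backend application serves it.
POLICY = {
    "/admin": {"admin"},            # restricted to administrators
    "/records": {"admin", "clerk"}, # either role may access
    "/public": set(),               # empty set: no role required
}

def is_allowed(path, user_roles):
    """Return True if any of the user's roles satisfies the rule for path.
    Unknown paths are denied by default."""
    required = POLICY.get(path)
    if required is None:
        return False   # default deny: path not covered by policy
    if not required:
        return True    # open resource
    return bool(required & set(user_roles))

print(is_allowed("/records", ["clerk"]))  # True
print(is_allowed("/admin", ["clerk"]))    # False
```

Changing a rule in the central table immediately applies to every application behind the controller, which is the efficiency the paragraph above describes.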

Many ADN solutions incorporate advanced intelligence that gives IT staff much-needed visibility into all client-application interactions. With the proper tools to understand the context of those interactions, staff can intelligently control the flow of network traffic, eliminating bottlenecks and optimizing application performance, whether across one, two, or a dozen data centers. The most comprehensive ADN solutions are available and deployable through a unified technology platform. This gives IT managers the architectural building blocks to create a simplified infrastructure that not only performs services on behalf of the applications themselves, but also supports and leverages the full benefits of data center virtualization.

And that brings us back to the original goal, which is to enable the IT shop to do its job—providing applications and services that deliver tangible benefits to individuals and the organization. By stepping back from the purely cost-centered approach and realizing the importance that ADN plays in their overall success, IT departments can create better, more successful consolidation plans—plans that not only deliver end-to-end security, performance, and availability in the most operationally efficient way, but are far more likely to achieve cost-cutting goals as well.

Nick Urick is regional vice president for F5 Networks.
