Deft consolidation

Agencies turn to server virtualization to reduce the risk and liability of costly computer sprawl.

The Food and Drug Administration kicked off its server virtualization project for the same reason many government agencies do: It wanted to consolidate dozens of Microsoft Windows servers, in this case as part of its move to a new state-of-the-art data center at the White Oak, Md., campus in the Washington, D.C., metro area.

Virtualization divides a physical server into isolated virtual environments, so organizations can run multiple applications or operating systems without worrying that a crash on one virtual machine (VM) will affect another or that data will migrate between VMs and applications.

But Linda Burek, the FDA’s chief information officer, and her team saw more possibilities in the technology, which now plays a leading role in the agency’s Information Technology Transformation Program to move to a modern, utility-based operating environment.

“We need to reduce the cost and human resources required to maintain the infrastructure and free up some resources to put on more strategic types of initiatives,” Burek said. She expects demands on the IT organization from the FDA’s reviewer community to increase in an emerging world of high-end, computer-intensive developments in areas such as gene therapy.

The agency, which worked with Booz Allen Hamilton on its server consolidation solution, will soon implement Phase 4 of its virtualization project. That phase includes upgrading all hosts from VMware ESX Server Version 2.5.3 to the vendor’s Virtual Infrastructure 3 product, which adds self-healing capabilities such as VM load balancing and failover. “We’re striving to move toward an environment that is more automatically run by itself,” Burek said.

Virtualization isn’t mainstream yet, but some government agencies are buying into the technology. At the FDA, for instance, six VMware ESX hosts are running more than 80 VMs. If the agency receives funding to complete the infrastructure portion of its IT Transformation Program, it will virtualize more than 400 Web, application, file, print, database, e-mail and legacy domain controller servers by the end of fiscal 2009.

Info-Tech Research Group reports that about 56 percent of government departments or agencies with 100 or more employees are using, testing or planning to use the technology, and 16 percent of that group is already running it in production.

In the Windows and Linux space, major virtualization vendors include market leader VMware, Microsoft, XenSource and SWsoft. Those vendors take different approaches to virtualization, but they all agree that facilities consolidation, budget constraints, disaster recovery requirements, power concerns, testing and IT transformation projects are catalysts for server virtualization efforts in government agencies. “Virtualization reduces the risks of addressing these issues,” said Aileen Black, vice president of federal operations at VMware.

Applications are packaged into VM files, so users can rapidly save, copy, provision or move them. That flexibility has attracted the attention of officials at the National Operations Group, the arm of the Federal Aviation Administration that operates the National Operations Coordination Center, which tracks and reports National Airspace System equipment-related events. The center uses SWsoft’s Virtuozzo to build standard configurations and deploy provisioned servers in minutes, said Mariano Pernigotti, a support contractor and senior systems engineer at engineering services firm CSSI.
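The template approach Pernigotti describes boils down to stamping out new servers from a saved, known-good configuration. The sketch below is a hypothetical Python illustration of that pattern, not Virtuozzo’s actual interface; the template fields, package names and host names are invented.

```python
from copy import deepcopy
from dataclasses import dataclass, field

@dataclass
class ServerTemplate:
    """A standard configuration kept in the team's template archive (illustrative only)."""
    name: str
    os_image: str
    vcpus: int
    memory_mb: int
    packages: list[str] = field(default_factory=list)

def provision(template: ServerTemplate, hostname: str, overrides: dict | None = None) -> dict:
    """Clone a template into a concrete server definition, applying per-host overrides."""
    spec = deepcopy(vars(template))      # copy the known-good configuration
    spec["hostname"] = hostname          # the only field that must change per server
    spec.update(overrides or {})         # optional tweaks, e.g. extra memory
    return spec

# Stamp out two portal servers from the same template in seconds.
web_portal = ServerTemplate("web-portal", "windows-2003-sp2", vcpus=2,
                            memory_mb=2048, packages=["web", "reporting"])
for host in ("portal-01", "portal-02"):
    print(provision(web_portal, host))
```

Because every server starts from the same template, a failed or misconfigured machine can be answered by redeploying the known configuration rather than rebuilding it by hand.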
Pernigotti, who works on-site at the FAA Command Center, leads a team responsible for developing and deploying an integrated set of Web-based applications. The team delivers the applications through two Web portals that facilitate the tracking, storing and reporting of the equipment-related event data. The portals consist of multiple servers that support Web, mail, FTP, calendar, charting and reporting functions.

“We’re a small group of six, so we wear multiple hats, and the one hat we don’t have time to wear anymore is the provisioning role,” Pernigotti said.

With a dedicated virtual testing environment, the team can isolate applications in a known configuration and state to ensure they work correctly before deployment. Server virtualization will ultimately allow center officials to reduce the number of servers from 14 to six: three host nodes and three database servers. At the same time, the group is changing its infrastructure operating paradigm in favor of greater application isolation, which is important for stability.

Virtualization also offers government agencies the opportunity to be more efficient and responsive to new business requirements. “As much as 25 [percent] to 30 percent of the cycle of developing and testing a new application is often taken up with hardware issues,” said John Sloan, an Info-Tech senior research analyst. “That goes away with virtualization.” Organizations can keep an archive of server VM images, such as templates, and if they understand an application’s requirements for processing power and storage, they can create a VM in hours.

Organizations won’t realize virtualization’s potential without careful planning, however. It’s probably unrealistic to expect to cram as many VMs onto a physical server as vendors tout, Sloan said.

Figuring out how many servers will be necessary in a virtualized environment requires first measuring the total resource utilization of the existing infrastructure, said Jim Sweeney, solutions architect at GTSI; the sketch below shows the basic arithmetic. “You have to know the resources you are using right now,” he said. “If you don’t know those answers, you will run into problems when you try [to] virtualize them.”

Users should test input/output-intensive applications carefully before moving them into a production environment, and organizations with steep data requirements should also consider high-throughput storage-area networks. “The last thing you need is a bunch of virtualized machines not properly utilizing CPU and memory while they’re waiting for data,” he said.

Matthew Lutz, a CACI contractor who serves as chief architect at the Office of the Secretary of Defense, considers planning an important part of a three-year, five-phase IT transformation project to reduce application and resource duplication across some 1,400 servers and 14 organizations. The project involves mirroring and migrating user and group accounts from various domains into a single enterprise environment based on Microsoft Exchange 2003 and Active Directory. Project developers will use a migration tool from Quest Software, deliver common infrastructure services and create a single infrastructure support team.

CACI accelerated some server consolidation phases of the project because of the Pentagon’s renovation. Lutz’s team had to map the applications of hundreds of affected servers and concluded that 85 percent to 90 percent were candidates for virtualization using VMware ESX Server.
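Sweeney’s sizing advice is, at bottom, arithmetic: add up what the existing servers actually consume, build in headroom, and divide by the capacity of a candidate host. Here is a minimal sketch of that calculation, with the utilization figures, host capacity and 30 percent headroom factor all assumed purely for illustration.

```python
import math

# Measured averages from the existing physical servers (hypothetical figures).
servers = [
    {"name": "web-01",  "cpu_ghz_used": 1.2, "ram_gb_used": 2.5},
    {"name": "db-01",   "cpu_ghz_used": 3.8, "ram_gb_used": 12.0},
    {"name": "file-01", "cpu_ghz_used": 0.6, "ram_gb_used": 1.5},
]

# Capacity of one candidate virtualization host (also hypothetical).
host_cpu_ghz, host_ram_gb = 16.0, 32.0
headroom = 1.3  # keep roughly 30 percent spare for peaks and failover

cpu_needed = sum(s["cpu_ghz_used"] for s in servers) * headroom
ram_needed = sum(s["ram_gb_used"] for s in servers) * headroom

# The scarcer resource, CPU or memory, determines the host count.
hosts_required = max(math.ceil(cpu_needed / host_cpu_ghz),
                     math.ceil(ram_needed / host_ram_gb))

print(f"CPU needed: {cpu_needed:.1f} GHz, RAM needed: {ram_needed:.1f} GB")
print(f"Estimated virtualization hosts required: {hosts_required}")
```

Without the measured inputs, the division at the end is guesswork, which is Sweeney’s point: the consolidation ratio falls out of the data, not the vendor brochure.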
Throughout the consolidation, the team sought to uphold the IT version of the Hippocratic oath: First, do no harm.

“We started doing baseline performance over time to make sure we had a track record of the legacy infrastructure, so when we moved into a virtualized environment we could do a comparative analysis of processor and memory utilization and make sure we were not impacting the performance of the application,” Lutz said.

It is equally important to have a mitigation plan if things don’t work out, he said. VMware ESX’s VMotion feature provides a way to move virtual servers onto other hosts if an application consumes too many resources. Lutz said he can also revert to a dedicated hardware platform using VMware’s virtual-to-physical migration utility.
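The comparative analysis Lutz describes can be reduced to recording a utilization baseline from the legacy servers and flagging any virtualized workload that degrades beyond an agreed tolerance. The sketch below uses made-up sample metrics and an arbitrary 15 percent threshold; it illustrates the idea rather than the OSD team’s actual tooling.

```python
# Average utilization and response samples before and after migration (hypothetical).
baseline    = {"cpu_pct": 35.0, "mem_pct": 60.0, "response_ms": 180.0}
virtualized = {"cpu_pct": 41.0, "mem_pct": 63.0, "response_ms": 195.0}

TOLERANCE = 0.15  # flag anything more than 15 percent worse than baseline

def regressions(before: dict, after: dict, tolerance: float) -> list[str]:
    """Return the metrics that degraded beyond the allowed tolerance."""
    return [metric for metric, old in before.items()
            if after[metric] > old * (1 + tolerance)]

flagged = regressions(baseline, virtualized, TOLERANCE)
if flagged:
    print("Mitigate: migrate the VM to a less loaded host or revert to hardware.", flagged)
else:
    print("Virtualized workload is within tolerance of the legacy baseline.")
```

In practice the mitigation step would be a VMotion move to a roomier host or, as a last resort, the virtual-to-physical rollback Lutz mentions.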

Jennifer Zaino has been covering business-technology issues for the industry’s leading publications since 1986. You may write to her at jennyzaino@hotmail.com.
Steer clear of these virtualization pitfalls

Virtualization has the potential to improve an agency’s infrastructure, but be prepared for user and information technology employee resistance, software licensing hiccups, and other challenges.

One potential issue, said John Sloan, an Info-Tech senior research analyst, is that virtualization takes ownership of a physical server away from its administrator, who will need assurance that the requirements for processing power and disk space will be maintained via service-level agreements. A concern for IT departments is the added complexity that comes with accounting for virtualized server resources and charging back for provisioning of virtual servers, Sloan said.

Virtualization can also rankle the hardware-centric employees in the data center. “An IT manager knows what infrastructure looks like — they just walk into the server room,” Sloan said. But virtualization changes that picture.

“There’s no blue cable to follow anymore,” he said. “It’s a psychological change that requires some education.”

The Food and Drug Administration grappled with getting its IT community to accept the coming paradigm shift, and it needed to train people despite funding constraints. Educating all the stakeholders in the IT community via a road show that included a live demonstration was critical to moving the project forward, FDA officials said.

Externally, expect to face some trouble with vendors concerning licensing software for use in virtualized environments. In a nonvirtualized environment, vendors often charge customers based on the number of physical servers or processors they use to run the software. There is no agreement yet about a licensing model for virtualization, and some vendors may consider it fair game to require users to pay licensing fees for each virtual server.
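The stakes of that licensing question are easy to see with a little arithmetic. The sketch below applies an invented license price to the FDA-scale figures of six hosts and 80 VMs, only to show how far apart the two models can land.

```python
# Hypothetical environment and pricing: six physical hosts consolidating 80 virtual servers.
physical_hosts    = 6
virtual_servers   = 80
price_per_license = 3_000  # invented list price per licensed server

per_physical_host  = physical_hosts * price_per_license
per_virtual_server = virtual_servers * price_per_license

print(f"Licensed per physical host:  ${per_physical_host:,}")
print(f"Licensed per virtual server: ${per_virtual_server:,}")
```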

But some companies are starting to tailor license agreements for virtualization. Microsoft, for instance, now provides unlimited virtualization rights on a physical machine for buyers of Windows Server 2003 R2 Datacenter Edition, and the right to run four additional Windows instances for buyers of Windows Server 2003 R2 Enterprise Edition. The vendor will follow that model with Vista, its next release of Windows.

The FDA has experienced some resistance from application vendors on certifying their software on VMware, said Lowell Marshall, the virtualization project manager at the agency.

“It’s the newness,” Marshall said, as vendors address moving to licensing systems not based on per-processor or per-user models. “But really a software vendor has to get on that bus or miss it. VMware is happening and virtualization is happening.”

— Jennifer Zaino