Presented by FedTech
Agencies turn to technology advances and more efficient infrastructure.
In data centers, like any piece of real estate, every square foot matters.
“Any way we can consolidate, save space and save electricity, it’s a plus,” says the State Department’s Mark Benjapathmongkol, a division chief of the agency’s Enterprise Server Operation Centers.
In searching out those advantages, the State Department has begun investing in solid-state drives (SSDs), which provide improved performance while occupying substantially less space in data centers.
In one case, IT leaders replaced a disk storage system with SSDs and gained almost three racks’ worth of space, Benjapathmongkol says. Because SSDs are smaller and denser than hard disk drives (HDDs), IT staff don’t need to deploy extra hardware to meet speed requirements, resulting in massive space and energy savings.
Such transitions are becoming commonplace. Improving storage management is a pillar of the government’s effort to optimize data centers. To meet requirements from the Federal Information Technology Acquisition Reform Act (FITARA), the Data Center Optimization Initiative requires that agencies transition to cost-effective infrastructure.
While agencies are following different paths, the result is nearly identical: simpler and more efficient storage management, consolidation, increased reliability, improved service and cost savings. The U.S. Agency for International Development, for example, has committed to cloud storage.
“Our customers have different needs. The cloud allows us to focus on categorizing our data based on those needs like fast response times, reliability, availability and security,” says Lon Gowen, USAID’s chief strategist and special advisor to the CIO. “We find the service providers that meet those category requirements, and then we let the service providers focus on the details of the technology.”
Options for Simplifying Storage Management
Agencies can choose from multiple technology options to manage their storage more effectively and efficiently, says Greg Schulz, founder of independent analyst firm Server StorageIO. These options include SSDs and cloud storage; storage features such as deduplication and compression, which eliminate redundancies and store data using less space; and thin provisioning, which makes better use of available capacity, Schulz says.
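To illustrate one of the features Schulz mentions, compression shrinks data by collapsing redundancy within a stream. The sketch below is illustrative only (the log-like payload is invented); it uses Python’s standard zlib library to show how highly repetitive data can be stored in a fraction of its raw size.

```python
import zlib

# Illustrative payload: repetitive, log-like data compresses dramatically.
payload = b"GET /index.html 200\n" * 10_000   # 200,000 raw bytes

compressed = zlib.compress(payload, level=6)

print(len(payload))                            # raw size in bytes
print(len(compressed))                         # compressed size: far smaller
```

Real-world data is less uniform than this example, so typical compression ratios are far more modest, but the mechanism is the same.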
Consider the Defense Information Systems Agency. During the past year, the combat support agency has modernized its storage environment by investing in SSDs. Across DISA’s nine data centers, about 80 percent of information is stored on SSD arrays and 20 percent is running on HDDs, says Ryan Ashley, DISA’s chief of storage.
SSDs have allowed the agency to replace every four 42U racks with a single 42U rack, resulting in 75 percent savings in floor space as well as reduced power and cooling costs, he says.
Deduplication Creates Efficiencies
Beyond space savings and faster performance than HDDs, SSDs bring additional storage efficiencies. These include new management software that automates tasks, such as provisioning storage when new servers and applications are installed, Ashley says.
The management software also allows DISA to centrally manage storage across every data center. In the past, the agency used four to eight instances of management software in individual data centers.
“It streamlines and simplifies management,” Ashley says. Automatic provisioning reduces human error and ensures the agency follows best practices, while central management eliminates the need for the storage team to switch from tool to tool, he says.
DISA also has deployed deduplication techniques to eliminate storing redundant copies of data. IT leaders recently upgraded the agency’s backup technology from a tape system to a disk-based virtual tape library. This type of approach can accelerate backup and recovery and reduce the amount of hardware needed for storage.
It also can lead to significant savings because DISA keeps backups for several weeks, meaning it often owns multiple copies of the same data. But thanks to deduplication efforts, the agency can store more than 140 petabytes of backup data with 14PB of hardware.
“It was a huge amount of floor space that we opened up by removing thousands of tapes,” says Jonathan Kuharske, DISA’s deputy of computing ecosystem.
Tiered Storage Optimizes Performance and Savings
The State Department, which began investing in SSDs three years ago, hopes to move exclusively to flash storage in its four core data centers as its HDDs reach their end of life, SSD prices continue to drop and the agency budget permits, Benjapathmongkol says.
Today, the State Department stores 30 percent of its data on SSD and the remaining information on spinning disks. By September, IT leaders expect SSDs to account for about half of the agency’s storage. The State Department is moving toward SSDs because of features such as built-in data-at-rest encryption, which boosts security, he says.
To optimize performance, the State Department segments its storage across three tiers of hardware: SSDs, SAS drives and slower, lower-cost nearline SAS drives.
The agency places frequently accessed data and information with high input/output operations per second (IOPS) requirements on SSDs. Data that staff access infrequently or that doesn’t require higher speeds is housed on spinning disks, Benjapathmongkol says.
To help with this task, the IT staff uses EMC fully automated storage tiering (FAST) software to reprioritize storage needs each day. The software monitors data usage and moves information to the appropriate tier based on how active or inactive the data is, Benjapathmongkol says.
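The tiering logic described above can be sketched simply. The code below is a hypothetical illustration, not EMC FAST’s actual algorithm: every file name, threshold, and access count is invented. It assigns data to a tier based on daily access activity, with an optional manual pin for workloads that need constant IOPS.

```python
# Hypothetical three-tier model: flash, SAS, and nearline SAS.
TIERS = ("ssd", "sas", "nl_sas")

def assign_tier(reads_per_day, pinned_tier=None):
    """Pick a tier from daily access counts; a manual pin overrides."""
    if pinned_tier is not None:     # placed by staff for constant IOPS
        return pinned_tier
    if reads_per_day >= 1000:       # hot data -> flash
        return "ssd"
    if reads_per_day >= 10:         # warm data -> SAS drives
        return "sas"
    return "nl_sas"                 # cold data -> nearline SAS

# Invented daily usage stats, re-evaluated once per day.
daily_stats = {"payroll.db": 250_000, "q3_report.pdf": 40, "2012_logs.tar": 0}
placement = {name: assign_tier(reads) for name, reads in daily_stats.items()}
print(placement)
```

A real tiering engine moves sub-file extents rather than whole files and weighs recency as well as frequency, but the decision structure, monitor activity, then promote or demote, is the same.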
In some cases, IT staff places data on a specific tier for performance reasons, where it stays unless moved.
“It is done manually if we need to maintain a constant IOPS,” Benjapathmongkol says. “But if there’s not a hard IOPS required, we automate to give the app the best performance.”
DISA manages its storage tiers in a similar fashion, with a mix of manual and automated tiering. For its virtualized operating environments, the agency turns to automated tiering. When mission partners choose alternative storage options, however, data may be placed manually, Ashley says.
Historically, IT departments have used different tiers of storage hardware (depending on price and performance needs) to reduce overall costs and improve efficiency. At DISA, some mission partners are using slower, less expensive, disk-based storage for archive data, Ashley says.
Categorize Data to Go Cloud First
To comply with the government’s “Cloud First” edict, USAID began migrating to cloud services, including infrastructure and software services, about seven years ago.
Previously, USAID managed its own data centers and tiered its storage. But the agency moved its data to cloud storage three years ago, Gowen says, allowing USAID to provide reliable, cost-effective IT services to its 12,000 employees across the world. The agency, which declined to offer specific return on investment data, currently uses a dozen cloud providers.
“We carefully categorize our data and find service providers that can meet those categories,” says Gowen, noting categories include availability and security. “They just take care of things at an affordable cost.”
For its public-facing websites, the agency uses a cloud provider that has a content distribution network and can scale to handle sudden spikes in traffic.
In late 2013, a typhoon lashed the Philippines, killing at least 10,000 people. In the days following the disaster, President Obama announced that USAID had sent supplies, including food and emergency shelter. Because the president mentioned USAID, about 40 million people visited the agency’s website. If USAID had hosted its own site, it would have crashed. But the cloud service provider handled the traffic, Gowen says.
“Our service provider can scale instantaneously to 40 million users, and when visitors drop off, we scale back,” he says. “It’s all handled.”
This content is made possible by FedTech. The editorial staff of Nextgov was not involved in its preparation.