
Making Agile Real: How a Virtual Approach Boosts Federal Efficiency


Rick Caccia is vice president of strategy at Delphix.

In 2009, the federal government produced 848 petabytes of data. To put that into perspective, it is almost 1 million movies' worth of data – and that was five years ago. With data growth on an exponential upswing, experts at IDC estimate that America will have created, replicated and consumed 6.6 zettabytes by 2020. This growth of data is proving beneficial in a number of ways, from powering data analytics to building better mission-critical applications. But it also comes with its fair share of challenges.

Driven by initiatives like the Federal Data Center Consolidation Initiative, the Federal Information Technology Acquisition Reform Act and the Digital Accountability and Transparency Act, agencies are urged to modernize, migrate and consolidate their data. This process, however, often means contending with redundant infrastructure, slow processes and high-risk profiles. By managing data in a new, agile way, agencies can avoid the typical obstacles and enable fast, flexible and efficient access to higher-quality data in half the time.

Virtual Data Management

In a nutshell, virtual data management is the process of virtualizing an entire application stack – including binaries, files and databases – to quickly deliver full environments for development, testing and reporting needs. To do this, data is virtualized inside databases, data warehouses, applications and files. When additional copies of applications are made, each copy shares data blocks to make it appear and behave like its own separate environment. But because these data blocks are shared, each environment takes up minimal space and can be created in minutes. There are three major benefits users see from this approach:

Accelerated Development: As test engineers are able to access test environments and restore applications in a matter of minutes, projects can be completed in half the time.

Reduced Operating Costs: Agile data management eliminates nearly 90 percent of the data infrastructure costs and delays associated with database-driven applications.

Improved Data Compliance: With the ability to mask data once and make it available for all systems, agencies can ensure adherence to strict compliance standards and avoid releasing personal information to third parties and nonproduction or testing systems.
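The block-sharing mechanism described above is essentially copy-on-write cloning: a new environment starts as a map of pointers to existing data blocks, and only blocks that are actually modified get private storage. A minimal sketch in Python (all names are hypothetical illustrations, not any vendor's implementation):

```python
# Sketch of copy-on-write block sharing, the mechanism that lets
# virtual copies be created in minutes with minimal storage.
# Hypothetical illustration only.

class VirtualCopy:
    """An environment that shares blocks with its source until written."""

    def __init__(self, blocks):
        # Shallow copy: new block map, but the block data is still shared.
        self.blocks = dict(blocks)

    def read(self, index):
        return self.blocks[index]

    def write(self, index, data):
        # Copy-on-write: only the modified block gets private storage.
        self.blocks[index] = data


# A "production" dataset of 4 blocks.
source = {i: f"block-{i}" for i in range(4)}

# Cloning costs O(number of block pointers), not O(data size).
clone = VirtualCopy(source)
clone.write(2, "patched")

assert clone.read(2) == "patched"   # the clone sees its private block
assert source[2] == "block-2"       # production data is untouched
assert clone.read(0) is source[0]   # unmodified blocks remain shared
```

Because unmodified blocks are shared, each additional test or development copy consumes space only for the blocks it changes, which is why many full environments can coexist with little storage overhead.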

Case Examples

As noted, virtual data management has the ability to help agencies meet a number of pressing federal mandates. Here is a look at how it can help some of the more prevalent federal drivers:

Application Development

Driven by initiatives like the DATA Act, agencies are encouraged to grow more transparent and roll out mission-critical and citizen-facing apps, but time and budget often stand in the way. With a virtual, agile approach, however, agencies can speed up the delivery of these apps by creating several copies of data to conduct parallel testing and development – increasing project output by over 500 percent.

Data Center Consolidation

With FDCCI deadlines calling for the closure of 1,200 data centers by 2015, agencies need to consolidate and modernize their data fast. Some of the biggest obstacles to consolidation are the risk of production system downtime, lost data and a lack of resources. By virtualizing stacks of data and gaining the ability to roll back or forward to any point in time, agencies can instantly access all of their critical data in preserved states, ensure the stability of production systems and move applications with little storage overhead or resource drain.

Cloud Migration

By Dec. 31, 2015, agencies need to move to the cloud, but complexity and risk concerns are stalling progress. By virtualizing data and application stacks, agencies can make exact copies of data and applications in minutes and move the data and apps to the new environment while maintaining the integrity of the data, greatly minimizing the risks. This can be done for private or public clouds, including AWS, among others. This helps agencies simplify and automate the migration of their data from legacy systems to the cloud, resulting in a more successful transition.

A proposed decrease of $2 billion or more in technology spending in the FY15 budget means agencies need to maximize their efficiency to meet pressing deadlines. With key federal drivers pressuring agencies to develop applications, consolidate data centers, migrate to the cloud and more, new approaches are needed to maximize these efforts – and an agile data management approach is the solution many agencies can leverage for success.

