An innovative take on data management could result in better access to higher-quality data.
Rick Caccia is vice president of strategy at Delphix.
In 2009, the federal government produced 848 petabytes of data. To put that into perspective, that is almost 1 million movies' worth of data – and that was five years ago. With data growth on an exponential upswing, experts at IDC estimate that America will have created, replicated and consumed 6.6 zettabytes by 2020. This growth is proving beneficial in a number of ways, from enabling data analytics to building better mission-critical applications. But it also comes with its fair share of challenges.
Driven by initiatives like the Federal Data Center Consolidation Initiative, Federal Information Technology Acquisition Reform Act and the Digital Accountability and Transparency Act, agencies are urged to modernize, migrate and consolidate their data. This process, however, often means redundant infrastructures, slow processes and high-risk profiles. By managing data in a new, innovative and agile way, agencies can avoid the typical obstacles and enable fast, flexible and efficient access to higher-quality data in half the time.
Virtual Data Management
In a nutshell, virtual data management is the process of virtualizing an entire application stack – including binaries, files and databases – to quickly deliver full environments for development, testing and reporting needs. To do this, data is virtualized inside databases, data warehouses, applications and files. When additional copies of an application are made, each copy shares the underlying data blocks yet appears and behaves like its own separate environment. Because those data blocks are shared, each environment takes up minimal space and can be created in minutes. There are three major benefits users see from this approach:
Accelerated Development: Because test engineers can provision test environments and restore applications in a matter of minutes, projects can be completed in half the time.
Reduced Operating Costs: Agile data management eliminates nearly 90 percent of the data infrastructure costs and delays associated with database-driven applications.
Improved Data Compliance: With the ability to mask data once and make it available for all systems, agencies can ensure adherence to strict compliance standards and avoid releasing personal information to third parties and nonproduction or testing systems.
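The shared-block model behind these benefits can be illustrated with a short Python sketch. The class and method names here are hypothetical, not any real product's API: each virtual copy reads unmodified blocks from a shared base snapshot and stores only its own changes, and data masked once at the base is inherited by every copy.

```python
# Hypothetical sketch of copy-on-write virtual copies: every environment
# shares the base snapshot's blocks and stores only the blocks it changes.

class BaseSnapshot:
    def __init__(self, blocks):
        self.blocks = dict(blocks)  # block id -> data, shared by all copies

    def mask(self, transform):
        # Mask sensitive data once at the source; every copy sees the result.
        for block_id in self.blocks:
            self.blocks[block_id] = transform(self.blocks[block_id])

class VirtualCopy:
    def __init__(self, base):
        self.base = base
        self.delta = {}  # only changed blocks consume new space

    def read(self, block_id):
        # Prefer this copy's own change; otherwise fall through to the base.
        return self.delta.get(block_id, self.base.blocks[block_id])

    def write(self, block_id, data):
        self.delta[block_id] = data  # copy-on-write: diverge locally only

    def footprint(self):
        return len(self.delta)  # extra blocks this environment consumes

base = BaseSnapshot({0: "ssn=123-45-6789", 1: "name=Jane"})
base.mask(lambda d: d.replace("123-45-6789", "XXX-XX-XXXX"))

dev_copy = VirtualCopy(base)   # "provisioned" instantly, no blocks copied
qa_copy = VirtualCopy(base)
dev_copy.write(1, "name=TestUser")

print(dev_copy.read(0))        # masked data inherited from the base
print(dev_copy.footprint(), qa_copy.footprint())  # only deltas use space
```

The same masking pass serves every downstream environment, which is why masking once at the source keeps personal information out of nonproduction systems.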
As noted, virtual data management has the ability to help agencies meet a number of pressing federal mandates. Here is a look at how it can help some of the more prevalent federal drivers:
Application Development
Driven by initiatives like the DATA Act, agencies are encouraged to grow more transparent and roll out mission-critical and citizen-facing apps, but time and budget often stand in the way. With a virtual, agile approach, however, agencies can speed up the delivery of these apps by creating several copies of data to conduct parallel testing and development – increasing project output by over 500 percent.
Data Center Consolidation
With FDCCI deadlines aiming to close 1,200 data centers by 2015, agencies need to consolidate and modernize their data quickly. Some of the biggest obstacles to consolidation are the risk of production system downtime, lost data and a lack of resources. By virtualizing data stacks and gaining the ability to roll back or forward to any point in time, agencies can instantly access all of their critical data in preserved states, ensure the stability of production systems and move applications with little storage overhead or extra resources.
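The roll-back/roll-forward capability described above amounts to keeping a timeline of point-in-time snapshots and provisioning an environment as of any captured moment. A minimal sketch, assuming simple in-memory snapshots (the `Timeline` class and its methods are illustrative, not a real product's API):

```python
# Hypothetical point-in-time provisioning: retain a timeline of snapshots
# and provision an environment "as of" any moment on that timeline.
import bisect

class Timeline:
    def __init__(self):
        self.times = []  # sorted capture timestamps
        self.snaps = {}  # timestamp -> captured application state

    def capture(self, t, state):
        bisect.insort(self.times, t)
        self.snaps[t] = state

    def provision(self, t):
        # Find the latest snapshot at or before t (roll back or forward).
        i = bisect.bisect_right(self.times, t) - 1
        if i < 0:
            raise ValueError("no snapshot at or before requested time")
        return self.snaps[self.times[i]]

tl = Timeline()
tl.capture(100, {"schema": "v1"})
tl.capture(200, {"schema": "v2"})

print(tl.provision(150))  # state as captured at t=100
print(tl.provision(250))  # state as captured at t=200
```

Because each provisioned state is served from a preserved snapshot rather than the live production system, a migration can proceed against a consistent copy while production keeps running.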
Cloud Migration
By Dec. 31, 2015, agencies need to move to the cloud, but complexity and risk concerns are stalling progress. By virtualizing data and application stacks, agencies can make exact copies of data and applications in minutes and move them to the new environment while maintaining the integrity of the data, greatly minimizing the risks. This can be done for private or public clouds, including AWS, among others. It helps agencies simplify and automate the migration of their data from legacy systems to the cloud, resulting in a more successful transition.
A proposed decrease of $2 billion or more in technology spending in the FY15 budget means agencies need to maximize their efficiency to meet pressing deadlines. With key federal drivers pressuring agencies to develop applications, consolidate data centers, migrate to the cloud and more, new approaches are needed to maximize these efforts – and an agile data management approach is one many agencies can leverage for success.