Building on old foundations doesn't make sense.
Already this year, we have gained much clarity regarding people and policy on the federal IT scene. We’ve seen the appointment of a new federal CIO, more certainty around the federal budget and the kickoff of Modernizing Government Technology (MGT) Act working capital funds. These changes present a great opportunity for government IT modernization, but it’s important that we don’t lose sight of critical basics in the rush to bring the federal government up to speed.
The focus on innovation and jump-starting federal IT modernization is no surprise; government IT leaders have felt this need for years. Technology is progressing at amazing speed—just look at cloud solutions, artificial intelligence, machine learning, the internet of things and collaboration technologies. The essential building blocks exist for creating a smarter, more efficient, and more effective government. The federal government absolutely should take advantage of these innovations.
But while this shift is happening, keep in mind that we cannot build the technology of the future on the foundations of the past.
With the introduction of MGT, there is a push to jump-start IT projects that promise significant savings, but an important qualification is that agencies are on the hook to pay back those funds, whether or not their projects generate forecasted benefits. That is a tricky position for federal agencies to be in, and if the vital, foundational building blocks are not in place, agencies could find their projects underwater.
Data is the new currency for 21st-century government, and modern tools such as AI and IoT rely on instantaneous access to large, secure, high-quality data sets. Within the last two years, the compute required to run bleeding-edge, deep-learning algorithms has jumped 15-fold, and the compute delivered by GPUs has jumped 10-fold, while legacy data storage has remained stagnant and slow. Legacy systems were designed in an era with an entirely different set of requirements for speed, capacity and density; they will be a bottleneck for modern systems and applications.
We need to begin building the foundation of a modernized government today by embracing a data-centric architecture that is consolidated and simplified; real-time, on-demand and self-driving; and able to support the multi-cloud reality. This requires reimagining data platforms from the ground up. Slow data platforms mean slow machine-learning performance, stalling AI advancement and leaving valuable insights imprisoned within mountains of untapped data.
Several key characteristics define data platform and storage requirements for the cloud era:
- A highly parallel application architecture that can support thousands to tens of thousands of composite applications sharing petabytes of data, versus tens to hundreds of monolithic applications consuming terabytes of data siloed to each application.
- Elastic scale to petabytes that allows organizations to pay as they grow, with perpetual forward compatibility.
- Full automation to minimize management resources required to maintain the platform.
- The ability to support and span multiple cloud environments from core data centers to edge data centers, as well as across multi-cloud infrastructure-as-a-service and software-as-a-service providers.
- An open development platform versus a closed ecosystem built on complex one-off storage software solutions.
- A subscription-consumption model that supports constant innovation and eliminates the churn of endlessly expanding storage to meet growing needs and refreshing it every three to five years.
Progress on modernization can only move as fast as its slowest technology allows. Without modern data platforms rooted in a data-centric architecture, even the most sophisticated and innovative modernization projects may bear no fruit, costing agencies more than they gain and, worse yet, stunting their forward progress.
Nick Psaki is a principal in Pure Storage’s Office of the Chief Technology Officer.