Building quantum-safe foundations for federal data resilience

COMMENTARY | A data-first approach can help agencies avoid turning post-quantum cryptography adoption into a multi-year overhaul.
Quantum computing is no longer a distant concept. It is an emerging capability with real implications for national security. The federal government holds some of the most sensitive data in the world, and the question is not if quantum computing will threaten today’s encryption, but when. Data encrypted today could be decrypted tomorrow. This “harvest now, decrypt later” risk has put agencies on the clock.
That urgency is why the conversation around post-quantum cryptography (PQC) has accelerated. The National Institute of Standards and Technology (NIST) finalized its first three PQC standards in August 2024 — FIPS 203 (the ML-KEM key-encapsulation mechanism) and FIPS 204 and 205 (the ML-DSA and SLH-DSA digital signature schemes) — setting the foundation for how organizations can protect encrypted data against future quantum decryption. But while PQC is essential, it is not a silver bullet. Encryption is the final layer of protection. If it becomes the only thing standing between your data and an adversary, too many other defenses have already failed.
The path to quantum resilience starts deeper, in how data is stored, managed and protected every day.
The hidden challenge: data sprawl and visibility
Across the federal landscape, decades of modernization have created an unintended consequence: data sprawl. Critical information lives in multiple environments, including legacy systems, the cloud and disconnected archives. Before agencies can protect data against quantum threats, they must know where it resides and how it moves.
Artificial intelligence and automation can help agencies gain visibility into their data ecosystems, identify sensitive or at-risk assets and classify them for appropriate protection. Visibility also drives operational readiness by ensuring that when PQC algorithms are implemented, agencies can apply them efficiently and intelligently across the right datasets.
This data-first approach is essential to making sure PQC adoption does not turn into a multi-year overhaul. With the right foundation, agencies can modernize encryption while maintaining mission continuity.
Defense in depth, not in isolation
Modern cybersecurity must assume that any single control can fail. Building true resilience means layering protections, starting with the storage layer. It is the one place every piece of data eventually passes through, and the one layer that can anchor security across hybrid environments.
Defense in depth at the storage level means embedding intelligence where the data actually lives. For example, AI-driven anomaly detection can flag potential data breaches or insider activity in real time. Isolated recovery environments can ensure that even if an attack succeeds, restoration can happen safely without reintroducing compromised data.
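The idea behind anomaly detection at the storage layer can be illustrated with a minimal sketch: compare each dataset's access volume against its own baseline and flag sharp deviations. The thresholds, metric and z-score approach here are illustrative assumptions, not any vendor's implementation.

```python
import statistics

def flag_anomalies(hourly_reads, threshold=2.0):
    """Flag hours whose read volume deviates sharply from the baseline.

    hourly_reads: read counts per hour for one dataset.
    Returns indices of hours whose z-score exceeds the threshold.
    """
    mean = statistics.mean(hourly_reads)
    stdev = statistics.pstdev(hourly_reads)
    if stdev == 0:  # perfectly flat activity: nothing to flag
        return []
    return [i for i, count in enumerate(hourly_reads)
            if abs(count - mean) / stdev > threshold]

# A sudden spike in reads -- the signature of a bulk exfiltration
# attempt -- stands out against an otherwise steady baseline:
reads = [100, 110, 95, 105, 98, 102, 97, 5000]
print(flag_anomalies(reads))  # [7]
```

Production systems use far richer signals (entropy of written data, access patterns per identity, time of day), but the principle is the same: the storage layer sees every read and write, so it is well placed to notice when behavior departs from the norm.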
These proactive defenses form the connection between encryption and resilience. They let agencies not only anticipate and detect threats early but also respond and recover with confidence.
Quantum-ready does not mean quantum-only
While quantum computing represents a major technological shift, its implications mirror challenges we have seen before. When artificial intelligence first emerged, organizations realized that without strong data foundations, even the most advanced models failed to deliver results. The same principle applies here. Quantum security starts with data integrity and visibility.
Preparing for PQC is not about tearing down existing systems. It is about extending what already works. Agencies can align modernization efforts with NIST guidance by focusing on three core priorities:
- Inventory and classify data to identify what needs PQC protection first.
- Adopt flexible, hybrid infrastructure that can apply new cryptographic standards across on-premises and cloud environments.
- Integrate AI-driven defense mechanisms that continuously monitor and learn from evolving threat patterns.
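The first priority — inventory and classify before migrating — can be sketched in a few lines: tag each dataset with a sensitivity tier, then order the PQC migration so the most exposed data is re-encrypted first. The tier names and fields below are illustrative assumptions, not an agency classification standard.

```python
# Illustrative sensitivity tiers, ordered from highest to lowest risk.
SENSITIVITY_ORDER = {"classified": 0, "pii": 1, "internal": 2, "public": 3}

def pqc_migration_order(inventory):
    """Sort a data inventory so the highest-risk datasets come first."""
    return sorted(inventory, key=lambda d: SENSITIVITY_ORDER[d["sensitivity"]])

inventory = [
    {"name": "press-releases", "sensitivity": "public"},
    {"name": "citizen-records", "sensitivity": "pii"},
    {"name": "intel-archive", "sensitivity": "classified"},
]

for dataset in pqc_migration_order(inventory):
    print(dataset["name"])
# intel-archive
# citizen-records
# press-releases
```

A real inventory would be built from automated discovery and classification tooling rather than a hand-written list, but the ordering principle — protect the data with the longest confidentiality lifetime first — is exactly what makes "harvest now, decrypt later" a tractable risk.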
These steps build quantum readiness while strengthening cyber resilience overall, a win that extends beyond encryption.
The broader mission: trust through readiness
Public trust in digital government depends on data protection. Whether managing classified intelligence, citizen records or public health information, agencies cannot rely on point solutions. A resilient data infrastructure enables them to pivot as threats evolve without sacrificing agility, compliance or mission performance.
Quantum-safe modernization is part of a larger journey toward intelligent, adaptive data systems. It is about ensuring that the infrastructure supporting government missions today can stand up to the unknowns of tomorrow.
In the end, the goal is not just to survive the quantum era. It is to be ready for it, with data that can defend itself.
Gagan Gulati is SVP and GM of the Data Services business at NetApp, where he leads the product and engineering teams and holds overall P&L responsibility for the company's cyber resilience and AI portfolio, advancing NetApp's vision of intelligent data infrastructure.




