Cryptographic debt and quantum readiness [Q&A]


As White House Executive Orders, NIST mandates, and international deadlines accelerate the push toward post-quantum encryption, the clock is ticking for organizations still grappling with cryptographic debt.
We spoke to Dave Krauthamer, co-founder and field CTO at QuSecure, to learn more about emerging threats, compliance mandates, and mitigation frameworks for organizations looking to get ahead of the coming disruption.
BN: What is cryptographic debt?
DK: For more than two decades, organizations have woven cryptographic functions directly into the fabric of their applications, believing that encryption would safeguard data. This covers everything from encrypted private data and company secrets to everyday email.
Most organizations operate dozens or even hundreds of applications, each bearing its own set of encryption libraries, certificate authorities, and key management routines. When a vendor goes out of business or deprioritizes updates, that application is left with stale and defeatable cryptography. This out-of-date cryptography, or cryptographic debt, can persist for years or even decades and often takes at least as long to remediate to current versions.
This approach has created a hidden liability, and the risks are now increasing as breakthroughs in quantum computing and artificial intelligence threaten to render existing encryption protections obsolete.
BN: What is the specific threat to organizations?
DK: This long-used embedded encryption, combined with the rapid pace of algorithmic advancements, poses a systemic risk to enterprise security and continuity. Advances in quantum hardware, demonstrated by research labs achieving increasingly stable qubit coherence, threaten to turn organizations’ reservoirs of cryptographic debt into exploitable exposure. Some of the most widely deployed public-key algorithms, including RSA and elliptic-curve cryptography, rely on hardness assumptions (integer factorization and the discrete logarithm problem) that quantum algorithms such as Shor’s can eventually invalidate.
BN: Why is this important even with quantum-ready computers not expected for years?
DK: Fully fault-tolerant quantum machines are still several years away, but the ‘harvest now, decrypt later’ strategy, in which adversaries collect encrypted traffic today expecting to break it once quantum capability matures, is already in use. Sensitive intellectual property, customer records, and strategic communications locked in archival encrypted databases could be vulnerable long before enterprises recognize the scope of their exposure.
BN: How does the emergence of AI increase the risk?
DK: Artificial intelligence compounds the risk to this fragile encryption ecosystem from multiple angles. Machine learning models have demonstrated the ability to identify subtle flaws across a host of cryptographic implementations. AI-assisted side-channel analysis, man-in-the-middle attacks, padding oracles, and poor randomness sources are all risks lurking within legacy code.
In addition, automated tools powered by AI can execute large-scale vulnerability scans across an organization’s entire application portfolio, pinpointing weak cipher suites or misconfigured key exchanges that might go unnoticed by human inspectors conducting comparatively limited-scope vulnerability analyses. Worse than that, generative AI systems can craft novel attack vectors by stitching together fragments of known exploits to defeat obsolete cryptographic protections. As AI-driven red teams increasingly outpace traditional vulnerability assessments, companies that lack a coordinated strategy for managing their cryptographic assets remain blind to the timing and nature of emerging threats.
BN: How can organizations protect themselves from this threat?
DK: To address the convergence of quantum and AI threats, enterprises must centralize encryption services through enterprise key-management platforms or cloud-native cryptography orchestration platforms. This lets organizations deliver standardized algorithms to any endpoint without embedding vendor-specific libraries. This abstraction not only streamlines routine key rotations and certificate renewals but also positions organizations to pivot quickly when quantum-resistant algorithms become commercially viable. When the time comes to transition from RSA-2048 to a quantum-safe scheme, the change happens in a single control plane rather than across thousands of codebases.
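A minimal Python sketch of that single-control-plane idea, with hypothetical provider and policy names and placeholder "ciphers" standing in for real libraries rather than any particular product, looks like this:

```python
# Minimal sketch of a centralized cryptographic control plane.
# Provider names, policy keys, and the toy "ciphers" are illustrative
# placeholders; a real deployment would wrap vetted implementations.

from typing import Callable, Dict

# Registry of available algorithm providers, keyed by name.
PROVIDERS: Dict[str, Callable[[bytes], bytes]] = {
    "rsa-2048-hybrid": lambda data: b"[rsa-2048]" + data,      # placeholder
    "ml-kem-768-hybrid": lambda data: b"[ml-kem-768]" + data,  # placeholder
}

# Central policy: the ONLY place that binds a purpose to an algorithm.
POLICY = {
    "transport-encryption": "rsa-2048-hybrid",
}


def encrypt(purpose: str, plaintext: bytes) -> bytes:
    """Applications call this by purpose; they never name an algorithm."""
    algorithm = POLICY[purpose]
    return PROVIDERS[algorithm](plaintext)


# Migrating to a quantum-safe scheme becomes a one-line policy change,
# not an edit across thousands of codebases:
POLICY["transport-encryption"] = "ml-kem-768-hybrid"

print(encrypt("transport-encryption", b"customer record"))
```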
In addition, organizations must continuously monitor their cryptographic assets for weaknesses. Without a comprehensive inventory of the certificates, private keys, and cipher algorithms in use, executives cannot know which applications rely on obsolete or unsupported encryption. A ‘cryptographic bill of materials’ catalogs every dependency, listing algorithm versions, key lengths, and certificate authorities so that security teams can identify imminent expirations, weak cipher suites, or vendors that no longer maintain their cryptographic modules. Real-time scanning tools, augmented by AI-driven analysis, can flag anomalies such as downgrade attempts or unpermitted workload connections. This visibility enables informed prioritization of cryptographic debt remediation before an exploit can occur.
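In practice, such a bill of materials can be a structured inventory that tooling queries. The sketch below uses illustrative field names, thresholds, and sample data, not a standard schema, to show how weak key lengths, expiring certificates, and unmaintained modules might be flagged:

```python
# Illustrative cryptographic bill of materials (CBOM) entries and a simple
# scan that flags weak or expiring cryptography. Fields, thresholds, and
# sample data are assumptions made for this sketch.

from dataclasses import dataclass
from datetime import date, timedelta


@dataclass
class CbomEntry:
    application: str
    algorithm: str          # e.g. "RSA", "ECDSA", "ML-KEM"
    key_bits: int
    cert_expiry: date
    vendor_supported: bool  # False if the module is no longer maintained


INVENTORY = [
    CbomEntry("billing-api", "RSA", 1024, date(2026, 3, 1), True),
    CbomEntry("hr-portal", "RSA", 2048, date(2025, 12, 15), False),
]

MIN_RSA_BITS = 2048
EXPIRY_WINDOW = timedelta(days=90)


def scan(inventory, today: date):
    """Yield human-readable findings for prioritizing remediation."""
    for e in inventory:
        if e.algorithm == "RSA" and e.key_bits < MIN_RSA_BITS:
            yield f"{e.application}: weak RSA key ({e.key_bits} bits)"
        if e.cert_expiry - today <= EXPIRY_WINDOW:
            yield f"{e.application}: certificate expires {e.cert_expiry}"
        if not e.vendor_supported:
            yield f"{e.application}: unmaintained cryptographic module"


for finding in scan(INVENTORY, today=date(2025, 11, 1)):
    print(finding)
```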
BN: What is 'crypto agility' and how can it help?
DK: Crypto agility is the ability of an organization to efficiently and rapidly change cryptographic algorithms, protocols or primitives in response to emerging threats, vulnerabilities or regulatory requirements. In a crypto-agile environment, systems are designed with modular cryptographic components that can be swapped out or upgraded without major architectural changes. This design allows for the swift implementation of new encryption algorithms or protocols when older ones become insecure or obsolete. Crypto agility is extremely important when the encryption algorithm of a system is discovered to be vulnerable and can help enterprises ensure long-term data protection and maintain compliance with shifting security frameworks.
When it comes to cryptographic debt, organizations must adopt crypto agility to automate remediation when vulnerabilities emerge. In an agile framework, algorithm updates and key rotations become orchestrated events, pushed out to all dependent services with minimal human intervention. When a novel attack on a classical cipher is disclosed, or a prototype quantum decryptor crosses a performance threshold, the central cryptographic service can automatically switch to an alternative algorithm, reissue certificates, and retire compromised key pairs. This level of agility demands that every application trust the centralized service for cryptographic operations rather than hard-coding specific libraries. The transition requires investment in infrastructure and process changes, but the alternative, a manual, application-by-application overhaul that drags on as quantum and AI advances threaten an organization’s protected data, is untenable.
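As a toy illustration of that orchestrated response, and not any vendor's API, the sketch below assumes a hypothetical fallback table and workload registry, and shows how a single disclosure event could retarget every dependent service:

```python
# Toy sketch of crypto-agile remediation: when an algorithm is marked
# compromised, the central service switches every dependent workload to a
# fallback and records the re-issuance work it would trigger. All names
# and the event flow are illustrative assumptions.

FALLBACKS = {
    "rsa-2048": "ml-kem-768-hybrid",   # classical -> quantum-safe fallback
}

WORKLOADS = {
    "payments-gateway": "rsa-2048",
    "partner-vpn": "rsa-2048",
    "archive-service": "ml-kem-768-hybrid",
}


def handle_compromise(algorithm: str) -> list[str]:
    """React to a disclosure by moving dependents onto the fallback."""
    replacement = FALLBACKS[algorithm]
    actions = []
    for workload, current in WORKLOADS.items():
        if current == algorithm:
            WORKLOADS[workload] = replacement
            actions.append(f"{workload}: reissue certs, rotate keys, "
                           f"switch {algorithm} -> {replacement}")
    return actions


# A published break against RSA-2048 becomes one orchestrated event
# instead of thousands of manual, per-application changes.
for action in handle_compromise("rsa-2048"):
    print(action)
```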
Image credit: ArtemisDiana/depositphotos.com