The finalization of the NIST post-quantum cryptography standards (FIPS 203, 204, 205) in August 2024, followed by the selection of HQC as a fifth algorithm in March 2025, marks the beginning of a fundamental transformation in the global cryptographic landscape. This article provides a comprehensive practitioner’s analysis of what this transition means for three critical sectors — banking, energy infrastructure, and government systems — drawing on over 19 years of hands-on experience in security auditing, penetration testing, and cryptographic assessment across these domains. The analysis covers the technical implications of replacing RSA, ECDSA, and Diffie-Hellman with lattice-based and hash-based alternatives, the practical challenges of migrating production systems, the evolving European regulatory framework, and a detailed set of recommendations for organizations at different stages of PQC readiness. The article aims to bridge the gap between academic cryptographic research and the operational reality of securing complex, heterogeneous IT and OT environments.
Asymmetric cryptography — the mathematical foundation of digital trust — is under existential threat. The algorithms that enable secure web browsing, financial transactions, digital signatures, VPN tunnels, and certificate-based authentication all rely on the computational difficulty of two closely related mathematical problems: integer factorization (the basis of RSA) and the discrete logarithm problem (the basis of Diffie-Hellman, DSA, ECDSA, and ECDHE). Peter Shor’s quantum algorithm, published in 1994, demonstrated that both problems can be solved in polynomial time by a sufficiently powerful quantum computer, rendering the entirety of deployed asymmetric cryptography vulnerable.
For over three decades, this threat remained theoretical. Quantum computers capable of breaking current cryptographic keys — known as Cryptographically Relevant Quantum Computers (CRQCs) — did not exist and seemed distant. However, recent advances in quantum hardware by multiple global technology companies and research institutions, combined with growing investment from nation-states, have compressed the estimated timeline for CRQC emergence to approximately 7-15 years, with some researchers suggesting even shorter horizons for specific attack scenarios.
On August 13, 2024, the U.S. National Institute of Standards and Technology (NIST) published three finalized Federal Information Processing Standards (FIPS) for post-quantum cryptography, the culmination of an eight-year global standardization effort that began with a public call for proposals in December 2016. In March 2025, NIST selected HQC as a fifth algorithm, providing backup capability based on a fundamentally different mathematical approach. These publications represent the single most significant change in cryptographic standards since the adoption of RSA and Diffie-Hellman in the 1970s.
This article is written from the perspective of a cybersecurity practitioner who has spent the past two decades evaluating cryptographic implementations in production environments. Over the course of more than 1,000 security evaluation projects — encompassing penetration tests, security audits, vulnerability assessments, and compliance evaluations — across banking institutions, central banks, government agencies, energy utilities, telecommunications providers, and industrial control system operators, I have developed a detailed understanding of how cryptography is actually deployed, configured, and maintained in the real world.
This perspective is distinct from, and complementary to, the theoretical cryptographic analysis that dominates academic literature on PQC. While academic research focuses on the mathematical security properties of new algorithms, this article focuses on the practical questions that security teams, IT architects, and compliance officers face: Which systems are affected? How disruptive is the migration? What are the performance implications? Where do we start? What does the regulatory timeline look like?
The analysis draws on direct experience with internet banking platforms, corporate e-banking desktop clients, mobile banking applications, instant payment systems, card payment infrastructure (including EMV and HSM-based transaction processing), SCADA and industrial control systems in the energy sector, government cloud platforms, digital identity infrastructure, and national-scale information systems. All references to specific systems and organizations have been anonymized to protect client confidentiality.
The article is organized as follows: Section 2 provides a technical overview of the quantum threat to current cryptography. Section 3 details the NIST PQC standards and their characteristics. Section 4 analyzes the impact on banking infrastructure. Section 5 examines the implications for critical infrastructure and industrial control systems. Section 6 addresses the European regulatory landscape. Section 7 discusses crypto-agility and migration strategies. Section 8 provides a comprehensive set of recommendations. Section 9 concludes with a forward-looking assessment.
To appreciate the scale of the PQC migration challenge, it is necessary to understand just how deeply the affected algorithms are embedded in modern digital infrastructure. Through my security assessments, I routinely evaluate the following cryptographic mechanisms, all of which are vulnerable to quantum attacks:
RSA (Rivest-Shamir-Adleman): Used for key exchange (RSA-OAEP), digital signatures (RSA-PSS, PKCS#1 v1.5), and certificate signing in PKI hierarchies. RSA-2048 and RSA-4096 are the dominant key sizes in the banking and government environments I audit. RSA is the most widely deployed asymmetric algorithm globally and appears in TLS certificates, code signing certificates, S/MIME email encryption, XML digital signatures, PDF signing, and countless application-layer protocols.
ECDSA and EdDSA (Elliptic Curve Digital Signature Algorithm): Increasingly used for TLS server authentication, client certificates, and code signing. ECDSA with P-256 and P-384 curves is the standard configuration I encounter in modern banking web applications and API gateways. EdDSA (particularly Ed25519) is gaining adoption in SSH and newer protocols.
ECDHE and X25519 (Elliptic Curve Diffie-Hellman Ephemeral): The dominant key exchange mechanism in modern TLS deployments. In my assessments of banking platforms, I consistently find that ECDHE with P-256 or X25519 is the negotiated key exchange in over 90% of TLS connections. The “ephemeral” property provides forward secrecy — meaning that compromise of the server’s long-term key does not retroactively compromise past sessions — but this protection is irrelevant against a quantum adversary who can directly solve the discrete logarithm problem.
Diffie-Hellman and DSA (classical variants): Still encountered in legacy systems, particularly in government infrastructure and older VPN configurations. During my audits of national-scale government platforms, I regularly find DH-1024 and DH-2048 in IPsec VPN configurations and in legacy TLS setups.
Key Agreement and Key Transport mechanisms in payment systems: EMV card transactions use RSA-based key transport for online PIN encryption and certificate verification. The card payment ecosystem’s global scale and long hardware replacement cycles make it particularly vulnerable to the PQC transition.
The most immediate quantum threat is not the direct breaking of real-time encrypted communications — which requires a CRQC that does not yet exist — but rather the “harvest now, decrypt later” (HNDL) strategy. In this threat model, adversaries intercept and store encrypted traffic today, with the intention of decrypting it once quantum computing capability becomes available.
This threat is particularly relevant for data with long confidentiality requirements. In my work with banking institutions, I encounter data that must remain confidential for 10-30 years (customer financial records, transaction histories). Government classified information often has even longer protection requirements. If such data is transmitted over networks using RSA or ECDHE key exchange, and if that traffic is being captured by an adversary, the data’s confidentiality has an expiration date tied to the emergence of a CRQC.
The HNDL threat means that the effective date of quantum vulnerability is not the date when a CRQC is built, but rather the date when the data ceases to require confidentiality minus the time required to complete the migration. For data requiring 20 years of confidentiality, and assuming a CRQC emerges in 10 years, the migration deadline has effectively already passed.
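This deadline calculation is often expressed as Mosca's inequality: data is at risk whenever its confidentiality lifetime plus the migration time exceeds the time until a CRQC emerges. A minimal sketch, using the illustrative figures from the text (the 5-year migration estimate is an assumption for the example, not a figure from this article):

```python
# Mosca's inequality: data is at risk if the confidentiality lifetime (x)
# plus the migration time (y) exceeds the time until a CRQC emerges (z).
# All figures are illustrative; substitute your own risk-assessment estimates.

def hndl_margin(shelf_life_years, migration_years, crqc_horizon_years):
    """Positive margin: migration can still complete in time.
    Negative margin: captured traffic will outlive its protection."""
    return crqc_horizon_years - (shelf_life_years + migration_years)

# Example from the text: 20-year confidentiality, CRQC in ~10 years,
# assuming an (optimistic) 5-year migration programme.
margin = hndl_margin(shelf_life_years=20, migration_years=5, crqc_horizon_years=10)
print(margin)  # -15: the deadline has effectively already passed
```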
It is important to note that symmetric cryptographic algorithms (AES, ChaCha20) and hash functions (SHA-256, SHA-3) are significantly less affected by quantum computing. Grover’s algorithm provides a quadratic speedup for brute-force search against symmetric keys, effectively halving the security level (AES-128 provides roughly 64 bits of security against a quantum adversary). However, this is easily addressed by using larger key sizes: AES-256 provides 128 bits of post-quantum security, which is considered adequate.
This distinction matters for system architects planning the PQC transition: the focus should be on asymmetric algorithms used for key exchange, digital signatures, and public-key encryption. Symmetric encryption at rest (such as AES-256 disk encryption or database column encryption) and hash-based password storage (bcrypt, Argon2) do not require fundamental algorithm changes, though review of key sizes is advisable.
In the context of PCI DSS audits that I perform, this means that Requirement 3 (protection of stored data) using AES-256 encryption is largely quantum-resistant, while Requirement 4 (encryption of data in transit) using TLS with RSA or ECDHE key exchange is vulnerable and must be migrated.
NIST’s Post-Quantum Cryptography Standardization Project, initiated in December 2016, was the most significant open cryptographic competition since the selection of AES in 2001. The process received 82 initial submissions, which were evaluated over four rounds spanning eight years. The transparency and rigor of this process — involving public review, extensive cryptanalysis, and performance benchmarking — provides a high degree of confidence in the selected algorithms.
The three finalized standards, published on August 13, 2024, represent the culmination of this effort. A fourth standard based on FALCON is under development (expected as FIPS 206), and HQC was selected in March 2025 as a backup key encapsulation mechanism.
FIPS 203 specifies ML-KEM (formerly known as CRYSTALS-Kyber), the primary standard for general encryption and key establishment. ML-KEM is a key encapsulation mechanism (KEM) — a specific type of key establishment scheme that enables two parties to securely establish a shared secret key over a public channel.
ML-KEM’s security is based on the Module Learning With Errors (MLWE) problem, a lattice-based mathematical problem that is believed to be resistant to both classical and quantum attacks. The standard defines three parameter sets:
| Parameter Set | Security Level | Public Key Size | Ciphertext Size | Shared Secret |
|---|---|---|---|---|
| ML-KEM-512 | NIST Level 1 (~AES-128) | 800 bytes | 768 bytes | 32 bytes |
| ML-KEM-768 | NIST Level 3 (~AES-192) | 1,184 bytes | 1,088 bytes | 32 bytes |
| ML-KEM-1024 | NIST Level 5 (~AES-256) | 1,568 bytes | 1,568 bytes | 32 bytes |
For comparison, an X25519 key exchange uses 32-byte public keys and 32-byte shared secrets. The increase in data sizes — roughly 37x larger public keys for ML-KEM-768 compared to X25519 — has direct implications for TLS handshake performance, certificate sizes, and network bandwidth.
In my penetration testing work, I evaluate TLS handshake parameters in detail. The transition from X25519 to ML-KEM-768 increases the key exchange portion of the TLS handshake from approximately 64 bytes to approximately 2,272 bytes. While this is manageable for modern broadband connections, it can be significant for constrained environments, IoT devices, and high-frequency transaction systems.
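The handshake arithmetic follows directly from the FIPS 203 parameter table above — one public key from the client, one ciphertext from the server — and shows where the hybrid overhead comes from:

```python
# Key-exchange bytes on the wire: client key share (public key) plus
# server key share (ciphertext), per the FIPS 203 table above.
X25519 = {"client": 32, "server": 32}
ML_KEM_768 = {"client": 1184, "server": 1088}

classical = X25519["client"] + X25519["server"]           # 64 bytes
pq_only = ML_KEM_768["client"] + ML_KEM_768["server"]     # 2,272 bytes
hybrid = classical + pq_only                              # 2,336 bytes (X25519MLKEM768)

print(classical, pq_only, hybrid)
```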
FIPS 204 specifies ML-DSA (formerly CRYSTALS-Dilithium), the primary standard for digital signatures. Digital signatures provide authentication, integrity, and non-repudiation — properties that are essential for certificate-based PKI, financial transaction authorization, document signing, and software verification.
ML-DSA is also lattice-based, relying on the MLWE and Module Short Integer Solution (MSIS) problems. It defines three parameter sets:
| Parameter Set | Security Level | Public Key Size | Signature Size | Signing Speed | Verification Speed |
|---|---|---|---|---|---|
| ML-DSA-44 | NIST Level 2 | 1,312 bytes | 2,420 bytes | Fast | Fast |
| ML-DSA-65 | NIST Level 3 | 1,952 bytes | 3,309 bytes | Fast | Fast |
| ML-DSA-87 | NIST Level 5 | 2,592 bytes | 4,627 bytes | Fast | Fast |
For comparison, an ECDSA P-256 signature is 64 bytes with a 32-byte public key. The ML-DSA-65 signature is approximately 52x larger, and the public key is approximately 61x larger. These increases have significant implications for certificate chains (where multiple signatures are verified in sequence), for systems that transmit signed data over bandwidth-constrained channels, and for storage systems that archive signed documents.
In the banking environments I audit, digital signatures are used for transaction authorization in internet banking platforms, for corporate e-banking batch payment signing, for inter-bank settlement messages, and for regulatory reporting. The increased signature sizes will affect message formats, database schemas, and archive storage capacities.
FIPS 205 specifies SLH-DSA (formerly SPHINCS+), a hash-based digital signature scheme. Unlike the lattice-based algorithms, SLH-DSA’s security relies solely on the security of the underlying hash function — a property known as “minimal security assumptions.” This makes SLH-DSA extremely conservative and well-suited as a fallback if vulnerabilities are discovered in lattice-based schemes.
However, this conservatism comes at a significant performance cost. SLH-DSA signatures are much larger than ML-DSA signatures — ranging from approximately 7,856 bytes to 49,856 bytes depending on the parameter set — and signing operations are substantially slower. SLH-DSA is therefore recommended primarily as a backup or for applications where security confidence is paramount and performance is secondary.
In March 2025, NIST selected HQC (Hamming Quasi-Cyclic) as a fifth PQC algorithm, intended as a backup for ML-KEM. HQC is based on error-correcting codes rather than lattice problems, providing algorithm diversity. If a mathematical breakthrough were to undermine the lattice-based security assumptions of ML-KEM and ML-DSA, HQC (and SLH-DSA for signatures) would provide alternative security foundations. NIST expects to publish a draft standard for HQC by 2026, with finalization by 2027.
FALCON is an additional lattice-based digital signature scheme that produces smaller signatures than ML-DSA (approximately 666 bytes for the security level roughly equivalent to ML-DSA-44). Its primary advantage is compact signature size, which is particularly valuable for certificate chains and bandwidth-constrained environments. However, FALCON’s implementation is more complex and requires careful handling of floating-point arithmetic during signing, which presents challenges for embedded systems and hardware implementations.
TLS is the cryptographic backbone of modern banking. Every internet banking session, every mobile banking API call, every inter-bank communication channel, and every payment gateway connection relies on TLS for confidentiality and integrity. In my security assessments, I evaluate TLS configurations across dozens of endpoints per engagement — web applications, API gateways, middleware services, database connections, and third-party integrations.
The PQC transition affects TLS in two fundamental ways: the key exchange mechanism (currently ECDHE/X25519, migrating to ML-KEM) and the authentication mechanism (currently RSA/ECDSA certificates, migrating to ML-DSA/FALCON certificates).
Key Exchange Migration. The most immediate action is migrating TLS key exchange to hybrid mode — combining a classical algorithm (X25519 or P-256) with a post-quantum algorithm (ML-KEM-768) in a single handshake. This approach, documented in IETF drafts and already implemented in major TLS libraries (OpenSSL 3.5+, BoringSSL, wolfSSL), provides protection against HNDL attacks while maintaining backward compatibility.
During my penetration testing engagements, I regularly assess the TLS cipher suite configuration of banking platforms. The recommended migration path is to enable hybrid key exchange groups (such as X25519Kyber768Draft00 or the finalized X25519MLKEM768 group) as the preferred key exchange mechanism while maintaining classical-only groups as fallback for incompatible clients.
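On the server side, enabling hybrid key exchange typically reduces to a one-line group preference list. A minimal sketch for nginx, assuming a build linked against OpenSSL 3.5+ (which provides the ML-KEM groups); exact group-name support depends on the TLS library version:

```nginx
# Prefer the hybrid group; keep classical groups as fallback for older clients.
# The list is passed through to the underlying OpenSSL group configuration.
ssl_protocols TLSv1.3 TLSv1.2;
ssl_ecdh_curve X25519MLKEM768:X25519:prime256v1;
```

The ordering matters: placing the hybrid group first means PQC-capable clients negotiate it by default, while legacy clients silently fall back.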
The practical challenges I have observed include: middlebox interference (network inspection devices that fail to process larger handshake messages), load balancer configuration limits on handshake buffer sizes, web application firewall rules that flag oversized ClientHello messages, and performance monitoring systems that generate false alerts due to increased handshake latency.
Certificate Chain Migration. Migrating TLS server certificates to post-quantum signature algorithms is a more complex undertaking. A typical TLS connection verifies a chain of 2-3 certificates (end-entity, intermediate CA, root CA). If each certificate uses ML-DSA-65 signatures and public keys, the total certificate chain size increases from approximately 3-4 KB (with ECDSA P-256) to approximately 20-25 KB. For connections that also require client certificate authentication (mutual TLS, which I encounter in inter-bank communications and payment processing), the overhead doubles.
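The chain-size estimate can be reproduced from the FIPS 204 table: each certificate carries the subject's public key plus the issuer's signature over the certificate.

```python
# Per-certificate PQ material for ML-DSA-65 (FIPS 204 table above):
# subject public key + issuer signature. DER framing, extensions and any
# embedded SCTs add further overhead on top of this raw figure.
pubkey, sig = 1952, 3309
chain_len = 3                             # end-entity + intermediate + root
pq_material = chain_len * (pubkey + sig)  # 15,783 bytes of raw key/signature data
print(pq_material)
```

Adding encoding overhead and certificate metadata brings the total into the ~20-25 KB range cited above.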
The recommended approach is to first migrate to hybrid certificates — certificates that contain both a classical and a post-quantum signature — followed by a full transition to PQC-only certificates once the ecosystem is ready. The WebPKI (Certificate Authority/Browser Forum) timeline for PQC certificate adoption is still under discussion, but browser vendors are actively implementing and testing PQC certificate verification.
Banking PKI infrastructure is one of the most complex and security-critical components I evaluate during security audits. A typical banking PKI deployment includes root certificate authorities (often stored in offline HSMs), intermediate CAs for different purposes (TLS, code signing, document signing, client authentication), certificate lifecycle management systems, OCSP responders and CRL distribution points, and integration with hardware security modules for all signing operations.
The PQC migration of PKI infrastructure is particularly challenging because of its hierarchical nature. Root CA certificates typically have 20-25 year validity periods and are embedded in trust stores across every device that communicates with the bank — including customer browsers, mobile devices, ATMs, point-of-sale terminals, and partner systems. Migrating a root CA to post-quantum algorithms requires either a “flag day” transition (where all relying parties switch simultaneously) or a parallel hierarchy approach (where a new PQC root CA is established alongside the existing classical root, with cross-signing during the transition period).
In my PKI audits, I evaluate certificate lifecycle processes, revocation mechanisms (CRL and OCSP), and key management practices. The PQC transition adds significant complexity to each of these areas. OCSP responses, for example, are signed by the CA — if ML-DSA-65 is used, each OCSP response increases from approximately 500 bytes to approximately 5 KB. For banks that process millions of certificate validations per day, this has meaningful infrastructure scaling implications.
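The infrastructure impact of larger OCSP responses is easy to quantify. A back-of-envelope sketch using the figures above (the validation volume is an illustrative assumption):

```python
# Daily OCSP responder egress, classical vs ML-DSA-65-signed responses.
validations_per_day = 5_000_000                  # illustrative volume
classical_resp, mldsa_resp = 500, 5_000          # bytes per response, per the text

def daily_gb(resp_bytes):
    return validations_per_day * resp_bytes / 1e9

print(round(daily_gb(classical_resp), 1), round(daily_gb(mldsa_resp), 1))
# 2.5 GB/day today vs 25.0 GB/day after migration — a 10x scaling factor
```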
HSMs are the cryptographic anchors of banking infrastructure. In my assessments, I evaluate HSM deployments in contexts including root CA key protection, transaction signing (particularly in instant payment systems and SWIFT messaging), PIN processing in card payment infrastructure, key management for data encryption, and code signing for software distribution.
The PQC readiness of HSMs varies significantly by vendor and model. Leading HSM vendors (including Thales, Utimaco, Entrust, and Futurex) have announced PQC support through firmware updates for recent hardware generations. However, older HSM models — which are common in banking environments with 10-15 year hardware refresh cycles — may require physical replacement.
A critical consideration that I emphasize in my audit reports is the need for HSM firmware updates to support not just PQC algorithm execution, but also PQC key generation and key import/export in standardized formats (such as PKCS#11 and KMIP). During my assessments of EMV card payment processing infrastructure, I have observed that HSM configurations are deeply integrated with payment network specifications — meaning that PQC migration must be coordinated not just internally, but across the entire payment card ecosystem.
Financial transaction authorization relies heavily on digital signatures. In my security assessments of internet banking platforms, I evaluate how transactions are signed, how signatures are verified, and how signing keys are protected. Common patterns include server-side RSA or ECDSA signing of transaction records for audit trails and non-repudiation, client-side signing using smart cards or software tokens for corporate banking batch payments, inter-bank message signing (particularly for SWIFT and SEPA messaging), and regulatory reporting with qualified electronic signatures.
The migration of transaction signing to ML-DSA affects not just the signing operation itself, but the entire data flow: message formats must accommodate larger signatures, databases must be resized for larger signed records, archive systems must accommodate increased storage requirements, and downstream systems that verify signatures must be updated to support the new algorithms.
For instant payment systems — which I have evaluated in central banking environments and which process transactions with strict latency requirements (typically under 10 seconds end-to-end) — the performance impact of larger signatures must be carefully benchmarked. ML-DSA signing and verification operations are reasonably fast on modern hardware, but the increased data sizes affect network transmission time, database write performance, and message queue throughput.
Mobile banking applications present unique PQC challenges. During my mobile application security assessments (conducted using OWASP MSTG methodology with tools including Frida, objection, and Burp Suite), I evaluate certificate pinning implementations, TLS configuration, key storage mechanisms, and cryptographic API usage.
Certificate pinning — where the mobile application verifies that the server’s certificate matches a known value — must be updated to accommodate post-quantum certificates. Applications that pin specific public keys or certificate hashes will fail when the server migrates to PQC certificates unless the pinning configuration is updated in advance. This requires coordinated application updates and server certificate rotation.
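The pinning problem can be made concrete with an SPKI-hash pin of the style used by HPKP and common mobile pinning libraries. A minimal sketch (the DER byte strings are placeholders, not real keys):

```python
import base64
import hashlib

# SPKI pin: base64(SHA-256(SubjectPublicKeyInfo DER)). The pin covers the
# *key*, so a server moving from ECDSA to ML-DSA gets a brand-new SPKI and
# therefore a new pin — both pins must ship in the app before the server
# rotates its certificate.

def spki_pin(spki_der: bytes) -> str:
    return "sha256/" + base64.b64encode(hashlib.sha256(spki_der).digest()).decode()

# Placeholder bytes standing in for the DER-encoded classical and PQ keys.
old_pin = spki_pin(b"ecdsa-p256-spki-der-placeholder")
new_pin = spki_pin(b"ml-dsa-65-spki-der-placeholder")
pinned_set = {old_pin, new_pin}  # ship both during the transition window
```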
Mobile platforms (iOS and Android) are adding PQC support to their TLS stacks. Apple’s iMessage already uses PQC hybrid encryption (the PQ3 protocol), and Android’s Conscrypt TLS library is adding ML-KEM support. However, the diverse Android ecosystem means that older devices with outdated TLS libraries may not support PQC for years after standardization.
PCI DSS v4.0, which I apply in my compliance audits of banking card processing environments, requires the use of “strong cryptography” for protecting cardholder data in storage (Requirement 3) and in transit (Requirement 4). The standard does not currently mandate specific post-quantum algorithms, but its requirement for “strong cryptography” is defined by reference to industry standards — which increasingly means NIST FIPS publications.
As NIST progresses toward retiring classical algorithms (the draft timeline in NIST IR 8547 deprecates 112-bit-security algorithms such as RSA-2048 and ECDSA P-256 after 2030 and disallows them after 2035), the definition of “strong cryptography” under PCI DSS will evolve. Organizations that proactively migrate to PQC will avoid a rushed compliance transition; those that wait may face the challenge of migrating under time pressure while maintaining continuous compliance.
Requirement 12.3.3, introduced in PCI DSS v4.0, requires organizations to maintain a cryptographic cipher suite and protocol inventory. This requirement — which I verify during every PCI DSS audit — directly supports PQC migration planning by ensuring that organizations know exactly which algorithms are in use across their cardholder data environment.
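Such an inventory can feed a simple migration-status classifier. A minimal sketch, assuming the inventory records the negotiated TLS key-exchange group per endpoint (hostnames are hypothetical; group names follow common TLS-registry/OpenSSL usage):

```python
# Classify negotiated TLS key-exchange groups for PQC migration tracking.
QUANTUM_VULNERABLE = {"x25519", "secp256r1", "secp384r1", "ffdhe2048", "rsa"}
PQ_HYBRID = {"x25519mlkem768", "secp256r1mlkem768", "x25519kyber768draft00"}

def classify(group):
    g = group.lower()
    if g in PQ_HYBRID:
        return "hybrid-pqc"
    if g in QUANTUM_VULNERABLE:
        return "quantum-vulnerable"
    return "unknown"

inventory = {host: classify(g) for host, g in [
    ("api.example-bank.test", "X25519MLKEM768"),
    ("legacy.example-bank.test", "secp256r1"),
]}
```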
My work auditing critical infrastructure — particularly energy sector entities including national grid operators, power generation facilities, district heating companies, and electricity distribution networks — has exposed the fundamentally different character of the PQC challenge in operational technology (OT) environments compared to enterprise IT.
IT environments typically feature standardized protocols (TLS, SSH, IPsec), regular software update cycles, and relatively homogeneous hardware. OT environments, by contrast, feature proprietary industrial protocols (Modbus/TCP, DNP3, IEC 61850, IEC 60870-5-104), hardware with 15-25 year replacement cycles, strict real-time performance requirements, and a cultural and regulatory emphasis on availability over confidentiality.
In my SCADA audit assessments, I have observed that many industrial protocols were designed without cryptographic protection and rely on network segmentation for security. Where cryptography does exist in industrial environments — primarily through IEC 62351 for power system communication security, OPC UA security profiles, and VPN tunnels encapsulating SCADA traffic — the implementations often use RSA or ECDSA for authentication and key exchange.
IEC 62351 is the primary international standard for securing power system communications. Its various parts specify security mechanisms for different communication protocols: IEC 62351-3 covers TLS for TCP/IP-based protocols, IEC 62351-4 covers MMS (Manufacturing Message Specification), IEC 62351-5 covers IEC 60870-5 and DNP3, and IEC 62351-6 covers IEC 61850. The standard currently specifies RSA and ECDSA for authentication. Migrating to PQC algorithms will require updates to the IEC 62351 standard itself, followed by implementation in industrial equipment firmware.
OPC UA (Unified Architecture), widely used for industrial data exchange, supports multiple security profiles that include RSA for key exchange and digital signatures. The OPC Foundation has begun discussing PQC readiness, but standardized PQC security profiles are not yet available.
VPN tunnels (IPsec/IKEv2) are commonly used to protect SCADA traffic traversing public or semi-public networks. In my assessments, I evaluate VPN configurations including cipher suites, key exchange mechanisms, and authentication methods. IPsec/IKEv2 uses DH or ECDH for key exchange and RSA or ECDSA for authentication — both vulnerable to quantum attacks. The IETF has published RFC 9370 (Multiple Key Exchanges in IKEv2) and RFC 9242 (Intermediate Exchange in IKEv2), which provide mechanisms for incorporating PQC key exchange into IPsec VPN connections.
Industrial environments frequently include constrained devices — PLCs, RTUs, IEDs (Intelligent Electronic Devices), and sensors — with limited processing power, memory, and communication bandwidth. These devices often run on microcontrollers with clock speeds under 100 MHz and RAM measured in kilobytes.
The increased key and signature sizes of post-quantum algorithms pose particular challenges for these devices. An ML-KEM-768 key exchange adds approximately 2.2 KB to a handshake that may be constrained to a few hundred bytes by the communication protocol. ML-DSA-65 signatures at 3.3 KB may exceed the maximum authenticated data unit size of some industrial protocols.
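The fragmentation burden on such links follows directly from the parameter tables. A sketch assuming an illustrative 256-byte application-layer payload cap (actual limits vary by protocol):

```python
import math

# Frames needed to carry PQ key-exchange and signature material over a
# link with a small application-layer PDU. The 256-byte cap is illustrative.
mlkem768_handshake = 1184 + 1088   # public key + ciphertext, bytes
mldsa65_signature = 3309           # bytes
pdu_payload = 256

frames_kex = math.ceil(mlkem768_handshake / pdu_payload)   # 9 frames
frames_sig = math.ceil(mldsa65_signature / pdu_payload)    # 13 frames
```

A single key establishment thus consumes nine frames where a classical X25519 exchange fits in one — a meaningful cost on polled serial-derived links.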
For these environments, the most practical near-term approach is to implement PQC at the network perimeter (VPN gateways, protocol converters, security appliances) while maintaining classical cryptography on the constrained device-to-gateway links within the secured network zone. This is consistent with the defense-in-depth architecture I recommend in my audit reports, where SCADA traffic is protected by multiple layers of security.
In my recent work developing recommendations for Security Operations Center (SOC) integration in energy sector entities, I have identified the need for SOC tooling to monitor and report on cryptographic algorithm usage across the monitored infrastructure. This capability becomes critical during the PQC transition, as organizations need visibility into which systems are still using quantum-vulnerable algorithms.
SIEM correlation rules can be developed to flag TLS connections using classical-only key exchange (indicating systems that have not been migrated), to detect mixed-mode environments where some endpoints support PQC and others do not, and to identify unauthorized downgrades from hybrid to classical-only cipher suites. Network monitoring tools that perform deep packet inspection of TLS handshakes can be configured to report on key exchange algorithm usage across the enterprise.
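The downgrade-detection rule can be sketched as stateful correlation logic. A minimal example, assuming normalized event fields for source, destination, and negotiated group (field names and group names are illustrative):

```python
# Alert when an endpoint pair that previously negotiated a hybrid group
# falls back to a classical-only group — a possible downgrade attack or
# misconfiguration. State is kept in memory for the sketch; a real SIEM
# would use its own correlation store.
HYBRID_GROUPS = {"x25519mlkem768", "secp256r1mlkem768"}

seen_hybrid = set()  # (src, dst) pairs observed negotiating a hybrid group

def check_event(src, dst, group):
    pair, g = (src, dst), group.lower()
    if g in HYBRID_GROUPS:
        seen_hybrid.add(pair)
        return None
    if pair in seen_hybrid:
        return f"ALERT: {src}->{dst} downgraded from hybrid to {group}"
    return None

check_event("10.0.0.5", "app.bank.test", "X25519MLKEM768")   # baseline: hybrid
alert = check_event("10.0.0.5", "app.bank.test", "x25519")   # downgrade -> alert
```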
The energy sector faces specific regulatory requirements that intersect with the PQC transition. The EU NIS2 Directive classifies energy entities (electricity, oil, gas, hydrogen, district heating) as “essential entities” subject to heightened cybersecurity requirements. Article 21 of NIS2 requires these entities to implement security measures that are “state of the art,” which will increasingly encompass PQC readiness.
Additionally, network codes under the EU Electricity Regulation are developing cybersecurity requirements for cross-border electricity exchanges. The European Network of Transmission System Operators for Electricity (ENTSO-E) has published guidelines on cybersecurity for power system operations that reference the need for forward-looking cryptographic practices.
The NIS2 Directive (EU 2022/2555), which entered into force in January 2023 as the replacement for the original NIS Directive and whose transposition deadline for member states was October 2024, significantly expands the scope and depth of cybersecurity requirements for essential and important entities across the EU. In my work as a certified cybersecurity auditor under the NIS framework, I assess compliance with these requirements across multiple sectors.
Article 21 of NIS2 requires entities to implement “appropriate and proportionate technical, operational and organisational measures to manage the risks posed to the security of network and information systems.” These measures must be “state of the art” and must take into account “the latest developments” in cybersecurity. As PQC standards mature and migration timelines crystallize, the “state of the art” standard will increasingly encompass post-quantum readiness.
While NIS2 does not yet explicitly mandate PQC, its risk-based approach means that organizations handling data with long confidentiality requirements — or operating systems with long migration timelines — should already be factoring quantum risks into their risk assessments. Regulatory guidance from ENISA on PQC adoption is expected to provide more specific direction in the coming years.
The Digital Operational Resilience Act (Regulation EU 2022/2554), which became applicable on January 17, 2025, establishes a comprehensive framework for ICT risk management in the EU financial sector. DORA applies to credit institutions, payment institutions, investment firms, insurance companies, and their critical ICT service providers.
DORA’s Article 9 requires financial entities to implement policies and procedures for cryptographic controls, including “the encryption of data at rest and in transit” and “secure management of cryptographic keys.” While the regulation does not specify particular algorithms, its requirement for “state of the art” security controls creates a clear trajectory toward PQC adoption.
Of particular relevance is DORA’s requirement for ICT risk management frameworks that identify and assess ICT-related risks, including emerging risks. Quantum computing’s threat to current cryptography is precisely such an emerging risk. Financial entities subject to DORA should already be documenting quantum risk in their ICT risk registers and developing mitigation plans.
The revised eIDAS Regulation (eIDAS 2.0) is updating the framework for electronic identification and trust services across the EU. Qualified electronic signatures (QES), which have the legal equivalence of handwritten signatures, rely on asymmetric cryptography — typically RSA or ECDSA — and are issued by qualified trust service providers (QTSPs) using HSMs.
The PQC transition for QES is particularly significant because of the legal implications. Signatures made with quantum-vulnerable algorithms could theoretically be forged by a quantum adversary, undermining non-repudiation. The European Telecommunications Standards Institute (ETSI) has published quantum-safe cryptography specifications (including ETSI TS 103 744 on hybrid key exchanges) and is developing guidance for trust services, while the CEN-CENELEC standardization process is developing European standards for PQC deployment.
For QTSPs and organizations that rely on qualified signatures for legal and regulatory purposes, the PQC transition must be planned with particular care to maintain the legal standing of signed documents throughout and after the migration.
The European Cybersecurity Competence Centre (ECCC), established under Regulation (EU) 2021/887, coordinates and funds cybersecurity research and innovation across the EU. The ECCC has identified post-quantum cryptography as a strategic priority area and has launched dedicated funding programmes for PQC infrastructure development, testing, and migration support.
The ECCC’s focus on PQC reflects the EU’s recognition that cryptographic sovereignty — the ability to independently develop, validate, and deploy cryptographic technologies — is a strategic imperative. European research institutions and companies are actively contributing to PQC standardization, developing PQC-capable products, and building expertise in quantum-resistant system design.
This is an area where the European cybersecurity industry — including specialized firms with expertise in cryptographic assessment and infrastructure security — has a critical role to play in bridging the gap between standardized algorithms and operational deployment in banking, energy, government, and other essential sectors.
Crypto-agility — the ability to rapidly switch cryptographic algorithms and parameters without redesigning systems — is the single most important architectural property for managing the PQC transition. In my security assessments, I evaluate crypto-agility as part of the overall architectural review, and I consistently find that it is one of the weakest areas in most organizations.
Common crypto-agility failures that I encounter include: hard-coded algorithm identifiers in application source code, fixed cipher suite configurations in load balancers and reverse proxies, certificate pinning implementations that pin specific algorithms (rather than abstract trust anchors), HSM configurations that are tightly coupled to specific algorithm types, database schemas that allocate fixed-size columns for cryptographic values (keys, signatures, certificates), and message format specifications that do not accommodate variable-size cryptographic fields.
Each of these failures increases the cost and complexity of PQC migration. Organizations that address crypto-agility proactively — by abstracting cryptographic operations behind configurable interfaces, using algorithm-agnostic data structures, and testing alternative algorithms in staging environments — will be able to migrate more quickly and with less disruption.
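As a concrete illustration of that abstraction, the sketch below shows one way to route all signing through a registry keyed by a configurable algorithm identifier, so that swapping ECDSA for ML-DSA becomes a configuration change rather than a code change. This is a hypothetical design, not a pattern taken from any specific product; the two registered "algorithms" are stdlib HMAC stand-ins so the example stays self-contained.

```python
# Sketch of an algorithm-agnostic signing interface (hypothetical design).
# Real deployments would register actual ECDSA and ML-DSA providers; HMAC
# stand-ins are used here only to keep the example stdlib-only.
import hashlib
import hmac
from typing import Callable, Dict, Tuple

# Registry maps a configurable algorithm identifier to (sign, verify) pairs,
# so switching algorithms is a configuration change, not a code change.
_REGISTRY: Dict[str, Tuple[Callable[[bytes, bytes], bytes],
                           Callable[[bytes, bytes, bytes], bool]]] = {}

def register(alg_id: str, sign_fn, verify_fn) -> None:
    _REGISTRY[alg_id] = (sign_fn, verify_fn)

def sign(alg_id: str, key: bytes, message: bytes) -> bytes:
    return _REGISTRY[alg_id][0](key, message)

def verify(alg_id: str, key: bytes, message: bytes, sig: bytes) -> bool:
    return _REGISTRY[alg_id][1](key, message, sig)

# Stand-in "algorithms" acting as placeholders for a classical provider
# (e.g. ECDSA) and a post-quantum provider (e.g. ML-DSA).
register("classical-demo",
         lambda k, m: hmac.new(k, m, hashlib.sha256).digest(),
         lambda k, m, s: hmac.compare_digest(
             hmac.new(k, m, hashlib.sha256).digest(), s))
register("pqc-demo",
         lambda k, m: hmac.new(k, m, hashlib.sha3_256).digest(),
         lambda k, m, s: hmac.compare_digest(
             hmac.new(k, m, hashlib.sha3_256).digest(), s))

# The application reads the identifier from configuration, never hard-codes it.
ALG = "pqc-demo"  # e.g. loaded from a config file
tag = sign(ALG, b"secret-key", b"payment-message")
assert verify(ALG, b"secret-key", b"payment-message", tag)
```

The same registry idea applies to storage: signature columns and message fields must be variable-size, since a future algorithm's output length is unknown at design time.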
The recommended transition strategy for most systems is a hybrid approach — combining a classical algorithm with a post-quantum algorithm so that the system remains secure even if one algorithm is compromised. This approach is endorsed by NIST, BSI (Germany), ANSSI (France), and NCSC (United Kingdom).
For TLS, hybrid key exchange (e.g., X25519 + ML-KEM-768) is already supported in major TLS libraries and browsers. For digital signatures, hybrid approaches are more complex: hybrid signature schemes do not yet have a clear path through FIPS 140-3 validation, but some regional authorities (particularly BSI and ANSSI) recommend hybrid signatures as an interim measure.
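The core of any hybrid key exchange is the combiner: both shared secrets are fed into a key derivation function so the resulting key stays secure as long as either input remains secret. The sketch below is illustrative only (real TLS feeds the concatenated secrets into its own key schedule); the HKDF is RFC 5869 built from the stdlib, and the input secrets are placeholders standing in for X25519 and ML-KEM-768 outputs.

```python
# Minimal sketch of a hybrid shared-secret combiner (illustrative only).
import hashlib
import hmac
import os

def hkdf_extract_expand(salt: bytes, ikm: bytes, info: bytes,
                        length: int = 32) -> bytes:
    """RFC 5869 HKDF with SHA-256, built from the stdlib."""
    prk = hmac.new(salt, ikm, hashlib.sha256).digest()  # extract step
    okm, block, counter = b"", b"", 1
    while len(okm) < length:                            # expand step
        block = hmac.new(prk, block + info + bytes([counter]),
                         hashlib.sha256).digest()
        okm += block
        counter += 1
    return okm[:length]

def combine_hybrid_secret(classical_ss: bytes, pq_ss: bytes) -> bytes:
    # Concatenate both shared secrets; the derived key is secure as long
    # as at least one of the two inputs remains secret.
    return hkdf_extract_expand(salt=b"hybrid-demo",
                               ikm=classical_ss + pq_ss,
                               info=b"tls-like key derivation")

# Placeholders for the real shared secrets (an actual X25519 shared secret
# and an actual ML-KEM-768 shared secret are each 32 bytes).
x25519_ss = os.urandom(32)
mlkem_ss = os.urandom(32)
session_key = combine_hybrid_secret(x25519_ss, mlkem_ss)
assert len(session_key) == 32
```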
For IPsec VPN — which I regularly evaluate in both banking and critical infrastructure environments — RFC 9370 provides a mechanism for performing multiple key exchanges within a single IKEv2 negotiation, enabling hybrid post-quantum key exchange without breaking compatibility with existing VPN infrastructure.
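To make the RFC 9370 mechanism concrete, a hybrid proposal in a swanctl.conf on strongSwan 6.x might resemble the fragment below. Treat this as a sketch, not a reference configuration: the exact keywords (`ke1_mlkem768` in particular) depend on the strongSwan version and the crypto plugins compiled in.

```
connections {
    site-to-site {
        # Classical ECDH (x25519) as the primary IKEv2 key exchange, with
        # ML-KEM-768 as an additional key exchange per RFC 9370 (ke1_...).
        proposals = aes256gcm16-prfsha384-x25519-ke1_mlkem768
        children {
            net-net {
                esp_proposals = aes256gcm16
            }
        }
    }
}
```

Because RFC 9370 negotiates the additional key exchanges, peers that do not support ML-KEM can still fall back to the classical exchange during the transition period.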
Based on my experience with large-scale infrastructure changes in banking and government environments, I recommend a four-phase migration framework:
Phase 1: Discovery and Assessment (6-12 months). Conduct a comprehensive cryptographic asset inventory. Identify all systems using asymmetric cryptography, map their dependencies, and assess their crypto-agility. Prioritize systems based on data sensitivity, exposure to HNDL threats, and migration complexity.
Phase 2: Hybrid Deployment (12-24 months). Enable hybrid key exchange on external-facing TLS endpoints. Upgrade VPN gateways to support hybrid IKEv2 key exchange. Deploy PQC-capable HSM firmware. Establish a parallel PQC PKI hierarchy. Update mobile applications to support hybrid TLS and PQC certificate validation.
Phase 3: Full PQC Migration (24-48 months). Migrate internal TLS endpoints to PQC-only key exchange. Transition digital signature systems to ML-DSA. Reissue certificates under PQC-only CA hierarchy. Update payment system message formats to accommodate larger signatures. Decommission classical-only cipher suites.
Phase 4: Validation and Optimization (ongoing). Conduct security assessments to verify PQC deployment correctness. Monitor performance impact and optimize configurations. Update compliance documentation. Maintain monitoring for algorithm-specific vulnerabilities.
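The Phase 1 prioritization step can be expressed as a simple scoring model. The fields and weights below are illustrative assumptions for demonstration, not drawn from any standard: HNDL exposure dominates when recorded traffic outlives the assumed CRQC horizon, and low crypto-agility raises the score because the migration itself will take longer.

```python
# Illustrative prioritization model for a Phase 1 cryptographic inventory.
# Fields and weights are assumptions for demonstration, not a standard.
from dataclasses import dataclass

@dataclass
class CryptoAsset:
    name: str
    data_sensitivity: int       # 1 (low) .. 5 (high)
    confidentiality_years: int  # how long the data must stay confidential
    internet_exposed: bool      # HNDL risk: traffic can be recorded today
    crypto_agility: int         # 1 (hard-coded) .. 5 (fully configurable)

def migration_priority(a: CryptoAsset, crqc_horizon_years: int = 10) -> float:
    # HNDL exposure dominates when recorded data outlives the CRQC horizon.
    hndl = 3.0 if (a.internet_exposed
                   and a.confidentiality_years >= crqc_horizon_years) else 0.0
    # Low crypto-agility means a long migration, so start earlier.
    inertia = (6 - a.crypto_agility) * 0.5
    return a.data_sensitivity + hndl + inertia

assets = [
    CryptoAsset("public website TLS", 2, 1, True, 4),
    CryptoAsset("interbank VPN", 5, 25, True, 2),
    CryptoAsset("internal HR portal", 3, 7, False, 3),
]
for asset in sorted(assets, key=migration_priority, reverse=True):
    print(f"{asset.name}: {migration_priority(asset):.1f}")
```

In this toy inventory the interbank VPN ranks first: it combines high sensitivity, multi-decade confidentiality requirements, and poor crypto-agility.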
The post-quantum cryptographic transition is the most significant change in the foundations of digital security since the introduction of public-key cryptography in the 1970s. The finalization of NIST FIPS 203, 204, and 205, combined with evolving European regulations (NIS2, DORA, eIDAS 2.0), has created both the technical standards and the regulatory impetus for organizations to act.
From my perspective as a security practitioner who has spent nearly two decades evaluating cryptographic implementations in banking, government, and critical infrastructure environments, the key message is clear: the migration will be complex, time-consuming, and expensive — but it is not optional. The quantum threat is not a question of “if” but of “when,” and the HNDL threat means that data transmitted today using quantum-vulnerable algorithms is already at risk.
Organizations that begin planning and testing now — conducting cryptographic inventories, enabling hybrid TLS, updating HSM infrastructure, and integrating PQC into their compliance frameworks — will navigate this transition far more effectively than those that wait for regulatory mandates or quantum breakthroughs to force their hand.
The standards are ready. The regulatory framework is forming. The migration must begin.
About the Author
Teodor Lupan is the Founder and Managing Director of SafeByte Consulting SRL, a Romanian cybersecurity firm specializing in penetration testing, security auditing, and vulnerability assessments for banking, government, and critical infrastructure clients. With 19+ years of experience and over 1,000 international security evaluation projects, Teodor holds CISA, GCPN, GICSP, OSCP, OSWP, and CISSO certifications and is a certified cybersecurity auditor under the NIS Directive. He has published research on ICS/SCADA security (Intelligence Info, 2024) and co-authored an IEEE conference paper on mobile malware analysis (SpeD 2019). He has contributed as a Subject Matter Expert to the development of the EC-Council Certified Ethical Hacker (CEH) examination, and has led international cyber defense exercises.
Contact: teodor.lupan@safebyte.io | www.safebyte.io