What is Post-Quantum Cryptography
Post-Quantum Cryptography (PQC) refers to cryptographic algorithms that run on today’s classical computers but are designed to withstand attacks from future quantum computers. It is exciting because it protects long-lived data that could be harvested today and decrypted years from now in a “harvest now, decrypt later” scenario. It has also become a buzzword as companies rush to appear quantum-ready even though the threat timeline remains uncertain.
What is the adoption maturity
Adoption is emerging but uneven. NIST has finalised its first PQC standards (ML-KEM, ML-DSA, and SLH-DSA), early pilots are underway, and crypto-agility programmes are growing. Most organisations are still mapping where cryptography is used and assessing compatibility. Full-scale rollout remains rare due to performance impacts, tooling limitations, and gaps in ecosystem readiness.
What are the barriers to adoption
- Legacy systems with deeply embedded cryptography that resist modification.
- Performance overhead of PQC algorithms in latency-sensitive systems (a rough size comparison follows this list).
- Unclear regulatory timelines and fragmented global guidance.
- Lack of organisation-wide crypto-asset inventories.
- Complexity integrating PQC across multi-cloud and hybrid estates.
- Limited vendor readiness and immature PQC toolchains.
- Skills shortages in applied cryptography and crypto-agility engineering.
- High migration cost before quantum threats fully materialise.
- Interoperability challenges between existing and hybrid PQC protocols.
- Limited hardware acceleration for production-scale PQC workloads.
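To make the performance-overhead barrier concrete, the sketch below compares public-key and signature sizes for a classical signature scheme against ML-DSA-44 (FIPS 204). The figures are the published parameter sizes, not measured benchmarks, and the three-certificate chain is an illustrative assumption.

```python
# Back-of-the-envelope view of the size overhead behind PQC signatures.
# Figures are published parameter sizes (Ed25519; ML-DSA-44 from FIPS 204),
# used here as rough indicators rather than benchmarks.
SIZES = {
    "Ed25519":   {"public_key": 32,   "signature": 64},
    "ML-DSA-44": {"public_key": 1312, "signature": 2420},
}

def chain_overhead(certs_in_chain: int = 3) -> None:
    """Print roughly how many key/signature bytes a certificate chain carries."""
    for name, s in SIZES.items():
        per_cert = s["public_key"] + s["signature"]
        print(f"{name:>9}: ~{per_cert} B per certificate, "
              f"~{per_cert * certs_in_chain} B across a {certs_in_chain}-certificate chain")

chain_overhead()
```

Roughly an order-of-magnitude increase per certificate is what drives the latency, bandwidth, and hardware concerns above, particularly on constrained devices and high-volume TLS terminators.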
Specific use cases where it works
- Lattice-based PQC securing satellite communications, using hybrid key exchange to deliver quantum-resilient links (see the hybrid key-exchange sketch after this list).
- PQC-enabled TLS for financial services protecting high-risk, long-lived transaction archives.
- PQC-strengthened certificate authorities defending digital identity infrastructure.
- PQC-secured firmware update channels for medical IoT devices ensuring future-proof integrity.
- PQC upgrade of cloud KMS systems enabling quantum-safe key management for regulated workloads.
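A minimal sketch of the hybrid key-exchange pattern mentioned above: a classical X25519 secret and an ML-KEM shared secret are combined through a KDF, so traffic keys stay safe unless both algorithms are broken. X25519 and HKDF come from the widely used `cryptography` package; `ml_kem_encapsulate` is a hypothetical placeholder for whatever ML-KEM implementation (for example a liboqs binding) your stack provides.

```python
# Hybrid key exchange sketch: combine a classical X25519 secret with a
# post-quantum ML-KEM secret before deriving traffic keys.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey, X25519PublicKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives.serialization import Encoding, PublicFormat


def ml_kem_encapsulate(peer_encaps_key: bytes) -> tuple[bytes, bytes]:
    """Hypothetical placeholder: return (ciphertext, shared_secret) from an ML-KEM-768 library."""
    raise NotImplementedError("swap in your PQC library's ML-KEM-768 encapsulation")


def hybrid_client_key_material(server_x25519_pub: X25519PublicKey,
                               server_mlkem_encaps_key: bytes) -> tuple[bytes, bytes, bytes]:
    """Client side: derive a traffic key plus the values the server needs to do the same."""
    # Classical part: ephemeral X25519 exchange against the server's public key.
    ephemeral = X25519PrivateKey.generate()
    classical_secret = ephemeral.exchange(server_x25519_pub)

    # Post-quantum part: encapsulate against the server's ML-KEM encapsulation key.
    kem_ciphertext, pq_secret = ml_kem_encapsulate(server_mlkem_encaps_key)

    # Combine both secrets; an attacker must break X25519 AND ML-KEM to recover the key.
    traffic_key = HKDF(
        algorithm=hashes.SHA256(),
        length=32,
        salt=None,
        info=b"hybrid x25519+ml-kem-768 demo",
    ).derive(classical_secret + pq_secret)

    ephemeral_pub_bytes = ephemeral.public_key().public_bytes(Encoding.Raw, PublicFormat.Raw)
    return traffic_key, ephemeral_pub_bytes, kem_ciphertext
```

This mirrors, in simplified form, how hybrid TLS 1.3 groups such as X25519MLKEM768 concatenate both shared secrets before key derivation, which is why a hybrid link degrades gracefully if one component is later weakened.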
Specific use cases where it doesn’t work
- PQC in ultra-low-latency trading systems where handshake overhead breaks service-level targets.
- PQC on legacy industrial controllers lacking compute resources to support algorithm load.
- PQC in consumer mobile apps where battery and performance impact causes abandonment.
- Early PQC in automotive ECUs failing real-time validation due to processing constraints.
- PQC in smart-card payment terminals failing certification because standards remain in flux.
What questions you need to ask yourself before considering adoption over the next 12 months
- Which of our systems store data that must remain secure for 10–20 years?
- Do we have a full inventory of where cryptography is embedded across our estate? (A minimal certificate-scanning sketch follows this list.)
- Is our architecture crypto-agile enough to rotate algorithms without disruption?
- Are our vendors aligned with NIST PQC selections and timelines?
- What are the performance implications of PQC on our critical workloads?
- How will PQC interact with identity, KMS, and PKI systems we already run?
- What regulatory expectations are emerging in our sector around quantum readiness?
- What is the cost of delaying migration versus acting now?
- Do we have governance for ongoing algorithm updates and hybrid deployments?
- Are we prepared for PQC-ready hardware requirements over the next two cycles?
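As a starting point for the inventory question above, here is a minimal sketch that scans a directory of PEM certificates and reports the public-key algorithm and expiry, so quantum-vulnerable RSA/ECC material can be flagged for migration planning. It assumes a recent `cryptography` release and a hypothetical `./certs` directory; a real inventory also has to cover TLS endpoints, SSH, code signing, and keys held in applications and HSMs.

```python
# Minimal crypto-asset inventory pass over PEM certificates on disk.
from pathlib import Path
from cryptography import x509
from cryptography.hazmat.primitives.asymmetric import rsa, ec, ed25519


def inventory(cert_dir: str) -> None:
    """Report the public-key algorithm and expiry of every *.pem certificate found."""
    for pem in Path(cert_dir).glob("*.pem"):
        cert = x509.load_pem_x509_certificate(pem.read_bytes())
        key = cert.public_key()
        if isinstance(key, rsa.RSAPublicKey):
            algo = f"RSA-{key.key_size}"          # quantum-vulnerable
        elif isinstance(key, ec.EllipticCurvePublicKey):
            algo = f"EC-{key.curve.name}"         # quantum-vulnerable
        elif isinstance(key, ed25519.Ed25519PublicKey):
            algo = "Ed25519"                      # quantum-vulnerable
        else:
            algo = type(key).__name__
        print(f"{pem.name}: {algo}, expires {cert.not_valid_after_utc:%Y-%m-%d}")


inventory("./certs")  # hypothetical path
```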
Positive case study
In 2025, Accenture Ventures invested in QuSecure, building on earlier joint work that included a multi-orbit satellite link secured with NIST-aligned PQC and a quantum-safe banking pilot with Banco Sabadell. The collaboration demonstrated crypto-agility at scale and validated PQC for mission-critical communications and financial data protection.
Negative case study
Recent Google and Cloudflare experiments integrating post-quantum key exchange into TLS revealed significant handshake overhead and operational complexity, including middleboxes that mishandled the larger handshake messages. While successful in principle, they showed that PQC cannot simply be dropped into internet-scale systems without architectural redesign, offering valuable lessons on performance and compatibility limits.
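For a sense of scale, the arithmetic below estimates the extra key-exchange bytes a hybrid X25519+ML-KEM-768 handshake carries compared with classical X25519 alone, using the published FIPS 203 parameter sizes. These are illustrative figures, not measurements from the Google or Cloudflare experiments themselves.

```python
# Rough arithmetic behind the handshake-overhead finding: key-exchange bytes
# sent in a TLS 1.3 handshake for classical X25519 versus a hybrid
# X25519+ML-KEM-768 group (published parameter sizes, not measurements).
X25519_PUBKEY = 32          # client and server each send one
MLKEM768_ENCAPS_KEY = 1184  # client key share (encapsulation key)
MLKEM768_CIPHERTEXT = 1088  # server key share (ciphertext)

classical = 2 * X25519_PUBKEY
hybrid = (X25519_PUBKEY + MLKEM768_ENCAPS_KEY) + (X25519_PUBKEY + MLKEM768_CIPHERTEXT)

print(f"classical X25519 key shares: {classical} B")        # 64 B
print(f"hybrid X25519+ML-KEM-768 key shares: {hybrid} B")   # 2336 B
print(f"added per handshake: ~{hybrid - classical} B")      # ~2272 B
```

The added couple of kilobytes can push a ClientHello across packet boundaries, which is where much of the reported middlebox and compatibility trouble surfaced.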



