Quantum security is already an operational issue thanks to “harvest now, decrypt later” threats
QKD, PQC and classical systems will coexist — and the weakest link sets the security bar
Quantum safety demands continuous testing
Quantum computing has an image problem. It’s either portrayed as an imminent cryptographic apocalypse or as a sci-fi engine about to invent wormholes and time machines. Somewhere between those extremes lies the real story: quantum technologies are already reshaping security strategy, infrastructure planning and operational design.
In an effort to cut through the noise, Fierce Network spoke with Dr. Sameh Yamany, CTO of Viavi Solutions. The conversation wasn’t about speculative breakthroughs decades away, but about what network operators, security leaders and infrastructure vendors need to do today.
The takeaway? Even though Q-Day — the moment quantum computers break today’s public-key cryptography — is still years away, organizations can’t put off making a quantum plan.
The reality of quantum is more nuanced and more urgent. “Harvest now, decrypt later” threats mean sensitive data being captured today could be cracked in the future. Post-quantum cryptography (PQC) is arriving, quantum key distribution (QKD) is moving from lab to field, and hybrid environments combining classical and quantum techniques are becoming inevitable.
At the center of all of this is something less glamorous but absolutely essential: test and measurement. If quantum is going to transition from research experiment to operational backbone, someone has to validate it, optimize it and continuously monitor it across layers — from single photons in fiber to cryptographic negotiation logic in software.
That’s where Viavi is focused. A full transcript of the conversation is below. Readers can catch a glimpse of Viavi’s quantum tech – including actual photons that you can see – at Mobile World Congress 2026 in Barcelona.
Steve Saunders: We’re hearing a lot about quantum right now. I’d love to get beyond the hype and talk about how people are actually implementing and testing this technology.
Sameh Yamany: We are also hearing a lot about quantum — about Q-Day, about cryptography breaking. But test and measurement is the governance layer that turns experimental quantum into operational capability.
Post-quantum cryptography (PQC) introduces new algorithms, larger keys and different handshakes. That has a performance impact. It may be more secure, but it can require more computational resources. Testing helps organizations understand that impact, validate implementations and optimize performance.
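As a rough illustration of that impact (not Viavi's methodology), the sketch below compares the key-share bytes a classical key exchange puts on the wire against a hybrid classical-plus-PQC exchange, using the published parameter sizes for X25519 (RFC 7748) and ML-KEM-768 (FIPS 203). The handshake model is deliberately simplified and ignores TLS framing.

```python
# Sketch: estimate the extra bytes a hybrid PQC key exchange carries,
# using published sizes: X25519 shares are 32 B each way (RFC 7748);
# ML-KEM-768 has a 1184 B encapsulation key and a 1088 B ciphertext
# (FIPS 203). Real TLS adds framing and extension overhead on top.

SIZES = {
    "x25519":   {"client_share": 32,   "server_share": 32},
    "mlkem768": {"client_share": 1184, "server_share": 1088},
}

def key_exchange_bytes(groups):
    """Total key-share bytes sent in both directions for the given groups."""
    return sum(SIZES[g]["client_share"] + SIZES[g]["server_share"] for g in groups)

classical = key_exchange_bytes(["x25519"])
hybrid    = key_exchange_bytes(["x25519", "mlkem768"])
print(f"classical: {classical} B, hybrid: {hybrid} B, "
      f"overhead: {hybrid - classical} B")
```

Larger keys also mean more CPU cycles per handshake, which is exactly the kind of effect pre-deployment testing is meant to quantify.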
With quantum key distribution (QKD), the challenges move into the physical layer. Now you’re dealing with photonics, single-photon detection, attenuation, entropy and distance limitations. Those constraints must be tested before deployment and then continuously monitored in operation.
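To see why distance is such a hard constraint, a back-of-the-envelope model helps: fiber loss is roughly 0.2 dB/km for standard single-mode fiber, and QKD key rate falls with the fraction of photons that survive the span. The numbers below (pulse rate, detector efficiency, sifting factor) are illustrative assumptions, not measurements.

```python
# Toy model of QKD key rate vs. fiber distance. Assumes 0.2 dB/km loss
# (typical for standard single-mode fiber at 1550 nm) and a simplified
# sifted-rate formula; real systems also lose bits to error correction
# and privacy amplification.

ATTENUATION_DB_PER_KM = 0.2

def transmittance(km):
    """Fraction of photons surviving a fiber span of the given length."""
    loss_db = ATTENUATION_DB_PER_KM * km
    return 10 ** (-loss_db / 10)

def sifted_rate(km, pulse_rate_hz=1e9, detector_eff=0.2, sift_factor=0.5):
    """Rough sifted-key rate (bit/s): pulses that survive the fiber,
    get detected, and remain after basis sifting."""
    return pulse_rate_hz * transmittance(km) * detector_eff * sift_factor

for km in (10, 50, 100, 200):
    print(f"{km:>3} km: transmittance {transmittance(km):.2e}, "
          f"~{sifted_rate(km):,.0f} bit/s sifted")
```

The exponential decay is the point: at 100 km only about 1% of photons arrive, which is why continuous monitoring of attenuation directly matters for key availability.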
Increasingly, environments are hybrid. QKD solves one problem. PQC solves another. Classical cryptography is still present. The weakest link determines the overall security posture. Testing now means validating not just cryptographic primitives, but entire system behavior — fallback paths, negotiation logic, rekey timing and performance under real conditions.
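The weakest-link point can be made concrete with a toy model: rate each segment of an end-to-end path by its post-quantum security category (0 meaning breakable by a quantum adversary) and take the minimum. The segment names and levels here are illustrative, not a real assessment of any network.

```python
# Toy model of "the weakest link sets the security posture": rate each
# segment by a post-quantum security category (0 = quantum-breakable),
# then take the minimum over the path. Illustrative values only.

SEGMENTS = {
    "access leg (classical ECDH only)":  0,  # harvest-now, decrypt-later risk
    "metro leg (QKD-protected)":         5,
    "core leg (hybrid PQC, ML-KEM-768)": 3,
}

def effective_level(segments):
    """An end-to-end path is only as strong as its weakest segment."""
    return min(segments.values())

weakest = min(SEGMENTS, key=SEGMENTS.get)
print(f"effective level: {effective_level(SEGMENTS)} (set by {weakest})")
```

One classical-only hop leaves the whole path at level 0, which is why testing has to cover fallback paths and negotiation logic, not just the strongest components.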
Saunders: Is this as exotic as it sounds, or are you still looking at traditional metrics like performance and compliance?
Yamany: The methods may be exotic. The goals are not.
We are still trying to ensure secure communication that applications can rely on. We are measuring performance, validating security and enabling innovation on top. The mechanisms differ — single photons, entanglement and advanced mathematics — but the objective remains the same: predictable, secure performance.
Performance will be impacted by quantum. Larger keys and more complex algorithms change system behavior. The role of testing is to quantify and optimize that impact.
Saunders: How does testing change in hybrid environments?
Yamany: Traditionally, networks were tested layer by layer — fiber integrity, transport stability and application performance validated independently. Quantum changes that model.
In a hybrid system, a physical-layer event, such as signal attenuation in QKD, can affect key rate. That affects cryptographic operations, which can impact overall system throughput. Testing must now span physical, computational and application layers simultaneously.
That is where digital twin concepts and realistic emulation environments become important, enabling end-to-end performance validation before and during deployment.
Saunders: What advice would you give organizations preparing their security strategies around quantum?
Yamany: First, understand that Q-Day is not the only milestone. The risk may begin years earlier. “Harvest now, decrypt later” means sensitive data captured today could be vulnerable in the future. Organizations should inventory their data and assess its lifespan. Some information is at risk now.
Second, design for crypto agility. Systems must be able to adapt — changing key lifetimes, algorithms and configurations in response to traffic patterns and threat models.
Third, plan for operational impact. Performance will change. Security teams and SecOps organizations will need to adjust workflows accordingly.
Testing becomes continuous. This is no longer just pre-deployment certification. It is ongoing validation.
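Crypto agility, the second point above, is ultimately a design pattern: algorithm choices live in policy, not in code. The sketch below shows one minimal shape that pattern could take; the names (`CryptoPolicy`, `negotiate`) and the preference ordering are hypothetical illustrations, not any vendor's API.

```python
# Sketch of crypto agility as configuration: an ordered preference list
# is negotiated against what the peer supports, so swapping algorithms
# later means editing the policy, not the protocol code. All names here
# are illustrative.

from dataclasses import dataclass, field

@dataclass
class CryptoPolicy:
    # Ordered by preference: hybrid first, then pure PQC, then classical.
    key_exchange: list = field(default_factory=lambda: [
        "X25519MLKEM768", "ML-KEM-768", "X25519"])
    max_key_lifetime_s: int = 3600  # force periodic rekeying

def negotiate(policy, peer_supported):
    """Pick the first locally preferred algorithm the peer also supports."""
    for alg in policy.key_exchange:
        if alg in peer_supported:
            return alg
    raise ValueError("no mutually supported key exchange")

policy = CryptoPolicy()
# Peer lacks the hybrid group, so negotiation falls back to pure ML-KEM.
print(negotiate(policy, {"X25519", "ML-KEM-768"}))
```

A policy object like this is also what continuous testing exercises: validating that fallback lands on an acceptable algorithm, and that rekey timing honors the configured lifetime.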
Saunders: How important is collaboration in advancing quantum security?
Yamany: Quantum security development is not happening in isolation. Organizations are working with standards bodies on PQC algorithms and partnering with vendors integrating quantum-safe mechanisms into firewalls, routers and switches. Governments are establishing compliance labs aligned with sovereign security requirements.
There is also engagement with research institutions to remain aligned with emerging innovation. Quantum readiness spans physics, mathematics, policy and operations, so ecosystem collaboration is necessary.
Saunders: There is increasing discussion about the relationship between quantum and AI. How do you see that evolving?
Yamany: There is growing interest in AI accelerating aspects of quantum development — optimizing error correction, improving materials discovery, tuning control systems and accelerating research cycles.
At the same time, quantum computing may eventually accelerate certain AI workloads, particularly optimization and probabilistic sampling problems.
It is not commercially mature yet, but there is active exploration of how the two fields may influence each other.
Saunders: Where are we today in practical terms?
Yamany: Quantum is moving into implementation. Some countries are instrumenting networks with quantum-ready fiber infrastructure. Cloud-based quantum computing services are emerging, and startups are building around these capabilities.
The focus now is on practical steps — inventorying data, redesigning architectures, anticipating performance tradeoffs and building crypto agility into systems, supported by continuous test and validation.
