Journal · 03 / Quantum Computing
The investor's quantum reading
An honest reckoning with where quantum computing actually is in 2026, what the next milestones look like, and where the durable value is most likely to sit.
Quantum computing is the technology investors are most likely to miss, and the most likely to be wrong about if they don’t. The field’s commercial timeline is genuinely longer than the median venture window, but the platforms and intellectual property staked out before commercial relevance arrives will define who participates when it does.
This is an attempt at an honest read.
What is actually true today
There are roughly five hardware modalities competing for primacy: superconducting (IBM, Google, Rigetti), trapped-ion (IonQ, Quantinuum), neutral-atom (QuEra, Atom Computing, Pasqal), photonic (PsiQuantum, Xanadu), and silicon-spin (Intel, several research-stage groups). Each has trade-offs in coherence time, gate fidelity, connectivity, and manufacturability. None has clearly won.
The single most important number in the field right now is the logical qubit count, not the physical qubit count. A logical qubit — one error-corrected unit of quantum information — currently requires somewhere between several hundred and several thousand physical qubits depending on the architecture and the target error rate. The major players are demonstrating their first few logical qubits this year. The threshold at which useful problems become tractable is somewhere between one hundred and a few thousand logical qubits, depending on the algorithm. We are an order of magnitude away on the optimistic end.
This is meaningful. It means that current quantum hardware does not yet do useful work, and won't for several more years. It also means that the engineering gap between today and that point is well-understood and shrinking.
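The physical-to-logical overhead can be sketched with a back-of-envelope calculation. Under the standard surface-code assumptions — roughly 2d² physical qubits per logical qubit at code distance d, with the logical error rate falling as (p/p_th)^((d+1)/2) below the threshold — the hundreds-to-thousands range quoted above falls out directly. The constants here are illustrative, not any vendor's figures:

```python
# Back-of-envelope surface-code overhead, under textbook assumptions:
# a distance-d surface code uses roughly 2*d^2 physical qubits per
# logical qubit, and the logical error rate scales as
# (p / p_th) ** ((d + 1) / 2) for physical error rate p below the
# threshold p_th. All constants are illustrative.

def physical_qubits(logical_qubits: int, p: float = 1e-3,
                    p_th: float = 1e-2, target: float = 1e-10) -> int:
    """Estimate physical qubits needed to hit a target logical error rate."""
    d = 3
    while (p / p_th) ** ((d + 1) / 2) > target:
        d += 2  # surface-code distances are odd
    return logical_qubits * 2 * d * d

# At these (assumed) error rates, one logical qubit costs on the order
# of a thousand physical qubits, and the cost scales linearly from there.
for n in (1, 100, 1000):
    print(n, physical_qubits(n))
```

Better physical error rates pull the required distance down sharply, which is why gate fidelity, not raw qubit count, is the number to watch on the hardware roadmaps.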
Where useful problems live
When the hardware arrives, the first applications are likely to cluster in four areas:
- Quantum chemistry and materials science. Simulating molecular behaviour is the original motivating use case and remains the most defensible. Catalyst design, battery chemistry, and drug discovery all sit downstream.
- Optimisation. Logistics, portfolio construction, and certain classes of scheduling problems where classical approximations leave value on the table.
- Machine learning. Quantum-enhanced training and sampling for specific model classes; this is the noisiest area and the one most prone to overclaiming.
- Cryptanalysis. The reason every government and every serious institution is running a post-quantum cryptography migration right now.
The post-quantum cryptography deadline is the most underrated near-term consequence of the field, because it is happening now, regardless of when working hardware actually arrives. NIST standardisation has shipped, regulators are issuing transition timelines, and companies that have not begun migrating their identity, signing, and TLS infrastructure are building risk that compounds quietly.
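The first concrete step of that migration is usually a cryptographic inventory: list every system, record which algorithm it depends on, and triage. A minimal sketch of that triage, using the algorithm families from the shipped NIST standards (FIPS 203 ML-KEM, FIPS 204 ML-DSA, FIPS 205 SLH-DSA) — the inventory entries themselves are hypothetical examples:

```python
# Illustrative triage for a post-quantum migration inventory.
# Shor's algorithm breaks the factoring- and discrete-log-based
# schemes; the lattice- and hash-based NIST standards survive.

QUANTUM_VULNERABLE = {"RSA", "DSA", "DH", "ECDSA", "ECDH", "Ed25519"}
PQC_READY = {"ML-KEM", "ML-DSA", "SLH-DSA"}  # FIPS 203 / 204 / 205

def triage(inventory: dict[str, str]) -> dict[str, list[str]]:
    """Sort systems by whether their algorithms survive a quantum adversary."""
    report: dict[str, list[str]] = {"migrate": [], "ok": [], "unknown": []}
    for system, algo in inventory.items():
        if algo in QUANTUM_VULNERABLE:
            report["migrate"].append(system)
        elif algo in PQC_READY:
            report["ok"].append(system)
        else:
            report["unknown"].append(system)
    return report

# Hypothetical estate: TLS endpoints, code signing, one upgraded KEM pilot.
estate = {
    "api-gateway TLS": "ECDSA",
    "firmware signing": "RSA",
    "internal KEM pilot": "ML-KEM",
}
print(triage(estate))
```

The "migrate" bucket is the point: signing and identity infrastructure sits there today at most organisations, and anything encrypted now can be harvested and decrypted later, which is why the calendar does not wait for working hardware.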
What we look at
We watch the hardware roadmaps for early signs of an architectural favourite without committing to one prematurely. We are interested in the layer above the hardware — error correction, compilers, libraries, the developer experience — because the firm that owns the abstraction layer when the hardware finally lands captures disproportionate value. We pay particular attention to post-quantum cryptography and the firms helping organisations through the migration; this is real work that has to happen on a calendar that does not depend on quantum hardware existing yet.
The single biggest mistake an investor can make in quantum is to treat it as either imminent or imaginary. It is neither. It is a ten-to-fifteen-year programme with a known shape, known milestones, and a small number of firms quietly accumulating the IP that will matter when it arrives.