Adiabatic Quantum Computing (AQC) and Its Impact on Cybersecurity
Introduction
When we discuss quantum computing, we most often refer to Universal Quantum Computing, also known as Gate-Based Quantum Computing. This is the familiar model of quantum computing, which uses quantum gates to perform operations on qubits in a way analogous to how classical computers manipulate classical bits. This flavor of quantum computing is known as “universal” because, in theory, it can run any quantum algorithm and perform any computation a classical computer can, potentially much faster for certain types of problems.
That’s not the only model of quantum computing, however. But let’s start from the beginning.
Factorization and Classical Computers
The integer factorization problem asks us to decompose an integer N into its prime factors. Finding these prime factors is believed to be computationally hard. In other words, there is no simple formula to find such prime factors, and every known factoring algorithm requires an enormous number of operations for large N. This hardness has not been mathematically proven, but it is so well supported by decades of effort that it serves as the basic hardness assumption for many encryption methods in use today. The fastest known classical algorithm for integer factorization is the general number field sieve (GNFS), whose running time grows sub-exponentially, yet still far faster than any polynomial, in the size of the integer to be factored. Adding bits to the integer N therefore inflates the computing requirement very quickly. For example, factoring RSA-2048, with its 2048-bit key, would take on the order of billions of years of processing on a classical computer. The precise number depends on the computer’s processing power and various algorithm optimizations, but in any case it is somewhat beyond the patience of any adversary that might target you.
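For a rough sense of scale, here is a small back-of-the-envelope Python sketch, my own illustration rather than anything from the literature, that plugs a 2048-bit modulus into the standard GNFS heuristic running-time formula and assumes a rate of 10^9 operations per second:

```python
# Back-of-the-envelope estimate of GNFS effort for a 2048-bit modulus.
# The 10^9 ops/sec rate and the omitted o(1) term in the formula make this
# an order-of-magnitude sketch, not a precise prediction.
import math

bits = 2048
ln_N = bits * math.log(2)                     # natural log of a 2048-bit modulus
ops = math.exp((64 / 9) ** (1 / 3)
               * ln_N ** (1 / 3)
               * math.log(ln_N) ** (2 / 3))   # GNFS heuristic operation count

years = ops / 1e9 / (3600 * 24 * 365)         # assume 10^9 operations per second
print(f"roughly {years:.1e} years")           # on the order of 5e+18 years
```

Even with far more generous hardware assumptions, the answer stays in the billions of billions of years, which is where the “billions of years” shorthand above comes from.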
On a related note, if you want to learn more about the development of (classical) factoring algorithms and the number field sieve, Carl Pomerance’s paper “A tale of two sieves” is a beautiful introduction to the topic.
Universal Quantum Computing
Enter quantum computing, which offers a potential game-changer for this scenario. Quantum computing operates under entirely different principles from classical computing, utilizing properties like superposition and entanglement. This allows it to perform many calculations in parallel, dramatically speeding up certain processes, including integer factorization.
Quantum algorithms have been shown, in theory, to reduce the number of operations required to solve certain problems, including integer factorization. The quantum methods for factoring are probabilistic, returning the correct factors with high probability rather than with the certainty of deterministic classical methods, but a candidate answer is trivial to verify by multiplication.
Shor’s algorithm, the first developed and perhaps the best-known quantum algorithm for integer factorization, was introduced by Peter Shor, a mathematician working at AT&T’s Bell Labs, in his 1994 paper, “Algorithms for Quantum Computation: Discrete Logarithms and Factoring.” The algorithm requires only a polynomial number of operations in the bit length of the integer, so it can factor large integers dramatically faster than the general number field sieve (GNFS); on a sufficiently large, error-corrected quantum computer, the billions of years quoted above would shrink to a matter of hours.
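To show where that speedup comes from, here is a toy Python sketch of the structure of Shor’s algorithm, with the quantum period-finding subroutine replaced by classical brute force so it only handles tiny numbers; the function names and layout are my own illustration, not Shor’s presentation:

```python
# Toy sketch of Shor's algorithm: the quantum speedup lives entirely in
# find_order(), which is brute-forced here and therefore exponential.
from math import gcd
from random import randrange

def find_order(a, N):
    """Smallest r > 0 with a^r = 1 (mod N). A quantum computer performs this
    step in polynomial time; classically it is the expensive part."""
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_factor(N):
    while True:
        a = randrange(2, N)
        d = gcd(a, N)
        if d > 1:                       # lucky guess: a already shares a factor
            return d, N // d
        r = find_order(a, N)
        if r % 2 == 0 and pow(a, r // 2, N) != N - 1:
            p = gcd(pow(a, r // 2, N) - 1, N)
            if 1 < p < N:
                return p, N // p        # factors recovered from the period

print(shor_factor(15))                  # e.g. (3, 5)
```

Everything outside find_order is inexpensive classical pre- and post-processing; the entire quantum advantage lies in performing that period finding in polynomial time.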
Shor’s algorithm demonstrated that quantum computers can provide a shortcut to breaking all encryption schemes that rely on the difficulty of factoring large numbers into their prime factors, such as the widely used RSA encryption.
Researchers have continued to improve upon Shor’s algorithm and have demonstrated the algorithm on various flavors of quantum computers, albeit by factoring much smaller integers than those currently used for encryption.
On the algorithmic side, Gidney and Ekerå achieved a notable optimization in 2019, estimating that with their approach “only” 20 million noisy qubits would be needed to factor a 2,048-bit RSA integer. On the hardware side, today’s quantum computers offer only a few hundred noisy, error-prone qubits that lose their coherence in milliseconds. These capabilities are nowhere close to challenging RSA-2048. Yet.
In discussions about when quantum computing will become a significant risk to cryptography, we often focus on the progress of universal quantum computing and attempt to extrapolate it to predict when a quantum computer could realistically challenge RSA-2048 encryption, with RSA-2048 often cited in these discussions as the first cryptographically relevant target. However, this approach might be misleading.
Adiabatic Quantum Computing (AQC) and Quantum Annealing
Adiabatic Quantum Computing (AQC), and its variant Quantum Annealing, form another model of quantum computation. AQC was proposed by Farhi et al., whose 2001 paper “A Quantum Adiabatic Evolution Algorithm Applied to Random Instances of an NP-Complete Problem” applied the idea to random instances of an NP-complete problem. In practice it is used as a specialized approach focused on solving optimization problems, that is, finding the minimum (or maximum) of a given function over a set of possible solutions. For problems that can be expressed as optimization problems, such as 3-SAT, quantum database search, and, yes, the factoring problem we are worried about, quantum annealers have shown promise in handling instances that classical computers struggle with.
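To make the mechanism slightly more concrete, the evolution is usually written as a slow interpolation between a “driver” Hamiltonian, whose ground state is easy to prepare, and a “problem” Hamiltonian that encodes the cost function; this is a generic textbook formulation, not anything specific to D-Wave or to the Farhi et al. paper:

```latex
% Generic adiabatic interpolation (standard textbook notation, used here for illustration)
H(s) \;=\; (1 - s)\, H_{B} \;+\; s\, H_{P}, \qquad s: 0 \to 1 \text{ slowly}
% H_B: driver Hamiltonian, e.g. a transverse field H_B = -\sum_i \sigma^{x}_{i},
%      whose ground state is easy to prepare
% H_P: problem Hamiltonian, diagonal in the computational basis, whose
%      ground state encodes the minimum of the cost function
```

By the adiabatic theorem, if s changes slowly enough relative to the minimum energy gap encountered along the way, the system stays in its instantaneous ground state and finishes in a ground state of H_P, which is an optimal solution to the original problem; quantum annealing runs the same schedule on noisy hardware without those strict guarantees.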
The evolution of AQC and Quantum Annealing is progressing at its own pace. Vendors such as D-Wave have already deployed quantum annealers commercially to tackle specific, narrow optimization problems that are hard to handle efficiently with classical computing alone.
When predicting Q-Day—the day when quantum computers will be able to break current public key encryption algorithms—in addition to following the progress in Universal Quantum Computing, we must also consider the rapid development of adiabatic quantum computing (AQC) hardware and the evolution of algorithms that could potentially challenge RSA-2048 in a manner suitable for AQC. Researchers have been exploring methods to transform factorization into an optimization problem, which could circumvent the need for Shor’s algorithm. One of the earliest attempts to reframe factorization as an optimization problem was published by Microsoft in 2002, in their paper titled “Factoring as Optimization.”
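To make the reframing concrete, here is a minimal, self-contained Python sketch, my own illustration rather than the construction from “Factoring as Optimization” or the annealing papers, that expresses factoring as minimizing a cost function over binary variables, with exhaustive search standing in for the annealer:

```python
# Recast factoring N = p * q as minimizing a cost over binary variables,
# the kind of formulation (QUBO/Ising) a quantum annealer accepts.
# Exhaustive search over ~2^18 assignments stands in for the annealer here.
from itertools import product

N = 1_005_973      # the 20-bit integer factored via annealing in the 2019 paper
FREE_BITS = 9      # assume each odd factor fits in 10 bits (lowest bit fixed to 1)

def decode(bits):
    """Map binary variables to an odd integer: 1 + sum(b_i * 2^(i+1))."""
    return 1 + sum(b << (i + 1) for i, b in enumerate(bits))

def cost(assignment):
    """Zero exactly when the decoded p and q satisfy p * q = N."""
    p = decode(assignment[:FREE_BITS])
    q = decode(assignment[FREE_BITS:])
    return (N - p * q) ** 2

best = min(product((0, 1), repeat=2 * FREE_BITS), key=cost)
p, q = decode(best[:FREE_BITS]), decode(best[FREE_BITS:])
print(p, q, p * q == N)    # prints the factors 997 and 1009 (in some order)
```

A real annealing formulation rewrites the same cost as an Ising or QUBO Hamiltonian whose ground state encodes the factors; the hard part, and the focus of the research discussed next, is doing so with as few qubits and as little precision as possible.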
Subsequent research has continued to refine these algorithms, progressing alongside hardware development. Most notably, in 2019, a team of Chinese scientists published a paper titled “Factoring larger integers with fewer qubits via quantum annealing with optimized parameters.” In this study, they successfully factored the number 1,005,973—a 20-bit number—using only 89 noisy qubits. In contrast, using Shor’s algorithm for the same calculation would require many thousands of noisy qubits. Their findings suggest that quantum annealers, such as those produced by D-Wave, could pose a more immediate threat to RSA encryption than universal, gate-based quantum computers, which are still years away from challenging RSA.
For reference, D-Wave’s latest Advantage line of quantum computers will have 5,640 qubits.
Given that gate-based, universal quantum computers require comprehensive error correction to factor large integers using Shor’s algorithm, a capability still beyond our current technical reach, it appears that AQC and quantum annealers might soon present a more immediate risk to cryptographic security.