The Quantum Advantage

Silicon-based computers will never be able to solve certain problems. That remains true even if the number of transistors on a chip keeps doubling every year or two, with corresponding gains in performance.

Merely combining these densely packed chips into multicore CPUs, and stringing those CPUs into super-grid platforms, does not change the fact that these processors work through problems more or less serially. And many large optimization problems would take more than 10 billion years to solve serially, even on a computer that evaluated a million candidate solutions per second.

A classic example of this optimization conundrum is the "traveling salesman problem": finding the shortest route that visits a given set of cities. When the salesperson's trip involves only five interconnected cities, there are just 12 distinct round-trip routes, which can be readily plotted and measured by hand, and the shortest route easily selected from the small list. But increase the number of cities even modestly, say from five to 25, and the number of possible routes explodes factorially (faster than exponentially), far beyond what a classical computer can check by brute force.
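The arithmetic behind that explosion is easy to check. The sketch below (plain Python; the helper name is just illustrative) counts the distinct round-trip routes through n fully connected cities, which works out to (n − 1)!/2 once you fix the starting city and ignore the direction of travel:

```python
from math import factorial

def tsp_route_count(n_cities: int) -> int:
    """Distinct closed tours through n cities: (n - 1)! / 2.
    Fix the start city, then halve because each loop can be
    traveled in two directions."""
    return factorial(n_cities - 1) // 2

print(tsp_route_count(5))    # 12 routes, small enough to check by hand
print(tsp_route_count(25))   # roughly 3.1 x 10^23 routes

# At one million routes checked per second, how long for 25 cities?
seconds = tsp_route_count(25) / 1_000_000
years = seconds / (60 * 60 * 24 * 365)
print(f"about {years:.1e} years")
```

Running this puts the 25-city brute-force search on the order of ten billion years at a million routes per second, which is exactly the scale of the serial-computation figure quoted above.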

Enter the quantum computer.

Classical computers store numbers, 0s and 1s, as electrical charges or directional magnetic fields determined by the flow of hundreds of electrons. (Each advance of Moore's Law reduces that number, with a natural limit of a single electron.) In contrast, quantum computers work with the state of individual subatomic particles, and harness three of their unique, strange characteristics: superposition, entanglement, and interference.

Superposition refers to the fact that a quantum bit, or qubit, can occupy a blend of its two values at once. Until it is measured, a qubit is in some combination of "0" and "1" simultaneously.
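A qubit's superposition can be mimicked classically with a pair of complex amplitudes. The sketch below (plain Python; the function names are illustrative, not any real quantum library's API) applies the standard Hadamard gate to a qubit that starts as a definite "0", leaving it with equal probability of reading out 0 or 1:

```python
from math import sqrt

# A qubit's state is a pair of complex amplitudes (a, b) for "0" and "1";
# the probabilities of measuring 0 or 1 are |a|^2 and |b|^2.
ket0 = (1 + 0j, 0 + 0j)                  # a definite, classical-like "0"

def hadamard(state):
    """Apply the Hadamard gate, which maps a definite "0"
    into an equal superposition of "0" and "1"."""
    a, b = state
    s = 1 / sqrt(2)
    return (s * (a + b), s * (a - b))

plus = hadamard(ket0)                    # equal superposition of 0 and 1
probs = [abs(amp) ** 2 for amp in plus]
print(probs)                             # approximately [0.5, 0.5]
```

Until the qubit is measured, both amplitudes coexist; measurement then yields 0 or 1, each half the time.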

Entanglement refers to the fact that the qubits in a co-generated pair remain linked no matter how far apart they are. Measure one qubit in New York, and the outcome of measuring its twin in Los Angeles is instantly determined, a correlation that holds even though no usable signal passes between them.
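The measurement statistics of such a pair can be sketched classically. This toy (plain Python; the names are illustrative) models only the correlation of outcomes, not the underlying quantum dynamics: for the standard Bell state, the only possible joint outcomes are 00 and 11, each with probability one half, so the two qubits always agree:

```python
from math import sqrt
from random import random

# Amplitudes of the Bell state over the basis 00, 01, 10, 11:
# only 00 and 11 carry any amplitude (1/sqrt(2) each).
bell = [1 / sqrt(2), 0.0, 0.0, 1 / sqrt(2)]
probs = [a * a for a in bell]            # approximately [0.5, 0, 0, 0.5]

def measure(joint_probs, rng=random):
    """Sample one joint outcome (bit_a, bit_b) from the distribution."""
    r, total = rng(), 0.0
    for i, p in enumerate(joint_probs):
        total += p
        if r < total:
            return i >> 1, i & 1         # split index into the two bits
    return 1, 1                          # guard against float rounding

# Every sampled pair agrees: measuring one qubit fixes the other's outcome.
print(all(a == b for a, b in (measure(probs) for _ in range(1000))))
```

The correlation by itself looks unremarkable; what makes real entanglement strange is that the agreement persists for measurements no pre-shared classical recipe can reproduce.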

Interference refers to the way a quantum algorithm shapes the amplitudes of a computation so that paths leading to wrong answers cancel out while paths leading to right answers reinforce one another, an essential characteristic. The reason this is important is that, at the minuscule subatomic level, merely examining a particle alters it: reading a qubit collapses its superposition to a single value. Interference lets the algorithm concentrate probability on the correct result before that single, destructive readout is made.
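Interference shows up in the smallest possible example: applying the Hadamard gate twice. The first application puts the qubit into an equal superposition; in the second, the two paths that lead to "1" cancel each other while the two paths to "0" reinforce, returning the qubit to a definite "0". A minimal sketch in plain Python (illustrative names, no quantum library assumed):

```python
from math import sqrt

def hadamard(state):
    """Hadamard gate on a single-qubit state (a, b)."""
    a, b = state
    s = 1 / sqrt(2)
    return (s * (a + b), s * (a - b))

ket0 = (1.0, 0.0)
once = hadamard(ket0)   # equal superposition: both amplitudes 1/sqrt(2)
twice = hadamard(once)  # the two "1" contributions cancel exactly

print(once)             # approximately (0.707, 0.707)
print(twice)            # approximately (1.0, 0.0): back to a definite "0"
```

The "1" amplitude after the second gate is s·(s − s) = 0: destructive interference has erased that outcome entirely, while the "0" amplitude builds back up to 1.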

This may seem inconceivable, but the key points are relatively simple: Quantum computers use the smallest things we know of in the universe to calculate and maintain state, and they can hold many alternative solutions to a problem in superposition at once. They also use very little energy.