The D-Wave quantum computer has been a focal point of great controversy in the academic and research communities. Back when I was doing my PhD in Quantum Optics and Quantum Information, and was therefore technically in the race to build a quantum computer, I have to admit I was mildly skeptical of D-Wave’s claim of having made the first commercial quantum computer.
This attitude was a result of two facts. First, for a long time D-Wave staunchly maintained that its research was proprietary and did not allow open peer review of its work. Second, many accomplished and reputable physicists have openly stated, both in the press and to me personally, that in their opinion the D-Wave is not really a quantum computer.
On Dec 8th, Google issued a big press release in collaboration with D-Wave. The bold claim is that the D-Wave ‘quantum annealer’ beats the equivalent classical algorithm by an astonishing factor of 10^8. The team was kind enough to release a paper on arXiv explaining the theory and experiments behind the claim. The paper has yet to be peer reviewed, so I decided to check it out myself.
The title of the paper is “The computational value of finite range tunneling”.
Quantum tunneling is a phenomenon that allows a system to cross an energy barrier without climbing over it, instead tunneling through it. The most common analogy is throwing a ball at a wall and having it come out on the other side. This phenomenon arises from the fact that, according to quantum physics, objects are not rigid entities but clouds of probability density. What we physically recognize as the ball is simply the region with the highest probability of finding the ball. However, there is a non-zero probability of finding the ball outside this region, say on the other side of the wall.
For large macroscopic objects like balls and walls, this probability is vanishingly small, and we would never see it happen even if we tried for a billion billion years. However, at the microscopic scale these quantum effects become quite substantial, and quantum tunneling has been observed for particles like electrons and protons.
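To get a feel for the numbers, here is my own back-of-the-envelope sketch (not from the paper) using the textbook rectangular-barrier approximation T ≈ e^(−2κL); the particle masses, energies, and barrier dimensions below are arbitrary illustrative choices:

```python
import math

HBAR = 1.054571817e-34  # reduced Planck constant (J*s)

def tunneling_probability(mass, barrier_height, energy, width):
    """Rough transmission through a rectangular barrier, T ~ exp(-2*kappa*L),
    valid when the barrier is much taller than the particle's energy."""
    kappa = math.sqrt(2 * mass * (barrier_height - energy)) / HBAR
    return math.exp(-2 * kappa * width)

eV = 1.602176634e-19  # joules per electronvolt

# An electron (9.11e-31 kg) with 0.5 eV of energy hitting a 1 eV, 1 nm barrier:
p_electron = tunneling_probability(9.11e-31, 1.0 * eV, 0.5 * eV, 1e-9)

# A 0.1 kg ball with 1 J of energy "hitting" a 2 J barrier 0.1 m wide:
p_ball = tunneling_probability(0.1, 2.0, 1.0, 0.1)

print(p_electron)  # small but non-negligible, around 1e-3
print(p_ball)      # underflows to 0.0: never in a billion billion years
```

The electron tunnels through roughly once per thousand attempts, while the exponent for the ball is of order 10^32, far beyond anything observable.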
The paper, as the title states, aims to establish just how useful this quantum effect can be for computing.
And according to the paper, it is very useful when applied to annealing-style optimization problems. Which raises the question:
What is simulated annealing?
Some computational problems can be posed as optimization problems, i.e., the parameters of the question are embedded into a mathematical function in such a way that the answer to the question is the minimum value of the function.
However, finding the minimum of an arbitrary function is not always easy, especially if the function has multiple local minima. In such cases, sometimes it is preferable to find a good approximate answer rather than the exact answer which might take a long time. This is where the concepts of annealing come into play.
The system is assigned a ‘temperature’, which governs the probability of the system jumping to another state. The system is then allowed to ‘cool’, i.e., the probability of these jumps is slowly decreased over time. As a result, the system spends more time in states with low ‘energy’, where the energy is given by the optimization function. Therefore, as the system ‘cools’, it is more likely to settle into a state close to the global minimum than anywhere else.
The origin of this process comes from metallurgy where worked metals are heated and allowed to recrystallize slowly into a low energy, low defect state. Simulated Annealing (SA) is simply this process being simulated on a computer.
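The process above can be sketched in a few lines of Python. This is my own minimal illustration, not the paper's implementation; the cooling schedule, step size, and test function are all arbitrary choices:

```python
import math
import random

def simulated_annealing(f, x0, step=0.1, t_start=1.0, t_end=1e-3, n_steps=20000):
    """Minimize f over the reals with a geometric cooling schedule.
    Uphill moves are accepted with probability exp(-delta/T) (Metropolis rule)."""
    x, fx = x0, f(x0)
    best_x, best_f = x, fx
    cooling = (t_end / t_start) ** (1.0 / n_steps)
    t = t_start
    for _ in range(n_steps):
        cand = x + random.uniform(-step, step)
        fc = f(cand)
        delta = fc - fx
        if delta <= 0 or random.random() < math.exp(-delta / t):
            x, fx = cand, fc
            if fx < best_f:
                best_x, best_f = x, fx
        t *= cooling  # lower the temperature: uphill jumps become rarer
    return best_x, best_f

# A rugged function with many local minima; the global minimum is at x = 0.
random.seed(1)
rugged = lambda x: x * x + 2.0 * math.sin(5.0 * x) ** 2
x_min, f_min = simulated_annealing(rugged, x0=4.0)
```

Early on, the high temperature lets the walker hop out of the shallow sinusoidal wells; as it cools, it settles near the global minimum.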
The probability that simulated annealing jumps out of a local minimum decreases with temperature, but it also decreases exponentially with the height of the wall surrounding the well. Moreover, narrow wells are less likely to be landed in than wide ones. Therefore SA performs very well when the optimization function has many shallow, wide wells but very poorly when it has deep, narrow wells.
But guess what works really well if you want to get through tall, thin walls? Quantum tunneling. Using a quantum system for annealing, or Quantum Annealing (QA), eases the computational cost of SA by adding another route by which our (quantum) ‘ball’ can get from a shallow well to a deeper one. The exponential nature of the cost means that even a small tunneling probability can translate into a huge gain in computational power.
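A toy model makes the trade-off concrete. The functional forms below are my own caricatures of the standard Boltzmann and WKB exponents, not anything from the paper; thermal hopping is punished by barrier *height*, tunneling by barrier *width*:

```python
import math

def thermal_escape(height, temperature):
    """Boltzmann-style hop probability: depends only on barrier height."""
    return math.exp(-height / temperature)

def tunneling_escape(height, width, c=1.0):
    """WKB-style tunneling probability: decays with width * sqrt(height)."""
    return math.exp(-c * width * math.sqrt(height))

T = 1.0

# Tall but thin barrier: tunneling wins by many orders of magnitude.
p_hop_tall = thermal_escape(height=50.0, temperature=T)   # exp(-50), ~2e-22
p_tun_tall = tunneling_escape(height=50.0, width=1.0)     # exp(-7.1), ~8e-4

# Low but wide barrier: thermal hopping wins instead.
p_hop_wide = thermal_escape(height=2.0, temperature=T)    # exp(-2), ~0.14
p_tun_wide = tunneling_escape(height=2.0, width=20.0)     # exp(-28.3), ~5e-13
```

This is exactly the picture behind the paper's results: QA shines when the barriers are tall and thin, and loses its edge when the minima are far apart.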
That covers the introductory sections of the paper. The bulk of the paper describes the specifics of the function, and the quantum setup they used. The main points can be summarized thus –
- The quantum system used in the D-Wave is a grid of qubits. A qubit (quantum bit) is a quantum two-level system, which can be in state 0, state 1, or a superposition of both.
- Different algorithms were tested on the same problem and compared on the number of steps required to reach the answer with 99% confidence. The test candidates were QA, SA, an algorithm called Quantum Monte Carlo (QMC) and other classical algorithms.
- The problem was constructed specifically to cause the worst case performance of SA. Therefore these results do not represent the performance of QA vs SA on an average problem.
- Under these conditions, QA was found to require as many as 10^8 times fewer steps to arrive at the solution than SA or QMC.
- SA is very bad at problems with many deep, narrow wells separated by tall walls, because the jumping probability decreases exponentially with the height of the barriers. Here QA can solve the problem much, much faster.
- QA is bad at problems where the wells are far apart, because the tunneling probability decreases exponentially with separation between the wells. Here QA offers no additional benefit over SA.
- The tests were repeated for 180, 296, 489, 681 and 945 variables to see how the solving time scaled with the number of variables.
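As an aside on methodology, the "steps to reach the answer with 99% confidence" comparison is typically computed with a time-to-solution metric: repeat a solver with per-run success probability p until the chance of having seen the answer at least once crosses the confidence threshold. A sketch of that idea under my own assumptions (the paper's exact definition may differ):

```python
import math

def time_to_solution(t_run, p_success, confidence=0.99):
    """Total time to hit the solution at least once with the given
    confidence, repeating runs until 1 - (1 - p)^k >= confidence."""
    repeats = math.log(1.0 - confidence) / math.log(1.0 - p_success)
    return t_run * math.ceil(repeats)

# A fast annealer that succeeds only 1% of the time per run can still beat
# a much slower solver that succeeds half the time:
tts_annealer = time_to_solution(t_run=0.001, p_success=0.01)  # 459 runs -> 0.459
tts_classical = time_to_solution(t_run=1.0, p_success=0.5)    # 7 runs -> 7.0
```

This is why raw per-run success probability alone is a misleading benchmark: run time and repetition count have to be folded together before two solvers can be compared fairly.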
So back to the original question. Is the D-Wave a quantum computer?
Well, the D-Wave does indeed display some distinctly quantum phenomena, such as entanglement and tunneling. However, the paper finds that the D-Wave has the same scaling as QMC. I will not elaborate on QMC here except to say that it can be run efficiently on a classical computer. The D-Wave is reported to be much faster than QMC, but by a large constant factor, not an exponentially growing one. Therefore the D-Wave does not offer a quantum speedup over classical computers and, strictly speaking, cannot be called a quantum computer.
However whether or not the D-Wave is a true quantum computer is more of an academic question (and a question about the PR strategy of D-Wave).
Optimization problems are a very important class of problems, especially in the context of machine learning and artificial intelligence. Which is probably why Google has pumped so much time and money into testing the D-Wave. At the very least, this investment will yield new insights into the nature of algorithms and quantum computing; if we are lucky, it might yield a generation of quantum annealers several orders of magnitude faster than today’s computers, despite not being ‘true’ quantum computers.
Practically speaking, if the D-Wave really does have a large advantage over classical computers on certain problems, then all one cares about is whether those problems are worth solving. The paper suggests a couple of algorithmic problems that might utilize the power of this computer, a list the authors hope to expand in the future. They also expect the next generation of quantum annealers to be even more powerful. Whether the D-Wave can become more than a controversial curiosity will depend on their ability to make good on that promise.