Google's recent announcement of a joint venture with the National Aeronautics and Space Administration (NASA) to build a new Quantum Artificial Intelligence Lab has focused public attention on D-Wave, the small Canadian company that claims to have built the world's first commercial quantum computer.
Founded in 1999 and backed by Amazon's Jeff Bezos as well as the investment arm of the CIA, D-Wave made news in 2011 when it sold an earlier version of its computer to Lockheed Martin at a rumored cost of about $10 million. Researchers at Harvard University and the University of Southern California have also conducted experiments using D-Wave computers.
These high-profile engagements have attracted plenty of attention for the Vancouver-area firm, but D-Wave's aggressive marketing tactics have also rankled many experts in the quantum computing community, who have questioned the company's unorthodox approach and its ability to deliver the kind of dramatic performance gains they expect from a quantum computer.
Ever since Richard Feynman first popularized the idea of quantum computers in 1982, debates have swirled about the likelihood of bringing a working quantum computer to market. For years, the topic remained primarily a theoretical pursuit. That has started to change in recent years, however, as companies like IBM, Microsoft and Hewlett-Packard have started to invest heavily in developing real-world quantum computers.
In theory, quantum computers hold enormous promise. Unlike traditional (or classical) computers that rely on bits to express concrete values of one or zero, quantum computers leverage the weird rules of quantum physics to employ so-called quantum bits (qubits) that can represent both one and zero at the same time. In principle, this property allows a quantum computer to explore many combinations of values at once, and thus solve certain complex computational problems far faster than any classical machine.
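The superposition idea can be sketched in a few lines of Python. In this illustrative toy model (not anything D-Wave ships), a single qubit's state is a pair of amplitudes, and a Hadamard gate, a standard quantum operation, turns a definite zero into an equal mix of zero and one:

```python
import math

# State of one simulated qubit: [amplitude of |0>, amplitude of |1>].
# Amplitudes are kept real here for simplicity.
ket0 = [1.0, 0.0]

def hadamard(state):
    """Apply a Hadamard gate, placing a basis state into equal superposition."""
    a, b = state
    s = 1 / math.sqrt(2)
    return [s * (a + b), s * (a - b)]

superposed = hadamard(ket0)

# Measurement probabilities are the squared amplitudes;
# after the Hadamard, outcomes 0 and 1 are equally likely.
probs = [amp ** 2 for amp in superposed]
```

The point of the sketch is only that a qubit's state holds weight on both values at once; a classical bit would have to be one or the other.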
That is the theory, anyway. However, those performance gains come with an important catch. Qubits are highly susceptible to "noise," or interference from the outside environment. To combat that problem, D-Wave has taken an unusual approach known as quantum annealing, based on the adiabatic algorithm proposed by the Massachusetts Institute of Technology's (MIT) Edward Farhi in 2000.
By using qubits made from niobium loops cooled to near-absolute zero, D-Wave's computer limits interference by keeping the qubits near their lowest possible energy state. Programmers then express problems in the form of figurative hills and valleys, with the valleys representing the optimum solution to a given problem. The supercooled qubits behave like water flowing into the valleys, finding the lowest points that represent the best available solutions.
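The hills-and-valleys picture has a well-known classical cousin, simulated annealing, which conveys the same intuition. The sketch below is purely an analogy (it has nothing to do with D-Wave's hardware or software): it wanders across a toy one-dimensional energy landscape, accepting occasional uphill moves while "hot" so it can escape shallow valleys, and settling into the deepest valley as it cools.

```python
import math
import random

def energy(x):
    # A toy double-well landscape: a shallow valley near x ~ 1.35
    # and a deeper (global) valley near x ~ -1.47.
    return x ** 4 - 4 * x ** 2 + x

def simulated_annealing(steps=20000, seed=0):
    rng = random.Random(seed)
    x = 0.0  # start on the ridge between the two valleys
    best_x, best_e = x, energy(x)
    for step in range(steps):
        t = max(1e-3, 1.0 - step / steps)  # temperature cools toward zero
        candidate = x + rng.gauss(0, 0.5)  # propose a random nearby move
        delta = energy(candidate) - energy(x)
        # Always accept downhill moves; accept uphill moves with a
        # probability that shrinks as the temperature drops.
        if delta < 0 or rng.random() < math.exp(-delta / t):
            x = candidate
        if energy(x) < best_e:
            best_x, best_e = x, energy(x)
    return best_x, best_e

best_x, best_e = simulated_annealing()
```

Quantum annealing replaces the thermal "jumps" with quantum tunneling through the hills, but the goal is the same: come to rest in the lowest valley, which encodes the best available solution.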
Quantum annealing is well suited to optimization problems like voice recognition, encryption, and planning space flights. But this approach does not lend itself to factoring large numbers, and is therefore not likely to produce the "universal" quantum computer that many researchers have long imagined: one capable of tackling any problem. Instead, D-Wave has built a special-purpose machine that is particularly well suited to a certain kind of calculation.
That limitation notwithstanding, quantum annealing does seem to offer a theoretically powerful solution to a wide range of applied computing problems. However, critics have questioned whether D-Wave's approach will ever yield the dramatic processing speeds that the company has promised.
"Google and NASA have bought this machine and they claim that they're going to use it to solve classical problems faster than they could with any other computer. That's been uncritically reported around the world, but the evidence does not support that at all," says Scott Aaronson of MIT, self-appointed "chief skeptic" of D-Wave.
Aaronson points to forthcoming results from an ETH Zurich (Swiss Federal Institute of Technology in Zurich) team led by Matthias Troyer, in which ordinary classical computers managed to outperform a 128-qubit D-Wave computer by a factor of 15. Alex Selby has also published the results of another classical solver capable of outperforming the D-Wave computer.
However, a recent study by Catherine C. McGeoch of Amherst College and Cong Wang of Simon Fraser University found that the D-Wave Two handily outperformed classical desktop computers in solving an optimization problem, performing 3,600 times faster than high-end desktop machines.
Although D-Wave's speed gains remain a matter of debate, even the company's most vocal critics now concede that there are indeed quantum effects taking place within the machine.
"It is clear that there is some kind of quantum behavior in the device," says Aaronson, "but that's not very surprising. At a small enough level, everything behaves quantumly. The thing you really want to know is, are the quantum effects relevant? Are they playing a causal role in the computation? These things are much iffier than many people might like them to be."
Quantum computing pioneer Umesh Vazirani of the University of California, Berkeley shares many of the same concerns about D-Wave's marketing claims, but he is also willing to give the company the benefit of the doubt in the name of scientific inquiry. "Here's a path that has a small chance of success, but it's worth pursuing," he says. "It's an interesting thing, even if there's a .001 chance that it will work. It's worth someone's while to explore it. It's an admirable thing that they are doing."
Still, Vazirani wonders about the balance of pure scientific motives vs. marketing objectives underlying D-Wave's venture with Google and NASA. "Did they get taken in by hype, or did they really know what they were getting into?"
Aaronson, for all his skepticism, stops short of dismissing the technology altogether. "I don't have a crystal ball," he says, "and many people want me to make a prognostication that this can never work. And I don't know that for sure. But what they've built so far can be easily outperformed by a laptop."
So, can D-Wave really claim to have built the world's first commercial quantum computer? In the end, it may come down to a matter of semantics.
"It’s a debate about words," says Aaronson. "At some point you have to stop haggling over definitions and ask, is it useful for something?"
Alex Wright is a writer and information architect based in Brooklyn, NY.