D-Wave wants its quantum computer to surpass the performance of traditional computers in the coming years, and has a processor roadmap that could make that happen.
“We’re at a point where we see that our current product is matching the performance of state-of-the-art traditional computers, which have had 70 years of innovation and trillions of dollars of investment. Over the next few years, we should surpass them,” said Jeremy Hilton, D-Wave’s vice president of processor development, in an email statement.
Quantum computing: the future?
D-Wave is perhaps the only company that sells quantum computers, which take a radically different approach to computing compared to today’s servers. Quantum computing has been researched for decades with the goal of building a stable system, and D-Wave introduced what it billed as the first commercially available quantum computer in 2011. Researchers have predicted that quantum computers will one day replace today’s fastest computers for certain classes of problems.

D-Wave’s D-Wave Two system.
D-Wave last year introduced its second-generation quantum computer, the D-Wave Two, which has a “list price north of $10 million,” according to a research note issued Wednesday by financial firm Sterne Agee. The note included a picture of a D-Wave Two with a list price of $15 million.
D-Wave has been secretive about pricing, which a company spokesperson said depends on the customer and its needs. But last November the company said the quantum computer could be available for “lease.”
Quantum computing is based on the laws of quantum mechanics, which describe the interaction and behavior of matter at atomic and subatomic scales. At the heart of a quantum computer are quantum bits (qubits), which interact with one another; the goal is to speed up computing by storing and processing data in more states than the usual 0s and 1s of digital computing. But many issues still have to be resolved, one of which is quantum noise, which knocks qubits into undesirable states and causes users to lose control of the programs being executed.
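To make the idea of “more states than 0s and 1s” concrete, here is a minimal sketch in Python: a single simulated qubit holds two complex amplitudes at once, and only a measurement forces it back to a definite 0 or 1. This is a generic textbook illustration, not D-Wave’s programming model or hardware.

```python
# A single qubit simulated as two complex amplitudes -- a generic
# illustration of superposition, not D-Wave's hardware or API.
import numpy as np

zero = np.array([1, 0], dtype=complex)   # the classical state |0>

# A Hadamard gate puts the qubit into an equal superposition of |0> and |1>
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
state = H @ zero

# Measurement collapses the state; each outcome's probability is |amplitude|^2
probs = np.abs(state) ** 2
print(probs)  # [0.5 0.5]: an even chance of reading out 0 or 1
```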
Quantum computers from D-Wave have been deployed by organizations including Google, NASA and Lockheed Martin, though it is not clear whether the systems were purchased. The D-Wave Two has a 512-qubit processor, and the company has a 1,000-qubit processor undergoing lab tests that is due for release later this year. D-Wave has said it will release a 2,048-qubit processor in 2015.
“We are laser focused on the performance of the machine, understanding how the technology is working so we can continue to improve it and solve real world problems,” Hilton said.
Pioneering systems are expensive, but the $15 million price tag for the D-Wave Two is reasonable considering the price of some supercomputers, said Nathan Brookwood, principal analyst at Insight 64.

The D-Wave processor cell.
Early adopters may take an interest in D-Wave’s quantum computers, which will either offer a high reward or be a waste of money, Brookwood said. Adopting a quantum computer means putting theory to work, and buyers will have to jump through many hoops to get the most out of the system.
Who uses this? And who will program it?
Users will have to adapt to the radically different computing approach, which can be intimidating, Brookwood said. The next steps would involve finding uses and applications that fit, and optimizing code, both of which can also be challenging, he said.
“The reward would have to be great because the risk is just phenomenal,” said Brookwood, who maintained he is skeptical about quantum computing systems.
D-Wave wants to double its installed base of quantum computers annually, Sterne Agee financial analyst Alex Kurtz, who met with D-Wave’s management, said in the research note.
“The impact that D-Wave Two could have on a specific vertical or market segment is still a moving target and clearly with only three systems deployed today, adoption is still at the very front end of the technology’s life cycle,” the research note said.
D-Wave has pitched its quantum computers as large co-processors ideal for specific calculations in high-performance computing environments. The company is trying to garner interest from the financial services community, and the system could also fit into the SaaS (software as a service) computing model being deployed by companies, Kurtz said.
But in recent months, D-Wave’s quantum computer has been the subject of debate, with researchers from IBM and the University of California earlier this year raising questions about how closely D-Wave’s systems relate to quantum mechanics. The paper, titled “How Quantum is the D-Wave Machine?” also questioned the system’s performance based on specific behaviors. IBM is doing its own research on building quantum computers.
But recent research papers issued by Texas A&M and by a group of researchers from universities, Microsoft and Google backed D-Wave, saying a new array of tools is needed to benchmark quantum computers. Google researchers in January said the D-Wave Two processor solved problems much faster than some conventional problem-solving tools.
D-Wave has said its systems are based on the laws of quantum mechanics. Magnetic fields are applied in the quantum computers to perform single-qubit and multi-qubit operations, which creates a whole set of possible outcomes for solving a particular problem. D-Wave’s Hilton said IBM’s research was not fully fleshed out and took into account only specific uses.
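D-Wave’s machines are annealers: a problem is encoded as biases on individual qubits and couplings between pairs of them, and the system settles toward the lowest-energy configuration, which encodes the answer. The sketch below illustrates that problem class with a classical simulated-annealing loop over a made-up four-spin problem; it is not D-Wave’s hardware, algorithm, or API, and the couplings and cooling schedule are arbitrary toy values.

```python
# Classical simulated annealing on a tiny Ising problem: find spins
# s_i in {-1, +1} minimizing E(s) = sum J_ij*s_i*s_j + sum h_i*s_i.
# Toy values throughout -- an illustration of the problem class only.
import math
import random

J = {(0, 1): 1.0, (1, 2): -1.0, (2, 3): 1.0, (0, 3): -1.0}  # pairwise couplings
h = [0.1, -0.2, 0.0, 0.3]                                   # per-spin biases

def energy(s):
    return (sum(Jij * s[i] * s[j] for (i, j), Jij in J.items())
            + sum(hi * si for hi, si in zip(h, s)))

random.seed(0)
s = [random.choice([-1, 1]) for _ in range(4)]
temp = 2.0
while temp > 0.01:
    before = energy(s)
    i = random.randrange(len(s))
    s[i] = -s[i]                      # propose flipping one spin
    delta = energy(s) - before
    # Keep downhill moves; keep uphill moves only with Boltzmann probability
    if delta > 0 and random.random() >= math.exp(-delta / temp):
        s[i] = -s[i]                  # reject the flip
    temp *= 0.99                      # cool slowly toward a low-energy state
print(s, energy(s))
```

Where this classical loop escapes bad configurations with random thermal flips, D-Wave’s processor is designed to do so with quantum effects, and how much those effects actually contribute is precisely what the benchmarking debate is about.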
“The 512 qubit processor … was able to meet and match the state-of-the-art classical algorithms and computers even though it has been shown that these particular benchmarking problems will not benefit from a quantum speedup,” Hilton said.
D-Wave in recent months has started cooperating with more researchers, and its quantum computers have been the subject of more scientific studies. The company last month joined the National Science Foundation Center for Hybrid Multicore Productivity Research (CHMPR) at the University of Maryland, Baltimore County, where it will provide “insights” on quantum computing.
But D-Wave’s goal remains making quantum computing and its computers relevant to actual users.
“Our customers are interested in solving real world problems that classical computers are less suited for and are often more complex than what we glean from a straightforward benchmarking test,” Hilton said.