Controversial quantum device maker D-Wave is hoping to find a home for its cutting-edge technology in the high-performance computing (HPC) market.

Colin Williams, the business development director for D-Wave, made a direct appeal to system administrators at the SC13 supercomputing conference, held last week in Denver.

“One of the reasons we’re coming here to talk with you is that we are actively looking for partners in the high-performance computing space,” Williams said. “We see quantum computing not as competition with HPC, but as having the potential for a lot of synergy with HPC people.”

Administrators should think of “quantum computing being a new tool in your arsenal,” Williams said.

Whereas HPC machines are suited for tasks such as computational fluid dynamics or large-scale analytics, quantum computers are best suited for other types of jobs, such as discrete combinatorial operations, Monte Carlo sampling and machine learning, he said.

Every new technology needs a proving ground, a place to test how well it works in day-to-day usage. D-Wave sees its quantum processor as a kind of gigantic co-processor for large HPC systems, one dedicated to certain tasks that would take conventional computers prohibitively long to execute. Williams called this approach “quantum-accelerated HPC.”

Although theorized about for decades, quantum computing for the most part has not been commercialized. D-Wave may be one of the few companies offering a product based on quantum mechanics, that is, on the laws governing how matter behaves at the microscopic level.

D-Wave does not yet offer a general-purpose quantum computer, but rather a system that performs what Williams called “quantum annealing.” It is designed to tackle a class of problems that are difficult for classical computers, known as NP-hard (non-deterministic polynomial-time hard) problems. NP-hard problems are informally known as optimization problems, in that they seek the best overall solution from among a large number of variables and, therefore, possible solutions.
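To make the idea concrete, the sketch below brute-forces a tiny, made-up optimization problem of the kind annealers typically target (a quadratic cost over bit strings, often called a QUBO); the cost matrix `Q` here is invented purely for illustration.

```python
from itertools import product

# Toy optimization problem: minimize a quadratic cost over bit strings.
# Q is a hypothetical cost matrix; real problems are far larger.
Q = {
    (0, 0): -1, (1, 1): -1, (2, 2): -1,   # linear terms (diagonal)
    (0, 1):  2, (1, 2):  2,               # pairwise couplings
}

def cost(bits):
    """Energy of one candidate solution under the quadratic cost."""
    return sum(w * bits[i] * bits[j] for (i, j), w in Q.items())

# Brute force over every bit string: the search space doubles with each
# added variable, which is why classical machines struggle as problems grow.
best = min(product([0, 1], repeat=3), key=cost)
print(best, cost(best))   # → (1, 0, 1) -2
```

Exhaustive search works for three variables, but at hundreds of variables the space is astronomically large; that exponential blow-up is what motivates annealing approaches.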

Launched in 1999 in British Columbia, D-Wave released its first commercial 128-qubit computing device in 2011, and followed up with its current product, the 512-qubit D-Wave 2. The D-Wave 2 was purchased by both Google, which tested the system for image recognition, and NASA. Williams did not say how much the D-Wave 2 costs, but did say it could be leased rather than purchased.

A qubit, or quantum bit, is the quantum equivalent of a bit in classical computing. Unlike a traditional bit, which is either a 1 or a 0, a qubit can hold both a 1 and a 0 simultaneously, a state called superposition. As a result, 512 qubits can hold 2 to the 512th power computational threads.

Unlike classical computers, however, these threads can’t be read individually—they must be examined collectively as a single unit. As a result, quantum computers are suited “to very different problems,” Williams said.
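As a quick sanity check on that arithmetic, 2 to the 512th power can be computed directly with Python’s arbitrary-precision integers:

```python
# 512 qubits in superposition span 2**512 basis states.
states = 2 ** 512
print(len(str(states)))   # → 155: a 155-digit count of simultaneous states
```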

The company ran two pilot projects to show how quantum annealing could help with supercomputing tasks. Both were optimization problems.

In one project with an engineering company, a D-Wave system was able to design a water distribution network, one consisting of a series of water pipes to be installed in a small town.

The task involved finding the optimum balance between cost—the larger the water pipes, the more expensive they would be to install—and supplying the ideal water pressure to each end user. Williams said the D-Wave approach found a better plan than the one designed by the engineering firm itself.

Another pilot involved radiotherapy optimization for a hospital. This task involved finding the best way to position radiation beams so that they killed the maximum number of cancerous cells in the patient while minimizing damage to healthy cells.

For a co-processor, the D-Wave 2 is big. Most of the components are housed in a box the size of a small shed, which is accompanied by a rack of classical computers to feed information into the box.

The box itself, which is wrapped in radio-frequency shielding, contains a dilution refrigerator, a cryostat that keeps the quantum processor at about 20 millikelvin above absolute zero, which is about 1,500 times colder than interstellar space.

“It is one of the quietest, coldest places anywhere in the universe,” Williams said.

The processor itself is made up of loops of niobium, a soft metal, connected into arrays by Josephson junctions, a type of coupler. Each loop holds a qubit, and the state of each qubit is read by a magnetic field.

To solve a problem, the user translates it into an NP-hard problem that the machine can solve natively. This involves recasting the work as a function that can produce a range of values, where the minimal value corresponds to the best solution.

When the problem is run, the system applies a magnetic field across the qubits, putting them into a superposition and thereby creating all the possible answers to the problem. The magnetic field is then removed, and the qubits settle into their lowest possible energies, which filters out the sub-optimal values.

“Each different solution corresponds to a different energy. You try to find the solution, or bit stream, that has the lowest energy in this landscape,” Williams said. This approach is different from the classical approach in that traditional computers “explore the surface” of the problem, whereas a quantum computer “tunnels through the hills of this landscape” to find the best answer for a problem, he said. This process is then repeated 10,000 times, and the series of results is sent back to a classical computer as a bit stream.
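The “energy landscape” picture Williams invokes maps closely onto classical simulated annealing, a rough classical cousin of the process he described. The sketch below is an illustration only, with a toy energy function, and is not D-Wave’s algorithm: it explores the landscape by flipping bits, accepting uphill moves with a probability that falls as the “temperature” drops.

```python
import math
import random

def energy(bits):
    """Toy energy landscape: lower energy means a better solution."""
    return sum(-b for b in bits) + 2 * bits[0] * bits[1]

def anneal(n=4, steps=10000, temp=2.0, cooling=0.999):
    """Classical simulated annealing over n-bit solutions."""
    state = [random.randint(0, 1) for _ in range(n)]
    for _ in range(steps):
        candidate = state[:]
        candidate[random.randrange(n)] ^= 1    # flip one bit
        delta = energy(candidate) - energy(state)
        # Always accept downhill moves; accept uphill moves with a
        # probability that shrinks as the temperature cools.
        if delta <= 0 or random.random() < math.exp(-delta / temp):
            state = candidate
        temp *= cooling
    return state, energy(state)

state, e = anneal()
print(state, e)
```

Where a classical walker must climb over the hills of this landscape (hence the occasional uphill move), Williams’ claim is that a quantum annealer can tunnel through them.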

D-Wave’s quantum machine could work with a traditional supercomputer in an iterative manner, Williams suggested. The supercomputer could create a set of simulations using the output from the quantum system, judging which solution was most effective. It could then return this information to the quantum computer, which in turn could adjust its own formula and run another round of tests.
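That feedback loop might be sketched as follows; both functions here are invented placeholders, not D-Wave’s actual API, with a stand-in “annealer” proposing candidate bit strings and a stand-in “simulator” scoring them and feeding the winner back.

```python
# Hypothetical sketch of the iterative hybrid workflow Williams described.
# Both functions are toy placeholders, not D-Wave's actual API.
def sample_annealer(params):
    """Stand-in for a quantum-annealer call: propose neighboring bit strings."""
    out = []
    for i in range(len(params)):
        cand = params[:]
        cand[i] ^= 1                      # flip one bit to form a candidate
        out.append(cand)
    return out

def simulate_and_score(candidates):
    """Stand-in for the supercomputer's simulation step."""
    return max(candidates, key=sum)       # pretend more 1s == a better design

params = [0, 0, 0, 0]
for _ in range(3):                        # each round refines the annealer's input
    candidates = sample_annealer(params)
    best = simulate_and_score(candidates)
    params = best
print(best)   # → [1, 1, 1, 0]
```

Each round, the “simulator” picks the strongest candidate and hands it back as the starting point for the next sampling pass, mirroring the adjust-and-rerun cycle Williams suggested.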

In his talk, Williams admitted that D-Wave’s technology has generated a lot of skepticism, given that most other efforts to produce quantum computers have resulted in machines capable of wrangling only a handful of qubits.

“We have a different design philosophy from people doing quantum computing in academia,” Williams said.

Williams described D-Wave’s approach as a top-down one, in which the company designs a quantum processor of a certain size, such as 512 qubits, and rapidly revises the design to address shortcomings. This differs from the approach of first trying to build a perfect single-qubit processor and advancing the work from there.

“The academic community is focused on making the ideal quantum computer. We decided not to do that consciously,” he said.

Williams boasted that D-Wave is beating Moore’s Law. Next year the company plans to introduce a 1,024-qubit processor, and in 2015 a 2,048-qubit processor. The more qubits the system has, the more complex the problems it can tackle, Williams said.