Quantum computing manual

The quantum hardware leaps that still need to happen

"There are no shortcuts."

Inside Rigetti Computing's fabrication lab

There's still plenty of work to be done inside the labs of quantum computing companies.

Photo: Rigetti Computing

Quantum computing once made exciting promises that few thought could materialize. Now, having proved the doubters wrong, it must learn to manage expectations.

"The 100-qubit quantum computer will not change the world right away," says John Preskill, a quantum information theorist at the California Institute of Technology in Pasadena. "We may feel confident that quantum technology will have a substantial impact on society in the decades ahead, but we cannot be nearly so confident about the commercial potential of quantum technology in the near term, say the next five to 10 years."

In part that's because of the limitations of the hardware, especially the quantum bits that encode information. Right now, superconducting qubits are the technology of choice for quantum computers. "There is vanishingly small chance that a fundamentally new qubit technology will enter the race and become competitive across all the fronts where quantum computing technology needs to work," says Chad Rigetti, CEO of Rigetti Computing in Berkeley, which is developing quantum computers for the commercial market — using superconducting qubits, naturally.

But that's not to say that some of the other candidate systems currently being considered for making qubits won't find a place. "There is a long road ahead, and it is too early to declare winners," says computer scientist Umesh Vazirani of the University of California at Berkeley. "It is also quite possible that it may not end up being an either/or situation," he adds — other technologies, such as trapped-ion computers, "have different strengths."

Hardware-aware

IBM's Jerry Chow, manager of the Experimental Quantum Computing Team at the company's Thomas J. Watson Research Center in Yorktown Heights, New York, agrees with Vazirani. Despite IBM's present commitment to superconducting qubits, the company is keeping an eye on what trapped ions can do, and Chow says that they may be better suited to certain problems.

In other words, he explains, the hardware and software for quantum computers are more intimately connected than they are for classical computers, and users might benefit from knowing something of the underlying physics so they can choose the most appropriate hardware for their problem and tailor the algorithms accordingly. The kinds of qubits best suited to simulating the behavior of a new material, for example, might be different than the best ones for optimizing a financial portfolio.

Applications of quantum computers might "need to be hardware-aware," Chow says. "If you know how your hardware performs, you can use software tricks to make optimal use of that."

Devices like those so far developed by Google, IBM and Rigetti are largely proof-of-principle machines that have paved the way for years of hard slog to drive performance up. "At this stage, this is a game of iterative engineering improvement, not conceptual breakthroughs," says Rigetti. "There are many ways of tweaking and making incremental improvements to superconducting qubits," he adds, and the immediate future "will be an accumulation of factors-of-two improvement, through more pure materials, improvements in fabrication and lithography, better silicon wafers and so on." None of these aspects, he says, is yet really optimized or fine-tuned.

It will be slow work, and "there are no shortcuts," Chow says. But that, after all, is how the semiconductor industry has always worked — so why should it be any different for the quantum world?

Error correction

One particularly important goal is to make qubits less error prone. Throughout the era he has dubbed "noisy intermediate-scale quantum computing," Preskill says that "it will remain important to strive for lower error rates." That way, it will be possible to construct ever larger circuits without having to worry that errors will kill the calculation.

But Rigetti says that "if you really want to unlock the true, long-term, almost incalculable potential of quantum computing, you need to get to a point where you have full fault tolerance." That's to say, where errors are simply not a worry: The qubits just work, and their error rates are comparable to those in conventional computing, which means typically one mistake in every 10^18 to 10^20 logic operations (1 quintillion to 100 quintillion). "That's 20 to 30 years out," he warns.

Still, there's plenty that can be done before qubits get this good. Today's imperfect qubits can be used in bunches to make a single "ideal" — what quantum physicists call "logical" — qubit that performs a single operation in a quantum computation without error or instability. But they are discouragingly large bunches; initial estimates of how many physical qubits would be needed for a single logical qubit put the number in the tens of thousands. However, better calculations and error-correcting codes are bringing that overhead down toward something more feasible. And Chow says that there are halfway houses where manageably large groups of physical qubits could achieve lower error rates without being fully fault-tolerant.
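The overhead Chow describes can be made concrete with a rough back-of-envelope estimate. The sketch below is not from the article: it assumes a surface-code-style error-correcting code with illustrative constants (an assumed threshold of about 1 percent and roughly 2d^2 physical qubits at code distance d), just to show how the ratio of physical to logical qubits scales.

```python
# A rough, hypothetical sketch of the physical-to-logical qubit overhead.
# Assumptions (not figures from the article): a surface-code-like scaling
# p_logical ~ 0.1 * (p / p_threshold)^((d + 1) / 2), a threshold of 1e-2,
# and about 2 * d^2 physical qubits per logical qubit at code distance d.

def physical_qubits_per_logical(p_physical, p_target, p_threshold=1e-2):
    """Return (code distance, physical qubits) needed to push the logical
    error rate below p_target, under the assumed scaling above."""
    d = 3
    while True:
        p_logical = 0.1 * (p_physical / p_threshold) ** ((d + 1) / 2)
        if p_logical < p_target:
            return d, 2 * d * d
        d += 2  # surface-code distances are odd

# Illustrative numbers: a physical error rate of 1e-3, aiming for
# a logical error rate of 1e-15 per operation.
distance, n_physical = physical_qubits_per_logical(1e-3, 1e-15)
print(distance, n_physical)  # d = 29, 1682 physical qubits per logical qubit
```

Under these assumed numbers the overhead comes out in the low thousands rather than the tens of thousands of early estimates, which is the direction of travel the better codes and calculations are pushing.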

"To me, the next five years in quantum computing is going to be marked by this pursuit of fault-tolerance and the development of error-correcting codes," Rigetti says.

Chasing coherence

It will also be important to be able to keep computations alive for longer — that is, to sustain the entanglement of qubits against decoherence. Chow points out that "deeper" computations — with more steps in the algorithm — become possible as the coherence times of the entangled qubits get extended. And there's plenty of work to be done on what he calls the "middleware": the schemes used to control the qubit states and their interference, as well as to deal with compounding factors such as circuit noise.
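To see why coherence times set the depth of a computation, here is a minimal rule-of-thumb sketch. It is not a relationship stated in the article, and the example numbers are only typical orders of magnitude for superconducting hardware, not measured figures.

```python
# A crude rule of thumb (an assumption for illustration): the number of
# sequential gate steps that fit into a computation is bounded by roughly
# coherence time / gate duration, before circuit noise is even considered.

def max_useful_depth(coherence_time_s, gate_time_s):
    """Rough upper bound on sequential gate steps before decoherence dominates."""
    return int(coherence_time_s / gate_time_s)

# Illustrative values only: a 100-microsecond coherence time, 50-nanosecond gates.
print(max_useful_depth(100e-6, 50e-9))  # about 2,000 gate steps
```

Doubling coherence times, on this crude accounting, roughly doubles the depth of algorithm that can run before errors swamp the answer.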

Rigetti is optimistic that all this will yield gradual but significant advances before too long. "The system we have right now can get us out to [devices with] many hundreds, even many thousands of qubits in the foreseeable future," he says. "We think we know how to do that without a lot of breakthrough innovation."

His company and other researchers, especially those developing trapped-ion systems, are placing their bets on modular processor architectures rather like those used in today's computer industry, where relatively small quantum processors (the current generation at Rigetti has around 32 qubits per chip) are assembled into larger circuits. They hope to have modular circuits containing more than 100 qubits by early 2021.

"We think this modular architecture is going to create a clear path for scaling into the thousands of qubits," he says. "That alone gets us pretty far towards building, within five to 10 years, a 'supremacy number' [the number needed to demonstrate quantum supremacy] of logical qubits using the technology we have already".

But "none of this is easy," admits Rigetti. Quantum computing never has been.
