Quantum computing manual

Can quantum machine learning move beyond its own hype?

"There is a lot more work that needs to be done before claiming quantum machine learning will actually work."

Researchers at the quantum startup QC Ware

One researcher at quantum startup QC Ware tells us there's "more work that needs to be done before claiming quantum machine learning will actually work." (Photo: QC Ware)

For the last decade, quantum machine learning has sounded like little more than a perfect marriage of buzzwords. But right now, it seems to be having a moment.

Google's quantum computing team touts it as a "near-term application" on its website. Researchers churned out nearly four times as many academic papers on the topic in 2018 as in 2012. In March 2019, a quantum machine learning experiment landed IBM on the cover of the scientific journal Nature. Two months later, the academic publisher Springer launched Quantum Machine Intelligence, purportedly the first academic journal dedicated to the field.

But is the fanfare really based on significant technological progress, or is it a triumph of public relations?

The goal of researchers working in the field is, broadly, to develop quantum computing algorithms that can analyze data faster, more accurately or simply differently than conventional machine learning algorithms. Measured against that goal, experts speak quite conservatively about what they've actually accomplished, and their assessments range from cautious optimism to outright skepticism:

  • "I think we haven't done our homework yet. This is an extremely new scientific field," says physicist Maria Schuld of Canada-based quantum computing startup Xanadu.
  • "There is a lot more work that needs to be done before claiming quantum machine learning will actually work," says computer scientist Iordanis Kerenidis, the head of quantum algorithms at the Silicon Valley-based quantum computing startup QC Ware.
  • "I have not seen a single piece of evidence that there exists a meaningful [machine learning] task for which it would make sense to use a quantum computer and not a classical computer," says physicist Ryan Sweke of the Free University of Berlin in Germany.

Still, the community has been awash in activity. Some researchers credit an algorithm published in 2008, known as the HHL algorithm, with jump-starting the field. The HHL algorithm, named for its developers Aram Harrow, Avinatan Hassidim and Seth Lloyd, offered a significant theoretical speedup for a common problem in linear algebra, the field of mathematics on which classical machine learning is based. The algorithm essentially lets a quantum computer solve a system of linear equations, which geometrically amounts to figuring out where a set of multidimensional planes of given orientations intersect.

(Lloyd has recently been suspended from his academic position at MIT for taking more than $200,000 in donations from the late financier and sex offender Jeffrey Epstein. Lloyd was a technical adviser at Xanadu, but he no longer has links to the company, Xanadu told Protocol.)
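To make that linear-algebra connection concrete, here is a toy classical version of the problem HHL targets. This is only an illustrative sketch: the matrix, vector and NumPy solver below are my own assumptions, not anything drawn from the article or from the quantum algorithm itself.

```python
import numpy as np

# Toy system A.x = b: each row of A, with its entry of b, defines a plane,
# and the solution x is the point where those planes intersect.
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])   # Hermitian, as HHL assumes (or embeds)
b = np.array([1.0, 0.0])

# Classical solve: the cost grows rapidly with the dimension of A.
x = np.linalg.solve(A, b)
print(x)  # [ 0.4 -0.2]

# HHL instead prepares a quantum state whose amplitudes are proportional
# to x. For sparse, well-conditioned A, that preparation scales only
# logarithmically with the dimension, though reading out every entry of x
# would forfeit the advantage.
```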

Since 2008, more researchers have begun to develop quantum algorithms that, at least on paper, beat their classical counterparts. Kerenidis, for example, has come up with a quantum algorithm that solves the so-called recommendation problem faster than the classical versions known at the time. If these algorithms could be run at scale, they could be of huge commercial interest: Amazon, for example, uses classical recommendation algorithms to suggest products for you to buy.
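For a sense of what the recommendation problem involves, here is a minimal classical sketch of the setup that quantum algorithms like Kerenidis' aim to accelerate. The toy data, the chosen rank and the use of a truncated SVD are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy preference matrix: rows are users, columns are products, entries
# are ratings. Real matrices are vastly larger and mostly unknown.
P = rng.integers(0, 6, size=(100, 50)).astype(float)

# Model the preferences as approximately low-rank via a truncated SVD.
k = 5
U, s, Vt = np.linalg.svd(P, full_matrices=False)
P_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# Recommend the product with the highest modeled score for one user.
user = 7
print(int(np.argmax(P_k[user])))

# The quantum algorithm's trick is to sample good products from this
# low-rank approximation without ever computing P_k explicitly.
```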

For now, however, all of the reported gains exist only on paper. Existing hardware is not sophisticated enough to execute complicated algorithms like Kerenidis' in any meaningful way. In the meantime, researchers like Schuld have begun to design algorithms to run on noisy quantum computers that already exist, rather than for a dream machine of the future.

One promising class of algorithms, Schuld says, is known as "parameterized quantum circuits," or "quantum neural networks." These are hybrid algorithms that make use of both a quantum and a classical computer. Together, the two machines try to construct a mathematical model that describes features in a data set — say, a collection of cat photos. In this example, the quantum part of the algorithm would try to guess how to model a cat's face. That guess is then fed to a classical computer, which tries to improve the model. The two machines go back and forth, refining the mathematical model iteratively. Researchers think the quantum computer could eventually excel at identifying certain statistical patterns better than a classical computer can. (Probably not actually cat faces, though, sorry.)
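Here is a minimal sketch of that back-and-forth loop, written with Xanadu's open-source PennyLane library. The library choice, the two-qubit circuit and the toy objective are all my own assumptions for illustration; the article does not prescribe any particular tool or model.

```python
import pennylane as qml
from pennylane import numpy as np  # autograd-aware NumPy

# Simulated two-qubit device; a physical QPU could be plugged in here.
dev = qml.device("default.qubit", wires=2)

@qml.qnode(dev)
def circuit(params):
    # The quantum half: a small parameterized circuit makes the "guess."
    qml.RX(params[0], wires=0)
    qml.RY(params[1], wires=1)
    qml.CNOT(wires=[0, 1])
    return qml.expval(qml.PauliZ(0))

def cost(params):
    # Toy objective: push the measured expectation value toward -1.
    return circuit(params)

# The classical half: gradient descent nudges the circuit parameters.
opt = qml.GradientDescentOptimizer(stepsize=0.4)
params = np.array([0.1, 0.2], requires_grad=True)
for _ in range(100):
    params = opt.step(cost, params)  # quantum evaluation, classical update

print(cost(params))  # close to -1.0 once the loop has converged
```

The division of labor is the point: the fragile quantum processor only ever runs a short circuit, while all the bookkeeping of optimization stays on ordinary classical hardware.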

Researchers design quantum neural networks specifically to accommodate the quirks of a particular machine. This differs from conventional algorithms, which are designed to work across a range of applications and devices. Quantum neural network researchers were inspired to take this back-and-forth approach when they "realized the people in the lab couldn't implement any of their theoretical ideas," Schuld says.

But researchers are still working out how to run quantum neural networks at all, so it's not clear when, or whether, they will be useful. Their ad hoc design also makes it difficult to compare their performance against other algorithms, Kerenidis says.

And if the hardware never catches up, it's still possible that quantum machine learning will never deliver better algorithms in practice. At Kerenidis' startup, the team is working to reduce the hardware resources needed to run these algorithms.

But the work could offer indirect benefits. Sweke thinks that quantum algorithms could help researchers better understand classical ones. For example, despite never having been implemented on a real quantum computer, Kerenidis' recommendation algorithm inspired the development of a new classical algorithm, a 2018 result by Ewin Tang.

In addition, quantum concepts could help illuminate how a classical machine learning algorithm learns, a process that remains poorly understood. Sweke points to recent research from the Hebrew University of Jerusalem that describes a deep-learning algorithm by analogy with a quantum material containing many entangled particles. Some machine learning algorithms seem to mimic the behavior of quantum mechanical systems, and physicists' rich descriptions of those systems could inspire a more conceptual understanding of machine learning.

Pitching these nebulous benefits, quantum researchers have had limited success recruiting the broader classical machine learning community. "We're trying to convince them slowly, but we don't have a lot of arguments yet in our hands," Schuld says. For now, their immediate task is to actually develop these arguments — beyond the existing hype.
