Posted by Rami Barends and Alireza Shabani, Quantum Electronics Engineers

One of the key benefits of quantum computing is its potential to solve some of the most complex problems in nature, from physics to chemistry to biology. For example, when attempting to calculate protein folding, or when exploring reaction catalysts and “designer” molecules, one can treat these computational challenges as optimization problems, and represent the different configurations of a molecule as an energy landscape in a quantum computer. By letting the system cool, or “anneal”, one finds the lowest energy state in the landscape – the most stable form of the molecule. Thanks to the peculiarities of quantum mechanics, the correct answer simply drops out at the end of the quantum computation. In fact, many tough problems can be dealt with this way, and this combination of simplicity and generality makes the approach appealing.

But finding the lowest energy state in a system is like being put in the Alps and being told to find the lowest elevation – it’s easy to get stuck in a “local” valley, not knowing that there is an even lower point elsewhere. Therefore, we use a different approach: We start with a very simple energy landscape – a flat meadow – and initialize the system of quantum bits (qubits) to represent the known lowest energy point, or “ground state”, in that landscape. We then begin to adjust the simple landscape towards one that represents the problem we are trying to solve – from the smooth meadow to the highly uneven terrain of the Alps. Here’s the fun part: if one evolves the landscape very slowly, the qubits evolve along with it, so that they stay in the ground state of the changing system. This is called “adiabatic quantum computing”, and the qubits exploit quantum tunneling to ensure they always find the lowest energy “valley” in the changing system.
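The idea can be sketched numerically. Below is a minimal single-qubit toy model (our assumption for this sketch, not the experiment's actual Hamiltonian): the simple landscape is a transverse field whose ground state is an equal superposition, the problem landscape favors the |0⟩ state, and we slowly interpolate between the two. A slow sweep keeps the qubit in the instantaneous ground state; a fast one leaves it behind.

```python
import numpy as np

# Pauli matrices
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def expi(H, dt):
    # exp(-i H dt) for a Hermitian matrix H, via eigendecomposition
    w, v = np.linalg.eigh(H)
    return (v * np.exp(-1j * w * dt)) @ v.conj().T

H0 = -X   # "flat meadow": ground state is the equal superposition |+>
H1 = -Z   # "problem" landscape: ground state is |0>

def anneal(total_time, steps):
    """Evolve the H0 ground state under H(s) = (1-s) H0 + s H1, s: 0 -> 1."""
    psi = np.array([1, 1], dtype=complex) / np.sqrt(2)  # ground state of H0
    dt = total_time / steps
    for k in range(steps):
        s = (k + 0.5) / steps                 # anneal parameter sweeps 0 -> 1
        psi = expi((1 - s) * H0 + s * H1, dt) @ psi
    return psi

# Probability of ending in |0>, the ground state of the problem landscape
slow = abs(anneal(total_time=50.0, steps=500)[0]) ** 2
fast = abs(anneal(total_time=0.5, steps=500)[0]) ** 2
print(f"slow sweep P(ground) = {slow:.3f}, fast sweep P(ground) = {fast:.3f}")
```

The slow sweep ends with near-unit probability in the problem's ground state, while the fast sweep gets stuck far from it, which is the adiabatic condition in miniature.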
While this is great in theory, getting this to work in practice is challenging, as you have to set up the energy landscape using the available qubit interactions. Ideally you’d have multiple interactions going on between all of the qubits, but for a large-scale solver the requirements to accurately keep track of these interactions become enormous. Realistically, the connectivity has to be reduced, but this presents a major limitation for the computational possibilities.

In “Digitized adiabatic quantum computing with a superconducting circuit“, published in Nature, we’ve overcome this obstacle by giving quantum annealing a digital twist. With limited connectivity between qubits you can still construct any of the desired interactions: whether the interaction is ferromagnetic (the qubits prefer an aligned orientation), antiferromagnetic (an anti-aligned orientation), or even defined along an arbitrary direction, you can make it happen using easy-to-combine discrete building blocks. In this case, the blocks we use are the logic gates that we’ve been developing with our superconducting architecture.
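As a rough illustration of the digital twist, here is a sketch of Trotterization on an assumed toy two-qubit Hamiltonian (not the paper's actual gate sequence): the continuous evolution under a sum of terms is approximated by alternating short evolutions under each term separately, exactly the kind of discrete building block a gate set provides. More steps give a better approximation.

```python
import numpy as np

I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def expi(H, t):
    # exp(-i H t) for a Hermitian matrix H, via eigendecomposition
    w, v = np.linalg.eigh(H)
    return (v * np.exp(-1j * w * t)) @ v.conj().T

# Toy two-qubit annealing terms: transverse fields and a ZZ coupling link
HX = np.kron(X, I2) + np.kron(I2, X)
HZZ = np.kron(Z, Z)

def continuous(s, t):
    # exact evolution under the full Hamiltonian at anneal parameter s
    return expi(-(1 - s) * HX - s * HZZ, t)

def digitized(s, t, n):
    # first-order Trotter: alternate the field block and the coupling block
    dt = t / n
    step = expi(-(1 - s) * HX, dt) @ expi(-s * HZZ, dt)
    U = np.eye(4, dtype=complex)
    for _ in range(n):
        U = step @ U
    return U

target = continuous(0.5, 1.0)
err10 = np.linalg.norm(target - digitized(0.5, 1.0, 10))
err100 = np.linalg.norm(target - digitized(0.5, 1.0, 100))
print(f"Trotter error: {err10:.4f} (10 steps) vs {err100:.4f} (100 steps)")
```

The digitized evolution converges to the continuous one as the number of discrete blocks grows, which is why a fixed set of gates suffices to build arbitrary interactions.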

Superconducting quantum chip with nine qubits. Each qubit (cross-shaped structures in the center) is connected to its neighbors and individually controlled. Photo credit: Julian Kelly.

The key is controllability. Qubits, like other physical objects in nature, have a resonance frequency, and can be addressed individually with short voltage and current pulses. In our architecture we can steer this frequency, much like you would tune a radio to a broadcast. We can even tune one qubit to the frequency of another. By moving qubit frequencies towards or away from each other, interactions can be turned on or off. The exchange of quantum information resembles a relay race, where the baton is handed over when the runners meet.

You can see the algorithm in action below. Any problem is encoded as local “directions” we want the qubits to point to – like a weathervane pointing into the wind – and as interactions, depicted here as links between the balls. We start by aligning all qubits in the same direction, with the interactions between them turned off – this is the simplest ground state of the system. Next, we turn on interactions and change qubit directions to start evolving towards the energy landscape we wish to solve. The algorithmic steps are implemented with many control pulses, illustrating how the problem gets solved in a giant dance of quantum entanglement.
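The frequency-tuning picture can be illustrated with a standard two-mode model (toy parameters we assume here, not measured device values): two coupled qubits exchange an excitation fully when tuned into resonance, while a large detuning effectively switches the interaction off.

```python
import numpy as np

def expi(H, t):
    # exp(-i H t) for a Hermitian matrix H, via eigendecomposition
    w, v = np.linalg.eigh(H)
    return (v * np.exp(-1j * w * t)) @ v.conj().T

def swap_probability(g, delta, t):
    """Probability that a single excitation hops from qubit 1 to qubit 2.

    Standard two-mode model: in the one-excitation subspace, two qubits
    with coupling g and frequency detuning delta reduce to a 2x2 Hamiltonian.
    """
    H = np.array([[delta / 2, g], [g, -delta / 2]], dtype=complex)
    psi0 = np.array([1, 0], dtype=complex)  # excitation starts on qubit 1
    psi = expi(H, t) @ psi0
    return abs(psi[1]) ** 2

g = 1.0                # coupling strength (arbitrary units)
t = np.pi / (2 * g)    # time for a full transfer on resonance
on_res = swap_probability(g, delta=0.0, t=t)    # runners meet: baton handed over
detuned = swap_probability(g, delta=20.0, t=t)  # far detuned: interaction off
print(f"transfer on resonance: {on_res:.3f}, far detuned: {detuned:.4f}")
```

On resonance the excitation transfers completely, while at large detuning the transfer probability is suppressed by roughly g²/(g² + δ²/4), which is what lets frequency steering act as an on/off switch for the interaction.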

Top: Depiction of the problem, with the gold arrows in the blue balls representing the directions we’d like each qubit to align to, like a weathervane pointing into the wind. The thickness of the link between the balls indicates the strength of the interaction – red denotes a ferromagnetic link, and blue an antiferromagnetic link. Middle: Implementation with the qubits (yellow crosses), showing the control pulses (red) and the steering of qubit frequencies (vertical direction). Qubits turn blue when there is an interaction, and green when they are being measured. Bottom: Zoom-in of the physical device, showing the corresponding nine qubits (cross-shaped).

To run the adiabatic quantum computation efficiently and design a set of test experiments, we teamed up with the QUTIS group at the University of the Basque Country in Bilbao, Spain, led by Prof. E. Solano and Dr. L. Lamata, who are experts in synthesizing digital algorithms. The result is the largest digital quantum algorithm to date, with up to nine qubits and over one thousand logic gates. The crucial advantage for the future is that this digital implementation is fully compatible with known quantum error correction techniques, and can therefore be protected from the effects of noise. Without error correction, noise sets a hard limit, as even the slightest amount can derail the state from following the fragile path to the solution. And since each qubit and interaction element can add noise to the system, some of the most important problems, which have many degrees of freedom and need high connectivity, are well beyond reach. But with error correction, this approach becomes a general-purpose algorithm which can be scaled to an arbitrarily large quantum computer.