Computation, or information processing, has become ubiquitous in our society. Everywhere you look it has an impact on our lives: making everyday tasks such as communicating easier, opening new avenues of exploration, and allowing us to solve problems we would never have dreamed possible. The computation found in your phone, your laptop, or a web server is based on information processing that a physicist would call classical. For most of the 20th century, quantum mechanical effects in these systems were regarded as a nuisance. The eponymous Heisenberg uncertainty principle makes computing devices behave less reliably than their classical ideals, so quantum mechanics was treated as a source of noise that couldn't be removed.
These quantum mechanical non-idealities become especially evident as Moore's Law scaling continues. As transistor dimensions approach the scale of a single atomic layer, quantum tunneling and heating make computation unreliable. So if Moore's Law is fundamentally limited by quantum physics, how do we continue to push the boundaries of computing? What is the next frontier?
Today we’re laying the foundation by inviting anyone interested to create algorithms and run experiments on IBM’s quantum processor, play with individual quantum bits (qubits).
– Dario Gil, Vice President, Science and Solutions at IBM Research
Well, this is where we can in fact change the way we think about computation, and take the problem of quantum effects harming our processing, and turn it to our advantage. To do this we needed to put quantum physics back into the model of computation. This has led to the exciting field of quantum information science, and how our team at IBM Research is working with superconducting qubits towards the construction of quantum computers.
A quantum computer performs calculations using devices that obey the laws of quantum mechanics. These laws allow two particles to exist in an entangled state, causing them to behave in ways that cannot be explained by classical physics. This principle, along with other ideas from quantum theory, led Peter Shor to show in 1994 that it's theoretically possible to efficiently break down a very large number into its prime factors with a quantum computer. Factoring is believed to be hard for a classical computer, and it is the basis of many of today's encryption systems.
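To make the structure of Shor's result concrete, here is a minimal Python sketch of its classical half: factoring reduces to finding the order r of a number a modulo N. The quantum computer's only job is the order-finding step; below it is replaced by a brute-force classical stand-in, so this illustrates the reduction, not the speed-up.

```python
from math import gcd

def find_order(a, N):
    # Brute-force stand-in for the quantum order-finding subroutine:
    # the smallest r > 0 with a**r = 1 (mod N).  This is the step that
    # takes exponential time classically but is efficient on a quantum
    # computer.
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_classical_part(N, a):
    # Given a candidate a, try to extract a nontrivial factor of N
    # from the order of a modulo N.
    g = gcd(a, N)
    if g != 1:
        return g                 # lucky guess: a already shares a factor
    r = find_order(a, N)
    if r % 2 == 1:
        return None              # odd order: retry with a different a
    y = pow(a, r // 2, N)
    if y == N - 1:
        return None              # trivial square root of 1: retry
    return gcd(y - 1, N)

# Example: 7 has order 4 mod 15, and gcd(7**2 - 1, 15) = 3.
print(shor_classical_part(15, 7))  # prints 3, a factor of 15
```

The choice of a = 7 here is just an example; in the full algorithm a is chosen at random and the procedure is repeated until a nontrivial factor appears.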
Now, this breakthrough idea for the use of quantum computing was revolutionary because it revealed that one of the following must be true: we have an incomplete model of computing, factoring is easy, or quantum mechanics is wrong.
Many researchers have investigated the complexity of the factoring problem, and the validity of quantum mechanics has been confirmed in numerous experiments, hinting that the first item is the likely truth: there is a separation between classical and quantum computing. Since Shor's landmark result, many other quantum algorithms offering a quantum speed-up have been proposed. An up-to-date list can be found at the quantum algorithm zoo.
Another area where quantum computers hold a wealth of promise is quantum chemistry. This field explores the physical nature underlying chemical structure in the materials all around us, those that make up life and earth. Yet the complexity of such problems can be astounding, especially once a molecule contains more than a handful of atoms. Quantum computers offer a route around this complexity because, fortunately, a quantum computer works the same way nature does, the same way the chemicals that make up life and matter work. This offers the possibility of simulating (and therefore understanding and improving upon) nature better than any conventional computer will ever be capable of.
Resolving the quantum conflict between noise and controllability
This particular idea of quantum computing for simulation dates back to the famed theoretical physicist Richard Feynman, who said: “Nature isn’t classical, dammit, and if you want to make a simulation of nature, you’d better make it quantum mechanical. And by golly, it’s a wonderful problem, because it doesn’t look so easy.” That was back in 1981. Now, some three decades later, we are finally at the stage of building real quantum computing systems that might realize Feynman’s vision.
But, you might ask, why now? Why has it taken more than 30 years to make this “quantum leap” towards practical quantum computing? Well, it helps to understand what it means to be quantum, or entangled.
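One way to see what entanglement means is a tiny state-vector simulation (a NumPy sketch of the math, not of how a real device operates). The two-qubit Bell state below is built by applying a Hadamard gate and then a CNOT to |00⟩; measuring it gives 00 or 11 with equal probability, but never 01 or 10, a perfect correlation with no classical counterpart.

```python
import numpy as np

# Standard single- and two-qubit gates.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],                  # control: qubit 0
                 [0, 1, 0, 0],                  # target:  qubit 1
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

# Start in |00> (basis order |00>, |01>, |10>, |11>), then apply
# H on qubit 0 followed by CNOT.  The result is the Bell state
# (|00> + |11>) / sqrt(2).
state = np.array([1.0, 0.0, 0.0, 0.0])
state = CNOT @ np.kron(H, I) @ state

# Measurement probabilities in the computational basis.
probs = state ** 2
print(probs)  # [0.5 0.  0.  0.5]: the two qubits always agree
```

Neither qubit on its own has a definite value, yet their outcomes always match; that correlation is the entanglement discussed above.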
Quantum effects are not so easily observed in everyday life. Entanglement is non-intuitive, and the world around us appears to follow classical physics. This is because in the realm of quantum physics, information becomes extremely fragile and delicate. Any minuscule disturbance from heat, noise, or vibration, and all the quantum effects simply disappear. So it takes a lot of care to build a quantum computer: the circuitry must be designed so that virtually no unwanted perturbation can ever touch the system, yet so that we, as operators of the quantum circuits, can still control the input and output of the individual qubits that make up a quantum processor. This is the grand challenge for building a quantum computer: balancing the preservation of quantum fragility (referred to as quantum coherence) with user controllability. Estimates of this balancing act indicate that we must reduce errors to only 1 error per 100 tera-operations (that’s 1 error per 10^14 operations) in order to build a functioning quantum computer.
IBM researchers loading up the hardware inside a dilution refrigerator that is home to the five-qubit device.
Such a minute error rate is extremely difficult to achieve. Fortunately, theoretical work in quantum information processing has led to the framework of quantum error correction. By encoding quantum information into logical qubits, it is possible to tolerate much higher physical error rates, more like 1 error per 100 to 10,000 operations – a massive improvement, within reach of current experimental devices.
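The intuition behind this framework can be illustrated with the classical analogue of the simplest quantum code, a three-bit repetition code. This sketch ignores everything genuinely quantum (phase errors, the no-cloning theorem) and only shows why encoding suppresses errors when the physical rate is low: a logical error now requires two independent physical errors, so the logical rate falls from p to roughly 3p².

```python
import random

def logical_error_rate(p, trials=100_000, seed=1):
    # Classical 3-bit repetition-code analogy: encode one bit as three
    # copies, flip each copy independently with probability p, then
    # decode by majority vote.  Decoding fails only if 2 or 3 of the
    # copies flipped, so the logical error rate is ~3*p**2 for small p.
    rng = random.Random(seed)
    failures = 0
    for _ in range(trials):
        flips = sum(rng.random() < p for _ in range(3))
        if flips >= 2:           # majority vote gives the wrong bit
            failures += 1
    return failures / trials

p = 0.01
print(logical_error_rate(p))     # close to 3 * p**2 = 0.0003, far below p
```

Real quantum codes (such as the surface codes studied for superconducting qubits) are more elaborate, but the same principle applies: once physical errors are rare enough, encoding makes logical errors rarer still.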
As we move to larger systems, it is important to use these error correction principles to show that fault-tolerant quantum computing is possible. This will be a hallmark challenge for our group and for the greater experimental quantum computing community in general. By the same token, as we build more complicated quantum devices with increasing numbers of qubits, we will also look for near-term applications in the realm of quantum chemistry, trying novel simulations that might be out of the reach of classical computers. With current qubit counts in our lab around 7-10, we will soon be constructing processors nearing 40-50 qubits. At that level, such devices will have enough complexity that no classical computer, anywhere, will be able to emulate them. Unearthing the quantum advantage contained within these systems, and realizing Feynman’s dream, will be within reach.
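A quick back-of-the-envelope calculation shows why emulation breaks down: simply storing the state vector of an n-qubit system takes 2^n complex amplitudes, so the memory required doubles with every added qubit.

```python
def statevector_bytes(n_qubits):
    # 2**n complex amplitudes at 16 bytes each (double-precision complex).
    return (2 ** n_qubits) * 16

for n in (30, 40, 50):
    print(f"{n} qubits: {statevector_bytes(n) / 2**30:,.0f} GiB")
# 30 qubits: 16 GiB          (fits on a laptop)
# 40 qubits: 16,384 GiB      (needs a large supercomputer)
# 50 qubits: 16,777,216 GiB  (16 pebibytes of RAM: beyond any machine)
```

This simple doubling is why a 40-50 qubit device crosses the line from "hard to simulate" to "impossible to store", even before accounting for the cost of applying gates to the state vector.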
The Quantum Experience
Quantum computing’s full potential is still unknown, but we believe it reaches beyond our expectations. And until we have the hardware and a community of dedicated users, this potential will remain untapped. So, rather than limiting our calculations to classical computers, we have built the Quantum Experience. It will provide everyone a platform for investigating how to compute with the full power of nature. The time is ripe for us to build up a community of new quantum learners, and change the way we think about computing.