Quantum computing is an emergent field of computer science and engineering that harnesses the unique qualities of quantum mechanics to solve problems beyond the ability of even the most powerful classical computers.
The field of quantum computing spans a range of disciplines, including quantum hardware and quantum algorithms. While still in development, quantum technology is expected to solve complex problems that classical supercomputers can’t solve, or can’t solve fast enough.
By taking advantage of quantum physics, large-scale quantum computers could tackle certain complex problems in minutes or hours that would otherwise take conventional machines millennia to complete.
Quantum mechanics, the study of physics at small scales, reveals surprising fundamental natural principles. Quantum computers specifically harness these phenomena to access mathematical methods of solving problems not available with classical computing alone.
In practice, quantum computers are expected to be broadly useful for two types of tasks: modeling the behavior of physical systems and identifying patterns and structures in information.
Quantum mechanics is a bit like the operating system of the universe. A computer that uses quantum mechanical principles to process information has certain advantages in modeling physical systems. Therefore, quantum computing is of particular interest for chemistry and material science applications. For example, quantum computers might help researchers seeking useful molecules for pharmaceutical or engineering applications identify candidates more quickly and efficiently.
Quantum computers can also process data by using mathematical techniques not accessible to classical computers. That means they can give structure to data and help discover patterns that classical algorithms alone might miss. In practice, this capability might be useful for applications ranging from biology (for example, protein folding) to finance.
Today, much of the work of quantum computing research involves searching for algorithms and applications within these broad categories of expected use. Parallel to this search, significant work is devoted to building the technology itself.
Leading institutions like IBM, Amazon, Microsoft and Google, along with startups such as Rigetti and IonQ, continue to invest heavily in this exciting technology. As a result, quantum computing is estimated to become a USD 1.3 trillion industry by 2035.
When discussing quantum computers, it is important to understand that at the smallest scales, the universe behaves differently from what we are used to in our day-to-day lives. Compared to what we learned in grade-school physics, the behaviors of quantum objects are often bizarre and counterintuitive.
Describing the behaviors of quantum particles presents a unique challenge. Most common-sense paradigms for the natural world lack the vocabulary to communicate the surprising behaviors of quantum particles. But quantum mechanics reveals how the universe really works. Quantum computers take advantage of quantum mechanics by replacing traditional binary bit circuits with quantum particles called quantum bits, or qubits. These particles behave differently from bits, exhibiting unique properties that can be described only with quantum mechanics.
To understand quantum computing, it is important to understand four key quantum mechanics principles:
A qubit itself isn’t useful. But it can place the quantum information it holds into a state of superposition, which represents a combination of all possible configurations of the qubit. Groups of qubits in superposition can create complex, multidimensional computational spaces. Complex problems can be represented in new ways in these spaces.
When a quantum system is measured, its state collapses from a superposition of possibilities into a binary state, which can be registered as binary code—either a zero or a one.
Entanglement is the ability of qubits to correlate their state with other qubits. Entangled systems are so intrinsically linked that when quantum processors measure a single entangled qubit, they can immediately determine information about other qubits in the entangled system.
Interference is the engine of quantum computing. When qubits are placed into a state of collective superposition, they structure information in a way that resembles waves, with amplitudes associated with each outcome.
These amplitudes become the probabilities of the outcomes of a measurement of the system. These waves can build on each other when many of them peak at a particular outcome or cancel each other out when peaks and troughs interact. Amplifying a probability or canceling out others are both forms of interference.
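To make interference concrete, here is a minimal sketch using the open-source Qiskit SDK, with a local statevector simulation standing in for real hardware. Applying a Hadamard gate once places a qubit in equal superposition; applying it a second time makes the amplitudes interfere, so the qubit returns deterministically to zero.

```python
# Minimal interference sketch using Qiskit's statevector simulation.
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

qc = QuantumCircuit(1)
qc.h(0)                 # equal superposition: amplitudes of roughly 0.707 for |0> and |1>
print(Statevector(qc))

qc.h(0)                 # second Hadamard: the |1> amplitudes cancel (destructive
print(Statevector(qc))  # interference) and the |0> amplitudes reinforce -> back to |0>
```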
Decoherence is the process in which a system in a quantum state collapses into a nonquantum state. It can be triggered deliberately, by measuring a quantum system, or unintentionally, by environmental factors. Generally speaking, quantum computing requires minimizing unwanted decoherence.
To better understand quantum computing, consider that two surprising ideas are both true. The first is that objects that can be measured as having definite states—qubits in superposition with defined probability amplitudes—behave randomly. The second is that distant objects—in this case, entangled qubits—can still behave in ways that, though individually random, are tightly correlated.
A computation on a quantum computer works by preparing a superposition of computational states. A quantum circuit, prepared by the user, uses operations to entangle qubits and generate interference patterns, as governed by a quantum algorithm. Many possible outcomes are canceled out through interference, while others are amplified. The amplified outcomes are the solutions to the computation.
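The same flow can be sketched in a few lines with the Qiskit SDK. The example below is the textbook two-qubit Bell-state circuit rather than a full algorithm, and it runs on Qiskit’s built-in StatevectorSampler instead of real hardware: a Hadamard gate creates superposition, a CNOT gate entangles the qubits and measurement collapses the state to a bit string.

```python
from qiskit import QuantumCircuit
from qiskit.primitives import StatevectorSampler  # local, simulator-based primitive

qc = QuantumCircuit(2)
qc.h(0)           # superposition on qubit 0
qc.cx(0, 1)       # entangle qubit 0 with qubit 1
qc.measure_all()  # collapse the state to a classical bit string

counts = StatevectorSampler().run([qc], shots=1000).result()[0].data.meas.get_counts()
print(counts)     # roughly half '00' and half '11'; '01' and '10' never appear
```

Each shot is individually random, yet the two bits always agree, which is exactly the correlation entanglement provides.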
The primary difference between classical and quantum computers is that quantum computers use qubits instead of bits. While quantum computing does use binary code, qubits process information differently from classical bits. But what are qubits and where do they come from?
While classical computers rely on bits (zeros and ones) to store and process data, quantum computers process data differently by using quantum bits (qubits) in superposition.
A qubit can act like a bit, storing either a zero or a one, but it can also exist as a weighted combination of both simultaneously. When qubits are combined, their superpositions can grow exponentially in complexity.
For example, two qubits can exist in a superposition of the four possible 2‑bit strings. Similarly, three qubits can be in a superposition of the eight possible 3‑bit strings, and the pattern continues as more qubits are added. With 100 qubits, a superposition can span 2^100 (roughly 10^30) basis states, an astronomical range of possibilities.
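A short sketch, again with Qiskit, makes the growth visible: n qubits in uniform superposition are described by 2^n amplitudes, and tracking all of those amplitudes in classical memory is exactly what becomes infeasible as n grows.

```python
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

for n in (2, 3, 10):
    qc = QuantumCircuit(n)
    qc.h(range(n))                # put every qubit into superposition
    dim = Statevector(qc).dim     # number of amplitudes being tracked
    print(f"{n} qubits -> {dim} basis states")   # 4, 8, 1024
```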
Quantum algorithms work by manipulating information in a way inaccessible to classical computers, which can provide dramatic speed-ups for certain problems—especially when quantum computers and high-performance classical supercomputers work together.
Generally, qubits are created by manipulating and measuring systems that exhibit quantum mechanical behavior, such as superconducting circuits, photons, electrons, trapped ions and atoms.
Today, qubits for quantum computing can be created in many different ways, each better suited to particular types of tasks.
A few of the more common types of qubits in use are as follows:
Computers that use quantum bits have certain advantages over computers that use classical bits. Because qubits can hold a superposition and exhibit interference, a quantum computer that uses qubits approaches problems in ways different from classical computers.
As a helpful analogy for understanding how quantum computers use qubits to solve complicated problems, imagine you are standing in the center of a complicated maze. To escape the maze, a traditional classical computing approach would be to “brute force” the problem, trying every possible combination of paths to find the exit. This kind of computer would use bits to explore new paths and remember which ones are dead ends.
A quantum computer might derive the correct path without needing to test all the bad paths, as if it has a bird's-eye view of the maze. However, qubits don't test multiple paths at once. Instead, quantum computers measure the probability amplitudes of qubits to determine an outcome.
These amplitudes function like waves, overlapping and interfering with each other. When out-of-phase waves overlap, they cancel each other out, effectively eliminating possible solutions to the complex problem, while waves that remain in phase reinforce one another and point to a correct solution.
An IBM quantum processor is a wafer not much bigger than the silicon chips found in a laptop. Modern quantum hardware systems use ultracold temperatures to keep the instruments stable. Additional room‑temperature electronic components control the system and process quantum data. Together, these elements make the setup about the size of an average car.
The large footprint of a complete quantum hardware system makes most quantum computers anything but portable. Nevertheless, researchers and computer scientists can still access off‑site quantum computing capabilities through cloud computing. The main hardware components of a quantum computer are as follows.
Composed of qubits laid out in various configurations to allow for communication, quantum chips—also known as the quantum data plane—act as the brain of the quantum computer.
As the core component in a quantum computer, a quantum processor contains the system’s physical qubits and the structures required to hold them in place. Quantum processing units (QPUs) include the quantum chip, control electronics and classical compute hardware required for input and output.
Your desktop computer likely uses a fan to get cold enough to work. Quantum processors need to be cold—approximately one hundredth of a degree above absolute zero—to minimize noise and avoid decoherence so they retain their quantum states. This ultralow temperature is achieved with supercooled superfluids. At these temperatures, certain materials exhibit an important quantum mechanical effect: electrons move through them without resistance. This effect makes them superconductors.
When materials become superconductors, their electrons match up, forming Cooper pairs. These pairs can carry a charge across barriers, or insulators, through a process known as quantum tunneling. Two superconductors placed on either side of an insulator form a Josephson junction, a crucial piece of quantum computing hardware.
Quantum computers use circuits with capacitors and Josephson junctions as superconducting qubits. By firing microwave photons at these qubits, we can control their behavior and get them to hold, change and read out individual units of quantum information.
Research continues to improve quantum hardware components, but that’s only half of the equation. Unlocking quantum advantage for users will also require a highly performant and stable quantum software stack to enable the next generation of quantum algorithms.
In 2024, IBM introduced the first stable version of the Qiskit open source software development kit (SDK), Qiskit SDK 1.x. With over 600,000 registered users and 700 global universities that use it to develop quantum computing classes, Qiskit has become the preferred software stack for quantum computing.
But Qiskit is more than just the world’s most popular software for building quantum circuits. We are redefining Qiskit to represent the full‑stack software for quantum computing at IBM. This redefinition includes extending the Qiskit SDK with middleware software and services to write, optimize and run programs on IBM Quantum systems—including new generative AI code‑assistance tools.
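As a rough illustration of that write-optimize-run workflow, the sketch below builds a small circuit with the Qiskit SDK, optimizes it with the transpiler and runs it with a sampler primitive. Running on real IBM Quantum systems additionally requires an account and the qiskit-ibm-runtime package; the local StatevectorSampler is used here so the example stays self-contained.

```python
from qiskit import QuantumCircuit, transpile
from qiskit.primitives import StatevectorSampler

# Write: a 3-qubit GHZ circuit, measured at the end
qc = QuantumCircuit(3)
qc.h(0)
qc.cx(0, 1)
qc.cx(1, 2)
qc.measure_all()

# Optimize: with a real backend, transpile would also map the circuit
# onto the hardware's native gates and qubit layout
optimized = transpile(qc, optimization_level=2)

# Run: sample the circuit (locally simulated here)
counts = StatevectorSampler().run([optimized], shots=2000).result()[0].data.meas.get_counts()
print(counts)  # expect roughly equal counts of '000' and '111'
```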
Quantum computing is built on the principles of quantum mechanics, which describe how small objects behave differently from large objects. But because quantum mechanics provides the foundational laws for our entire universe, on a small level, every system is a quantum system.
For this reason, we can say that while conventional computers are also built on top of quantum systems, they fail to take full advantage of the quantum-mechanical properties during their calculations. Quantum computers are expected to take better advantage of quantum mechanics to conduct calculations that even high-performance computers cannot.
From antiquated punch-card adders to modern supercomputers, traditional (or classical) computers essentially function in the same way. These machines generally perform calculations sequentially, storing data by using binary bits of information. Each bit represents either a 0 or 1.
When bits are combined into binary code and manipulated with logic operations, computers can be used to create everything from simple operating systems to the most advanced supercomputing calculations.
Quantum computers, like classical computers, are problem-solving machines. But instead of bits, quantum computing uses qubits. Qubits process data like traditional bits; however, by harnessing quantum phenomena, they have access to more complex mathematics for a different type of computation. This capability comes from the quantum mechanical concepts of superposition and interference, discussed earlier.
Quantum processors do not perform mathematical equations the same way classical computers do. Unlike classical computers that must compute every step of a complicated calculation, quantum circuits made from logical qubits can process complex problems more efficiently.
While traditional computers commonly provide singular answers, probabilistic quantum machines often provide ranges of possible answers. This range might make quantum computing seem less precise than traditional computation. However, for the kinds of incredibly complex problems quantum computers might soon solve, this way of computing could save hundreds of thousands of years of traditional computation.
In practice, quantum computers and classical computers work together in combined workflows to solve problems. The most efficient methods distribute the parts of a computation that quantum computers are best at to quantum resources and the parts that classical computers are best at to classical computing resources.
Fully realized quantum computers working in concert with high-performance classical computers would be far superior to classical computers alone for certain kinds of problems like integer factorization. But quantum computing is not ideal for every (or even most) problems.
For most kinds of tasks and problems, classical computers are expected to remain the best solution. But when scientists and engineers encounter certain highly complex problems, quantum computing comes into play. For these types of difficult calculations, even the most powerful classical supercomputers, which remain binary machines reliant on 20th‑century technology, pale in comparison to quantum computing.
Complex problems are problems with lots of variables interacting in complicated ways. For example, modeling the behavior of individual atoms in a molecule is a complex problem because of all the different interactions between electrons. Identifying new physics in a supercollider is also a complex problem. There are some complex problems that we do not know how to solve with classical computers at any practical scale.
A classical computer might be great at difficult tasks like sorting through a large database of molecules. But it struggles to solve more complex problems, like simulating how those molecules behave.
Today, if scientists want to know how a molecule behaves, they must synthesize it and experiment with it in the real world. If they want to know how a slight tweak would impact its behavior, they usually need to synthesize the new version and run their experiment all over again. This is an expensive, time-consuming process that impedes progress in fields as diverse as medicine and semiconductor design.
A classical supercomputer might try to simulate molecular behavior with brute force by using its many processors to explore every possible way every part of the molecule might behave. But as it moves past the simplest, most straightforward molecules available, the supercomputer stalls. No classical computer is able to handle all the possible permutations of molecular behavior by using any known methods.
Quantum algorithms take a new approach to these sorts of complex problems by creating multidimensional computational spaces in which to run algorithms that behave much like these molecules themselves. This approach turns out to be a much more efficient way of solving complex problems like chemical simulations.
One way to think about this idea: Classical computers need to crunch the numbers to figure out how a molecule will behave. A quantum computer doesn’t need to crunch the numbers. It can mimic the molecular system directly.
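As a rough sketch of what “mimicking the system” looks like in code, the example below writes an energy operator (Hamiltonian) as a sum of Pauli terms, prepares a trial state with a small parameterized circuit and asks a Qiskit estimator primitive for the energy of that state. The Pauli terms and coefficients are illustrative placeholders, not a real molecule’s Hamiltonian; a variational algorithm would repeat this evaluation while a classical optimizer tunes the parameter.

```python
from qiskit import QuantumCircuit
from qiskit.circuit import Parameter
from qiskit.quantum_info import SparsePauliOp
from qiskit.primitives import StatevectorEstimator

# Illustrative two-qubit Hamiltonian (made-up coefficients, not a real molecule)
hamiltonian = SparsePauliOp.from_list([("ZZ", -1.0), ("XI", 0.4), ("IX", 0.4)])

# A tiny parameterized circuit that prepares a trial state
theta = Parameter("theta")
ansatz = QuantumCircuit(2)
ansatz.ry(theta, 0)
ansatz.cx(0, 1)

# Estimate the energy <H> of the trial state for one parameter value
result = StatevectorEstimator().run([(ansatz, hamiltonian, [0.5])]).result()
print("Energy of the trial state:", float(result[0].data.evs))
```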
Quantum algorithms can also process data in ways classical computers can’t, offering new structure and insights.
Quantum computing was first theorized in the early 1980s, but it wasn’t until 1994 that mathematician Peter Shor published one of the first practical real‑world applications for a hypothetical quantum machine. Shor’s algorithm for integer factorization demonstrated how a quantum mechanical computer can potentially break the most advanced cryptography systems of the time—some of which are still used today.
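The heavy lifting in Shor’s algorithm is a quantum subroutine that finds the period of modular exponentiation; the remaining steps are classical number theory. The sketch below shows only that classical part, brute-forcing the period for a toy number so the factoring step is easy to follow (a real quantum computer would replace the find_period stand-in with quantum phase estimation).

```python
from math import gcd

def find_period(a: int, n: int) -> int:
    """Smallest r > 0 with a**r % n == 1 (classical stand-in for the quantum step)."""
    r, value = 1, a % n
    while value != 1:
        value = (value * a) % n
        r += 1
    return r

N, a = 15, 7                   # toy example: factor 15 using the base a = 7
r = find_period(a, N)          # r = 4
if r % 2 == 0 and pow(a, r // 2, N) != N - 1:
    p = gcd(pow(a, r // 2) - 1, N)
    q = gcd(pow(a, r // 2) + 1, N)
    print(f"{N} = {p} x {q}")  # 15 = 3 x 5
```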
Shor’s findings demonstrated a viable application for quantum systems, with dramatic implications for not just cybersecurity, but many other fields.
Engineering firms, financial institutions and global shipping companies, among others, are exploring use cases where quantum computers might solve important problems in their fields. An explosion of benefits from quantum research and development is taking shape on the horizon. As quantum hardware scales and quantum algorithms advance, we may soon find new solutions to important problems like molecular simulation, energy infrastructure management and financial market modeling.
Quantum computers excel at solving certain complex problems with many variables. From the development of new drugs to advancements in semiconductor development and tackling complex energy challenges, quantum computing might hold the key to breakthroughs in several critical industries.
Quantum computers capable of simulating molecular behavior and biochemical reactions can speed up the research and development of life-saving new drugs and medical treatments.
For the same reasons quantum computers can impact medical research, they might also provide undiscovered solutions for mitigating dangerous or destructive chemical byproducts. Quantum computing could lead to improved catalysts that enable petrochemical alternatives, or to better processes for breaking down carbon to combat climate-threatening emissions.
As interest and investment in artificial intelligence (AI) and related fields like machine learning ramp up, researchers are pushing AI models to new extremes, testing the limits of existing hardware and driving tremendous energy consumption. There is some reason to think that quantum algorithms might be able to look at datasets in a new way, providing a speed-up for some machine learning problems.
While no longer simply theoretical, quantum computing is still under development. As scientists around the world strive to discover new techniques to improve the speed, power and efficiency of quantum machines, the technology is approaching a turning point. The evolution of useful quantum computing is commonly described with two concepts: quantum utility and quantum advantage.
Quantum utility refers to any quantum computation that provides reliable, accurate solutions to problems that are beyond the reach of brute-force classical simulation of quantum machines. Previously, these problems were accessible only to classical approximation methods, carefully crafted to exploit the unique structure of a specific problem. IBM first demonstrated quantum utility in 2023.
Broadly defined, the term quantum advantage describes a situation where a quantum computer can provide a better, faster or cheaper solution than all known classical methods. An algorithm that exhibits quantum advantage on a quantum computer should be able to deliver a significant, practical benefit beyond all known classical computing methods. IBM expects to realize the first quantum advantages by late 2026, provided the quantum and high‑performance computing communities work together.
Because quantum computing now offers a viable alternative to classical approximation for certain problems, researchers say it is a useful tool for scientific exploration, or that it has utility. Quantum utility does not constitute a claim that quantum methods have achieved an established speed-up over all known classical methods. This distinction is a key difference from the concept of quantum advantage.
IBM has introduced two metrics to benchmark quantum computers: Layer fidelity and circuit layer operations per second (CLOPS).
A valuable benchmark, layer fidelity provides a way to encapsulate the entire quantum processor’s ability to run circuits while revealing information about individual qubits, gates and crosstalk. By running the layer fidelity protocol, researchers can qualify the overall quantum device while also gaining access to granular performance and error information about individual components.
In addition to layer fidelity, IBM also defined a speed metric: circuit layer operations per second (CLOPS). CLOPS measures how quickly processors can run quantum volume circuits in series, serving as an indicator of holistic system speed that incorporates both quantum and classical computing.
Together, layer fidelity and CLOPS provide a new way to benchmark systems that’s more meaningful to the people trying to improve and use quantum hardware. These metrics make it easier to compare systems to one another, to compare our systems to other architectures, and to reflect performance gains across scales.
Circuit depth is also an essential capability of a quantum processing unit. It is a measure of the number of parallel gate executions—the number of steps in a quantum circuit—that the processing unit can run before the qubits decohere. The greater the circuit depth, the more complex circuits the computer can run.
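In the Qiskit SDK, a circuit can report its own depth, as in this small sketch:

```python
from qiskit import QuantumCircuit

qc = QuantumCircuit(3)
qc.h(0)
qc.cx(0, 1)
qc.cx(1, 2)
print(qc.depth())  # 3: these gates share qubits, so they must run in sequence
```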
Today, companies like IBM, Google, Microsoft, D-Wave, Rigetti Computing and more make real quantum hardware. Cutting-edge tools that were merely theoretical four decades ago are now available to hundreds of thousands of developers. Engineers are delivering ever-more-powerful superconducting quantum processors at regular intervals, alongside crucial advances in software and quantum-classical orchestration. This work drives toward the quantum computing speed and capacity necessary to change the world.
Now that the field has achieved quantum utility, researchers are hard at work to make state-of-the-art quantum computers even more useful. Researchers at IBM Quantum and elsewhere have identified some key challenges to improve upon quantum utility and potentially achieve quantum advantage: