What is a qubit?

Published: 28 February 2024
Contributors: Josh Schneider, Ian Smalley

A qubit, or quantum bit, is the basic unit of information used to encode data in quantum computing and can be best understood as the quantum equivalent of the traditional bit used by classical computers to encode information in binary.

The term “qubit” is attributed to American theoretical physicist Benjamin Schumacher. Qubits are generally, although not exclusively, created by manipulating and measuring quantum particles (the smallest known building blocks of the physical universe), such as photons, electrons, trapped ions, superconducting circuits, and atoms. 

Enabled by the unique properties of quantum mechanics, quantum computers use qubits to encode and process information in ways classical bits cannot, with the potential to transform cryptographic systems and perform advanced computations that would take even classical supercomputers thousands of years (or prove impossible) to complete.

Powered by qubits, quantum computers may soon prove pivotal in addressing many of humanity’s greatest challenges, including cancer and other medical research, climate change, and machine learning and artificial intelligence (AI).


Understanding quantum computing

Representing the next generation in computing power, quantum computing uses specialized technology—including computer hardware and algorithms that take advantage of the principles of quantum mechanics—to solve complex problems that classical computers or supercomputers can’t solve (or can’t solve quickly enough).

First proposed in the 1980s, quantum computer development has come a long way from pure theory to practical hardware. Today, IBM Quantum makes real quantum hardware—a tool scientists only began to imagine a few decades ago—available to hundreds of thousands of developers.

When physicists and engineers encounter difficult problems, they turn to supercomputers. However, even supercomputers are binary-code-based machines reliant on 20th-century transistor technology, and they struggle to solve highly complex problems. These classical computers are also subject to material restrictions, such as overheating, putting hard limits on their ability to process information. There are some complex problems, such as the modeling of individual atoms in a molecule, that we do not know how to solve with classical computers at any scale.

The laws of quantum mechanics govern the behavior of the natural world. Computers that make calculations using the quantum states of quantum bits should, in many situations, be our best tools for understanding it and for solving our most complex problems.

When studying quantum computers, it is important to understand that quantum mechanics does not behave like classical physics. Describing the behavior of quantum particles presents a unique challenge, as most common-sense paradigms for the natural world lack the vocabulary to describe the seemingly counterintuitive behavior of quantum particles.

Qubits vs. bits

There are many different types of bits and qubits, but all qubits must adhere to the laws of quantum physics and be able to exist in a quantum superposition.

A classical bit can exist only in either a 0 position or a 1 position. Qubits, however, can also occupy a state known as a superposition. A superposition is not a third fixed position between 0 and 1; it is a combination of 0 and 1 held at once, each weighted by a probability amplitude, that resolves to a definite 0 or 1 only when the qubit is measured.

Although qubits can hold a superposition, they are still used to convey information through a binary system: every measurement of a qubit yields either a 0 or a 1. In such systems, the term bit can refer to either the material or process used to represent a 0 or 1, or the measurement of that bit (i.e., a 0 or a 1).
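As an illustration, a qubit's superposition can be sketched numerically as a pair of probability amplitudes. The NumPy snippet below is an illustrative simulation, not real quantum hardware: it shows how an equal superposition resolves to a definite 0 or 1 only upon measurement.

```python
import numpy as np

# Illustrative simulation (not real quantum hardware): a qubit state is a
# pair of probability amplitudes, one for |0> and one for |1>.
# An equal superposition puts amplitude 1/sqrt(2) on each.
state = np.array([1.0, 1.0]) / np.sqrt(2)

# Born rule: the probability of each outcome is the squared magnitude
# of its amplitude. The probabilities must sum to 1.
probs = np.abs(state) ** 2
assert np.isclose(probs.sum(), 1.0)

# Measurement collapses the superposition to a single classical bit.
rng = np.random.default_rng(0)
outcome = rng.choice([0, 1], p=probs)
print(outcome)  # a definite 0 or 1, each with probability 0.5
```

Repeating the measurement on freshly prepared copies of the same state would yield 0 roughly half the time and 1 the other half, which is what the two equal amplitudes encode.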

Understanding bits

In traditional or classical computing, a single bit can be thought of as a piece of binary information, notated as either a 0 or a 1. Modern computers typically represent bits as either an electrical voltage or current pulse (or by the electrical state of a flip-flop circuit).

In these systems, when there is no current flowing, the circuit can be considered to be off, and this state is represented as a 0. When current is flowing, the circuit is considered on, and this state is represented as a 1.

The term “bit” is itself a portmanteau for “binary digit,” and binary bits are the foundational basis of all computing. Whether recording a digital video, animating a 3D model or using a calculator app—all data from operating systems to software are built out of binary code, which is a collection of bits. A computer byte consists of eight bits, which is the minimum number of bits needed to convey a single textual character in binary. 
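To make the byte-per-character point concrete, this small Python sketch shows the eight bits behind the character "A", using Python's built-in binary formatting:

```python
# One byte (8 bits) is enough to encode a single text character.
# The character "A" has code point 65, which is 01000001 in binary.
bits = format(ord("A"), "08b")
print(bits)       # 01000001
print(len(bits))  # 8 -- one byte

# Round trip: the same 8 bits decode back to the original character.
assert chr(int(bits, 2)) == "A"
```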

Bits can be represented electrically, by running (or not running) current through a silicon chip, for example. Bits can also be represented physically, as a hole or the absence of a hole in a piece of paper, as was used in antiquated punch-card computing. Any two-level system in which the state of the system can be described in only one of two potential positions (e.g., up or down, left or right, on or off) can be used to represent a bit. 

Understanding qubits

While quantum technologies do use binary code, the quantum data derived from a quantum system—such as a qubit—encodes data differently from traditional bits, with a few remarkable advantages. Researchers have established a variety of ways to either create qubits or use naturally occurring quantum systems as qubits. However, in nearly all instances, quantum computers require extreme refrigeration to isolate qubits and prevent interference. 

Theoretically, any two-level quantum system can be used to make a qubit. A quantum system is described as two-level when certain system properties can be measured in binary positions, such as up or down. Multi-level quantum systems can be used to create qubits as well, as long as two aspects of that system can be effectively isolated to produce a binary measurement. Just as traditional computers can use multiple types of bits—such as electrical current, electrical charge, or holes punched (or not punched) in a piece of paper for punch-card computing—quantum computers can use multiple types of qubits. Certain qubit types are better suited to certain functions, and an advanced quantum computer will likely use a combination of qubit types to achieve different operations.

Since each bit can represent either a 0 or a 1, by pairing two bits of information, we can create up to four unique binary combinations:

  1. 0 0
  2. 0 1
  3. 1 0
  4. 1 1
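The contrast can be sketched in Python: two classical bits take one of the four combinations at a time, while a two-qubit state vector (simulated here with NumPy as an illustration) carries an amplitude for all four at once, and the amplitude count doubles with every added qubit.

```python
import itertools
import numpy as np

# Two classical bits: four distinct combinations, held one at a time.
combos = list(itertools.product([0, 1], repeat=2))
print(combos)  # [(0, 0), (0, 1), (1, 0), (1, 1)]

# Two qubits (simulated): one state vector holds an amplitude for all
# four combinations simultaneously -- here an equal superposition.
state = np.full(4, 0.5)
assert np.isclose(np.sum(np.abs(state) ** 2), 1.0)

# n qubits require 2**n amplitudes, doubling with each qubit added.
for n in (2, 3, 10):
    print(f"{n} qubits -> {2 ** n} amplitudes")
```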

While each bit can be either a 0 or a 1, a single qubit can be a 0, a 1, or a superposition of both. A quantum superposition can be described as a weighted combination of 0 and 1, because the qubit’s state encodes the probability of measuring each outcome.

On the quantum level, a qubit’s state is described by a wave function. The probability amplitudes of that wave function determine what a measurement will yield, and when many qubits are combined, their joint amplitudes can be harnessed to carry out extremely complex calculations.

When processing a complex problem, such as factoring a large number into primes, traditional bits become bogged down holding large quantities of information. Quantum bits behave differently. Because qubits can hold a superposition, a quantum computer using qubits can work through a much larger space of possibilities.

As a helpful analogy for understanding bits vs. qubits, imagine you are standing in the center of a complicated maze. To escape the maze, a traditional computer would have to “brute force” the problem, trying every possible combination of paths to find the exit. This kind of computer would use bits to explore new paths and remember which ones are dead ends.

Comparatively, a quantum computer might, figuratively speaking, derive a bird’s-eye view of the maze at once, testing multiple paths simultaneously and revealing the correct solution. However, qubits do not literally “test multiple paths” at once. Instead, quantum computers manipulate the probability amplitudes of qubits to determine an outcome. Because these amplitudes behave like waves, they overlap and interfere with each other: out-of-phase waves cancel, effectively eliminating incorrect answers, while in-phase waves reinforce one another, so the amplified, coherent waves that remain present the solution.
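Interference can be made concrete with a tiny simulation. Assuming the standard Hadamard gate, written here as a NumPy matrix, applying it once to the |0⟩ state creates an equal superposition; applying it again makes the |1⟩ amplitudes cancel (destructive interference) while the |0⟩ amplitudes reinforce, leaving a single definite answer:

```python
import numpy as np

# Hadamard gate: sends |0> to an equal superposition of |0> and |1>.
H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)
zero = np.array([1.0, 0.0])  # the |0> state

superposed = H @ zero
print(superposed)  # both amplitudes ~0.707: either outcome equally likely

# A second Hadamard makes the two paths to |1> interfere destructively
# (+ and - amplitudes cancel) while the paths to |0> reinforce.
final = H @ superposed
print(np.round(final, 10))  # [1. 0.] -- the "wrong" outcome has cancelled
```

The cancellation here is the same mechanism quantum algorithms exploit at scale: amplitudes leading to wrong answers are arranged to cancel, so measurement is likely to reveal a correct one.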

What is quantum entanglement?

First described by Einstein as “spooky action at a distance,” quantum entanglement is a phenomenon in which two or more qubits (or other quantum particles) become intertwined in such a way that the state of one cannot be described independently of the state of the others, regardless of the distance between them.

When two qubits are entangled, they both exist in a superposition until either is measured. Once one is observed, the shared superposition collapses and the unobserved qubit assumes a correlated state: in certain entangled states, it is always found in the position opposite the one that was measured.

For example, if one half of such an entangled qubit pair is measured in a 1 position, the other qubit can instantly be measured as a 0. The implications of quantum entanglement are as vast as our understanding of the phenomenon is limited. Suffice it to say that traditional bits do not become entangled. Entangled qubits might seem to transfer information instantaneously across even lightyears, faster than the speed of light; in fact, qubits cannot transfer data faster than light, but quantum entanglement can dramatically increase the power of quantum circuits.
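The anticorrelated pair described above can be sketched with a small simulation. Assuming the entangled state (|01⟩ + |10⟩)/√2, sampling measurement outcomes shows the two qubits always landing in opposite positions:

```python
import numpy as np

# Simulated entangled pair in the state (|01> + |10>) / sqrt(2):
# amplitudes are listed over the outcomes |00>, |01>, |10>, |11>.
bell = np.array([0.0, 1.0, 1.0, 0.0]) / np.sqrt(2)
probs = np.abs(bell) ** 2  # only "01" and "10" can ever be observed

rng = np.random.default_rng(42)
for _ in range(5):
    outcome = rng.choice(["00", "01", "10", "11"], p=probs)
    # The two measured bits always disagree: perfect anticorrelation.
    assert outcome[0] != outcome[1]
    print(outcome)
```

Note that the sampled outcomes are still individually random; the correlation between the two bits is what entanglement provides, which is why no usable information travels faster than light.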

Different types of qubits and their advantages

As any two-level quantum system can be used to create a qubit, researchers are currently developing many different types of qubits—and certain qubits are better suited to certain applications.

Superconducting

Made from superconducting materials operating at extremely low temperatures, superconducting qubits are manipulated by microwave pulses and are a favorite among quantum computer scientists for their relatively robust coherence. 

Trapped ions

Using sophisticated laser technology, trapped ion particles can also be used as qubits. Trapped ion qubits are noteworthy for long coherence times as well as high-fidelity measurements. 

Quantum dots

A quantum dot is a small semiconductor capable of capturing a single electron and using it as a qubit. Quantum dot qubits can be manipulated using magnetic fields and are particularly interesting to researchers for their potential scalability and compatibility with existing semiconductor technology. 

Photons

By setting and measuring the polarization states of individual light particles, photon qubits can be used to send quantum information across long distances through optical fiber cables, and they are currently used in quantum communication and quantum cryptography.

Neutral atoms

Commonly occurring neutral atoms carry equal numbers of protons and electrons, giving them no net electric charge. Using lasers, these atoms can be excited into a number of energy states, any two of which can be used to create a qubit that is well suited for scaling up and performing operations.

Qubit challenges

While powerful, qubits are also very temperamental. To function, most qubit types must be cooled to a temperature only a fraction of a degree above absolute zero, which is colder than outer space.

Quantum particles are said to have coherence when they are sufficiently controlled to function as qubits. When a qubit loses this ability, it is described as decoherent. The high-powered refrigeration required to maintain coherence in functional qubits is a major challenge for quantum computing.

Even under the coldest conditions, qubit systems are also generally susceptible to failure caused by decoherence. Thankfully, advancements in the emerging field of algorithmic quantum error correction have the potential to stabilize previously tenuous quantum systems. 
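As a classical analogy for how error correction can stabilize unreliable hardware (quantum error-correcting codes are more subtle, since quantum states cannot simply be copied), a repetition code stores each bit three times and uses a majority vote to out-vote a single error:

```python
from collections import Counter

# Classical analogy only: a 3-bit repetition code. Quantum codes use a
# related idea, spreading one logical qubit across several physical
# qubits, but they cannot literally copy quantum states.
def encode(bit):
    return [bit] * 3  # store three redundant copies

def decode(noisy):
    # Majority vote: a single flipped copy is out-voted by the other two.
    return Counter(noisy).most_common(1)[0][0]

codeword = encode(1)     # [1, 1, 1]
codeword[0] ^= 1         # noise flips one copy -> [0, 1, 1]
print(decode(codeword))  # 1 -- the original bit is recovered
```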

Related solutions
IBM Quantum

Bringing useful quantum computing to the world. Our users access the largest quantum computing fleet in the world through Qiskit Runtime, our quantum computing service and programming model for utility.

Explore IBM Quantum

IBM Quantum Safe

IBM Quantum Safe technology is a comprehensive set of tools, capabilities, and approaches for securing your enterprise for the quantum future. Use IBM Quantum Safe technology to replace at-risk cryptography and maintain ongoing visibility and control over your entire cybersecurity posture.

Explore IBM Quantum Safe

IBM Quantum technology

Access the full suite of our leading quantum technology, from quantum supercomputing to quantum coding with Qiskit.

Explore IBM Quantum technology
Resources

What is quantum computing?

Quantum computing uses specialized technology—including computer hardware and algorithms that take advantage of quantum mechanics—to solve complex problems that classical computers or supercomputers can’t solve, or can’t solve quickly enough.

What is quantum cryptography?

Quantum cryptography refers to various cybersecurity methods for encrypting and transmitting secure data based on the naturally occurring and immutable laws of quantum mechanics. While still in its early stages, quantum encryption has the potential to be far more secure than previous types of cryptographic algorithms and is even theoretically unhackable.

What is quantum-safe cryptography?

Quantum-safe cryptography secures sensitive data, access, and communications for the era of quantum computing. Quantum-safe cryptography rebuilds the cryptographic vault, making it safe against quantum and classical attacks.

What is supercomputing?

Supercomputing is a form of high-performance computing that performs calculations with a powerful computer, a supercomputer, reducing overall time to solution.

Security in the quantum computing era

The importance of quantum-safe cryptography in the digital economy—updated with the IBM Quantum Safe roadmap.

The era of quantum utility must also be the era of responsible quantum computing

Now that we’ve entered the era of quantum utility, we are using quantum computers as computational tools to access a computational world we’ve never had access to before.

Take the next step

IBM is at the cutting edge of developing quantum computers and already offers access to 100+ qubit devices. From solving complex challenges like modeling advanced natural processes to developing quantum-safe encryption for your most secure data, discover how IBM can help secure your enterprise for the quantum future. 

Explore IBM Quantum