A quantum leap in computing
One of those new approaches is quantum computing, a system based on a completely different foundation from the logic underpinning current computers. Today’s computers rely on bits, which are embodied as switches that can be set to zero or one. Quantum computing turns that simple approach on its head. “The unit we’re historically accustomed to is the bit. In quantum computing we have the quantum bit or qubit,” says Gil. It can hold several values at the same time, a condition known as superposition.
Gil explains: “Imagine a sphere, and within this sphere is an arrow. That arrow can only be in two positions, North and South Pole. That would be a bit. Zero is the South Pole; one, the North Pole. But in a quantum bit that arrow can be anywhere pointing at any aspect of the internal surface of the sphere. You can represent a much larger number of states at the same time.
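Gil’s sphere picture can be made concrete with a little linear algebra: a qubit’s state is a pair of complex amplitudes, and two angles pick out any point on the sphere’s surface. The sketch below is illustrative only; the function name and variable choices are ours, not from the article.

```python
import numpy as np

def qubit_state(theta, phi):
    """State vector for a qubit at sphere angles (theta, phi).

    theta = 0 gives the pure 0 state at one pole; theta = pi gives
    the pure 1 state at the other. Anything in between is a
    superposition of both values at once.
    """
    return np.array([np.cos(theta / 2),
                     np.exp(1j * phi) * np.sin(theta / 2)])

# A point on the "equator": an equal superposition of 0 and 1.
plus = qubit_state(np.pi / 2, 0.0)
probs = np.abs(plus) ** 2   # probabilities of measuring 0 or 1
print(probs)                # approximately [0.5 0.5]
```

Only at the two poles does the qubit behave like an ordinary bit; every other point on the sphere mixes the two outcomes, which is the “much larger number of states” Gil describes.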
“The second aspect of that,” he continues, “is that we can exploit another principle of quantum mechanics called entanglement, what Einstein called ‘spooky action at a distance.’ And that allows us to be able to compute across all of those bits in parallel and is the basis of vast speed-up for algorithms that a quantum computer runs.”
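Entanglement can also be sketched in a few lines: applying a Hadamard gate and then a controlled-NOT to two qubits produces a Bell state, in which the two qubits no longer have independent values. This NumPy simulation is a minimal illustration, not IBM’s implementation; the matrix names are standard textbook conventions.

```python
import numpy as np

# Single-qubit Hadamard gate and two-qubit controlled-NOT, as matrices.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

# Start both qubits at 0, put the first in superposition, then entangle.
state = np.zeros(4)
state[0] = 1.0
state = CNOT @ np.kron(H, I) @ state

print(state)  # approximately [0.707 0 0 0.707]
# Only the outcomes 00 and 11 have nonzero probability: measuring
# one qubit immediately fixes the value of the other.
```

Because a gate applied to an entangled register acts on all of its amplitudes at once, the state vector for n qubits has 2^n entries, which is the source of the parallelism Gil refers to.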
With exponentially more power than today's fastest supercomputers, quantum computers could usher in a new era across industries, and the first working quantum computer is closer than ever to reality.
Some scientists arrange qubits in a line, an approach that can detect only one type of quantum error at a time. That is not enough for reliable computation.
IBM researchers have instead devised a square lattice design for quantum circuits that can check for both types of error simultaneously. With more qubits, this design can scale toward a working quantum system.
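The limitation of the linear layout can be illustrated with a toy repetition code: parity checks between neighbouring qubits in a line reveal where a bit flip occurred, but a scheme like this is blind to the second error type (phase flips), which is what motivates a two-dimensional square layout. The sketch below is a simplified classical analogy, not IBM’s actual error-correction circuit.

```python
# Toy model: a logical bit stored redundantly in a line of qubits.
# Parity checks between adjacent neighbours flag where a flip happened.
def parity_checks(bits):
    """Parities of adjacent pairs; a 1 marks a detected flip boundary."""
    return [bits[i] ^ bits[i + 1] for i in range(len(bits) - 1)]

logical_zero = [0, 0, 0]
print(parity_checks(logical_zero))  # [0, 0] -- no error detected

flipped = [0, 1, 0]                 # bit flip on the middle qubit
print(parity_checks(flipped))       # [1, 1] -- the flip is located
```

Catching phase flips as well requires checks running in a second direction, which is exactly what a square grid of qubits provides.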