Last week, IBM Fellow Charles Bennett stood on stage and organized a group photo of more than 150 IBMers and leading scientists from around the world who had gathered at the Thomas J. Watson Research Center near New York City for a conference on quantum computing.
For those in the know, it was a delicious moment. Nearly 35 years ago, Charlie had been the self-appointed official photographer at what turned out to be the first-ever quantum computing conference, which was hosted jointly by IBM and MIT. At that conference, held at the MIT Endicott House, Dedham, Mass., famed Nobel Laureate Richard Feynman issued a challenge that was heard around the world—urging scientists to develop a new breed of computers based on quantum physics.
Ever since then, computer scientists have grappled with that grand challenge, and the MIT event went down in history as the birthplace of quantum computing.
It won’t be known for years whether last week’s gathering at IBM will be anywhere near as influential, but one thing is clear already: The event signaled that quantum computing is at a turning point—beginning a transition from theory and experimentation to engineering and applications.
One key piece of evidence: In addition to computer scientists and graduate students, there were representatives from industry, including venture capital, investment banking, aerospace, and scientific equipment.
“We are in a golden age for quantum computing,” says Dario Gil, vice president, science and technology, at IBM Research. “Academics, industry, investors and government leaders are all coming together. It will take a broad ecosystem to make this grand challenge a reality.”
Now that quantum computing is entering the realm of the practical, it’s incumbent on business and government leaders to understand its potential, for universities to beef up their teaching programs in quantum computing, and for students to become aware of promising new career paths. “We’re all becoming quantum engineers,” says Jerry Chow, one of the leaders of IBM’s quantum research team. (Read an IBM Center for Applied Insights report on quantum computing here.)
In recent years, scientific advances in quantum computing have been coming with increasing frequency. More than 8,000 articles on the topic were published in academic journals last year alone, and many of them came from engineering professors rather than information theorists. At the same time, opinion in the scientific community has been coalescing around a handful of approaches that are considered most promising.
For decades, it seemed that the dream of building a universal quantum computer was always 20 years off, just over the horizon of predictability. Now, leaders in the field talk about the breakthrough coming in 15 or even 10 years. “The field is evolving in an exciting way,” says Scott Aaronson, an MIT professor who presented at the IBM conference and who doesn’t make predictions about the future. For him, the next big step will be to demonstrate “quantum supremacy”—proof that quantum computers can outperform conventional computers by a wide margin.
Quantum computing works fundamentally differently from today’s computing. A traditional computer makes use of bits, where each bit represents either a one or a zero. In contrast, a quantum bit, or qubit, can represent a one, a zero, or a superposition of both at once. Therefore, two qubits can be in the states 00, 01, 10 and 11 at the same time. For each added qubit, the total number of potential states doubles. Hence, for certain problems, the use of qubits could enable us to perform calculations exponentially faster than is possible with traditional computers.
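To make the doubling concrete: on a classical machine we can represent an n-qubit state as a vector of 2^n amplitudes. The toy NumPy sketch below (an illustration of the math, not of how real quantum hardware works) builds a three-qubit state and applies a Hadamard gate to every qubit, producing an equal superposition over all eight basis states:

```python
import numpy as np

# A single qubit is a 2-entry vector of amplitudes; |0> is [1, 0].
zero = np.array([1.0, 0.0])

# The Hadamard gate puts one qubit into an equal superposition of 0 and 1.
H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)

# n qubits live in a 2**n-dimensional space: each added qubit doubles it.
n = 3
state = zero
H_all = H
for _ in range(n - 1):
    state = np.kron(state, zero)  # tensor product grows the state space
    H_all = np.kron(H_all, H)     # Hadamard on every qubit

superposed = H_all @ state
# All 2**3 = 8 amplitudes are equal: 1/sqrt(8) ≈ 0.3536
```

Note that the vector length (and the cost of simulating it classically) doubles with each qubit added, which is exactly why large quantum systems quickly become intractable to simulate on conventional hardware.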
Computer scientists have long targeted encryption and decryption as important uses for quantum computation, but other potential applications are now emerging as well. They include simulation of natural phenomena, such as molecules for new materials and drugs; accelerating machine learning algorithms and other linear algebra problems at the core of cognitive computing and engineering design; and speeding unstructured search to help deal with the vast flood of data from the Internet of Things.
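The unstructured-search speedup comes from Grover’s algorithm, which finds a marked item among N possibilities in roughly the square root of N steps instead of N. The sketch below is a minimal classical statevector simulation (illustrative only; the function name and parameters are my own, and real hardware would not store the full vector):

```python
import math
import numpy as np

def grover_search(n_qubits, marked):
    """Illustrative statevector simulation of Grover's search."""
    N = 2 ** n_qubits
    # Start in a uniform superposition over all N basis states.
    state = np.full(N, 1 / math.sqrt(N))
    # Grover's optimal iteration count is about (pi/4) * sqrt(N).
    iterations = int(round(math.pi / 4 * math.sqrt(N)))
    for _ in range(iterations):
        state[marked] *= -1              # oracle: flip the marked amplitude
        state = 2 * state.mean() - state # diffusion: reflect about the mean
    # The marked state now holds almost all of the probability.
    return int(np.argmax(state ** 2)), iterations

found, iters = grover_search(10, marked=700)
# 10 qubits: ~25 Grover iterations versus ~512 expected classical probes
```

For a database of 1,024 entries the simulation needs only about 25 oracle queries, versus an expected 512 for random classical guessing, and the gap widens quadratically as the search space grows.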
The representatives of industry who attended last week’s conference described themselves as advance scouts for their organizations. Steve Chappell, technology director for Oxford Instruments in the UK, says, “This is the first massive research field I have seen that has run so fast from science to systems.”
He and Mark Gibbons, a technology architect for J.P. Morgan Chase’s investment banking division, say they’re watching and learning for now—no plans to buy early quantum computers or to build technologies that enable or use the science. “You want to engage with the community and be part of the discussion,” says Gibbons. “That’s why we’re here.”
A venture capitalist in the room, Andrew Schoen of New Enterprise Associates, is a little further along in the commitment department. He and one of his partners are currently evaluating a potential investment in a quantum computing startup.
Schoen believes that because the science is so difficult and systems design is so challenging, most of the early startups will be spin-outs from universities, and that, eventually, many of them will be acquired by larger tech companies with the ability to integrate complex computing systems and sell them to large organizations. He expects other venture capitalists to start poking around in quantum computing soon. “The VCs usually show up when science begins to make the transition into engineering,” he says.
The most meaningful realm of today’s research, in academia and industry alike, is focused on building a universal quantum computer, which can be programmed to perform any computing task. The major challenges include creating qubits of high quality and packaging them together in a scalable form so they can perform complex calculations in a controllable way—limiting the errors that can result from heat and electromagnetic radiation.
The research team at IBM is pursuing an approach to building a universal quantum computer using superconducting qubits. They’re combining qubits in lattices on computer chips, cooling them to near absolute zero, and using a technique called surface code to protect the fragile quantum information while performing calculations. Every few months, they produce new devices containing more qubits. The next stage is to correct quantum errors, an important step toward building a large quantum computer.
The effort to encode qubits in ways that protect against errors will be greatly accelerated under a new research grant just awarded to the IBM team by the US Intelligence Advanced Research Projects Activity (subject to completion of negotiations).
Some other tech companies and researchers are focusing on a different approach called quantum annealing. These machines are designed and built for specific uses. So far, it’s unproven that quantum annealing produces better results than could be achieved using conventional computers.
The most notable of such efforts belongs to a small Canadian technology company, D-Wave Systems, which has sold systems to NASA, Google and Lockheed Martin. While D-Wave created a splash in the media with its claims to have built the first quantum computer, the scientific community is skeptical. “It’s not a quantum computer,” says Isaac Chuang, a professor at MIT who presented at IBM’s conference. “That’s not to denigrate it, because it could prove to be useful. But it’s something different than quantum computing.”
While rapid progress is now being made in quantum computing, many challenges have yet to be overcome and many questions still must be answered.
Charlie Bennett reflects back on the conference at MIT in 1981. “That one was so preliminary. We had a wide array of people, from Nobel laureates to complete nuts,” he says. “Quantum computing is now a mature field, scientifically, but we don’t know where it’s going—how it’s going to impact the world.”
That’s the challenge that this generation of scientists and engineers will get to figure out.