In The New York Times I read the article “I.B.M. Researchers Inch Toward Quantum Computer.” The quest to build a quantum computer is quite fascinating. I thought it would be good to analyze the problem, present a view of the state of the art, and share it with those who are somehow engaged in this domain or any related field.
First I flashed back to the thunderous days when IBM produced PCs. Then I thought about how science, especially physics, is engaging more and more with computer science and engineering problems. I think the notion of the quantum is still a somewhat idealistic concept to engineers, so you can imagine how it feels to reach for a quantum computer.
The interesting thing is that this field is the province of academia; usually you hear about such projects in universities or research laboratories. Many university researchers have advanced and developed the basic scientific problems. Scientists have encouraged the company to pay attention to this project, though it’s too early to have a plan for a commercial product. Dr. Ketchen said: “It’s going to take an IBM in the end to put it together.”
For me, what makes sense is that IBM is taking a step forward into this domain. After all, I think it’s very good news for us, and I guess the future is beginning to be created, or has even already begun.
The biggest challenges are lengthening the lifetime of quantum bits, or qubits for short, and quickening the pace of computation. On Tuesday in Boston, IBM researchers will present experimental results that they say bring them closer to solving the problem.
Let’s take a look at our actual problems and challenges in computer engineering. Today the world of informatics is based on binary calculations: we describe signals as 1s and 0s during signal transfer, use coding techniques like Manchester coding for digital signals, and transfer signals in analog form over sinusoidal waves for long-distance distribution.
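Since Manchester coding comes up here, a minimal sketch of the idea may help. In one common convention, each bit becomes a transition within its bit period: a 0 is high-to-low and a 1 is low-to-high (the opposite convention also exists). The function names below are my own illustration, not from the article:

```python
def manchester_encode(bits):
    """Encode bits as pairs of half-bit signal levels (high=1, low=0)."""
    signal = []
    for b in bits:
        if b == 0:
            signal += [1, 0]  # 0 -> high-to-low transition
        else:
            signal += [0, 1]  # 1 -> low-to-high transition
    return signal

def manchester_decode(signal):
    """Recover the original bits from pairs of half-bit levels."""
    return [0 if signal[i] == 1 else 1 for i in range(0, len(signal), 2)]

bits = [1, 0, 1, 1, 0]
encoded = manchester_encode(bits)
assert manchester_decode(encoded) == bits
```

The guaranteed mid-bit transition is what makes the code self-clocking: the receiver can recover the sender’s clock from the signal itself.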
Currently we’re bounded by hardware and electrical constraints like the resistance of conductors, the capacity of carriers, the signal-to-noise limit (Shannon’s law), and the physics of signal transfer.
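Shannon’s law (the Shannon–Hartley theorem) puts a hard ceiling on how fast we can signal over a noisy channel: C = B·log2(1 + S/N). A quick worked example (the numbers are just an illustration):

```python
import math

def channel_capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley theorem: C = B * log2(1 + S/N), in bits per second."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# A classic telephone line: ~3 kHz bandwidth and 30 dB SNR (S/N = 1000)
c = channel_capacity(3000, 1000)
print(f"{c:.0f} bits/s")  # roughly 29,900 bits/s
```

No amount of clever engineering can push an error-free data rate above this limit; only more bandwidth or a better signal-to-noise ratio helps.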
Most computer science experts grapple with systems and operating systems to reduce processing time. From the systems point of view, we deal with architecture and design: shrinking circuit dimensions while keeping the best performance, and reducing the system’s energy consumption, which has an irresistible effect on system performance. From the operating-systems point of view, the challenges are choosing the best algorithms for scheduling processes and input/output devices, preventing and managing deadlocks, parallelizing processes, and building operating systems well fitted to multiprocessor and distributed systems.
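As one concrete example of the scheduling challenge, the classic round-robin algorithm taught in every operating-systems course can be sketched in a few lines (a toy illustration of the idea, not any real kernel’s code):

```python
from collections import deque

def round_robin(burst_times, quantum):
    """Return the order of (process_id, run_time) slices under round-robin.

    Each process runs for at most one quantum, then goes to the back
    of the ready queue until its burst time is used up.
    """
    queue = deque((pid, t) for pid, t in enumerate(burst_times))
    timeline = []
    while queue:
        pid, remaining = queue.popleft()
        run = min(quantum, remaining)
        timeline.append((pid, run))
        if remaining > run:
            queue.append((pid, remaining - run))  # not finished: requeue
    return timeline

print(round_robin([5, 3, 1], quantum=2))
# [(0, 2), (1, 2), (2, 1), (0, 2), (1, 1), (0, 1)]
```

The trade-off is visible even in this sketch: a small quantum gives responsiveness but many context switches, a large one degenerates toward first-come-first-served.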
Regarding the state of the art, many problems can occupy even supercomputers for hours or days. Quantum computers could solve problems that would occupy present-day computers for years.
"In quantum mechanics, multiple possibilities exist at once, and a qubit is not necessarily a 1 or 0 but a combination of both. By stringing together qubits, a quantum computer could perform a multitude of calculations simultaneously. For certain problems, like searching databases or factoring large numbers (the basis of today’s encryption techniques), a quantum computer could produce an answer in days or maybe even seconds, whereas the fastest conventional computer would take longer than 13.7 billion years.
The I.B.M. researchers are building quantum computer components out of electronic circuits containing superconductors, materials that carry electricity without electrical resistance. When cooled to a hundredth of a degree above absolute zero, the circuits act as qubits.
The problem is that a qubit becomes scrambled in short order, and the information it carries turns into gibberish. When physicists started experiments a little more than a decade ago, a qubit lasted only a few billionths of a second. (An alternate approach, trapping ions in electric and magnetic fields, can produce longer-lived qubits. But the superconducting circuit approach takes advantage of current computer chip technologies.) In the latest I.B.M. results, which build on a technique developed by Robert J. Schoelkopf, a physics professor at Yale, a qubit lasted as long as one-10,000th of a second. Even though that is still not long enough for perfect calculations, it is almost good enough for error correction algorithms to detect and fix any mistakes. “We’re just crossing this threshold,” Dr. Ketchen said, “which is a big morale booster that says, gee, this is becoming doable.”
Below the threshold, generating reliable answers is impossible. “No matter how many qubits you had, you couldn’t even get one effectively good one because of the error rates being too high,” he said."
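The superposition idea in the quoted passage can be illustrated with a toy simulation, where a qubit is just a pair of amplitudes and measurement collapses the state. This is my own sketch of the textbook model, not anything resembling IBM’s hardware:

```python
import math
import random

def measure(amplitudes):
    """Sample a measurement outcome: 0 with probability |a|^2, else 1."""
    a, b = amplitudes
    return 0 if random.random() < abs(a) ** 2 else 1

# An equal superposition |+> = (|0> + |1>) / sqrt(2):
# the qubit is "both" 0 and 1 until it is measured.
plus = (1 / math.sqrt(2), 1 / math.sqrt(2))

samples = [measure(plus) for _ in range(10_000)]
print(sum(samples) / len(samples))  # close to 0.5
```

Decoherence, the problem the article describes, is exactly what destroys these amplitudes before a computation can finish, which is why the qubit lifetime is the figure of merit.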
We may think of programming methods, the compatibility of compilers and linkers, machine language, and thousands of other questions that could keep our minds busy. But instead of dwelling on these questions and the dark sides, I believe it’s better to have hope and keep researching, because the future is bright.
Reference: The New York Times, “I.B.M. Researchers Inch Toward Quantum Computer”