I first learned about IBM’s TrueNorth technology back in 2014, but it wasn’t until colleagues and I participated in a TrueNorth application development bootcamp last year that I understood its true potential. I am optimistic that this new neurosynaptic computing architecture, which is modeled on the way the brain works, could play an important role in the future of high performance computing as applied to cognitive computing, machine learning, and artificial intelligence.
The TrueNorth system
At the bootcamp, my team mixed with scientists from IBM, other companies, universities, and other federal research organizations. We all got our hands dirty building applications to run on TrueNorth. I’m an engineer at heart, so it was exciting to be able to write code, compile it and run it on a radically new computing system—all in just a couple of weeks.
Now my lab mates and I are really stoked. We bought a complete TrueNorth system, combining hardware, software and development tools, to experiment with. It’s a first-of-a-kind computer: 16 TrueNorth processors integrated together on a single circuit board. The system possesses the equivalent of 16 million neurons and 4 billion synapses, but consumes the energy equivalent of a tablet computer—a mere 2.5 watts of power.
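The board-level figures quoted above follow directly from IBM’s published per-chip TrueNorth specifications (1 million neurons and 256 million synapses per chip — numbers drawn from IBM’s spec sheet, not stated in this post). A quick sanity check, as a sketch:

```python
# Sanity check: scale IBM's published per-chip TrueNorth figures
# (1M neurons, 256M synapses per chip -- an assumption drawn from
# IBM's spec sheet, not from the text above) to the 16-chip board.
chips = 16
neurons_per_chip = 1_000_000
synapses_per_chip = 256_000_000

total_neurons = chips * neurons_per_chip    # 16 million
total_synapses = chips * synapses_per_chip  # 4.096 billion, i.e. "4 billion"

print(f"{total_neurons:,} neurons, {total_synapses:,} synapses")

# Power efficiency at the quoted 2.5 W board budget:
watts = 2.5
print(f"{total_synapses / watts / 1e9:.2f} billion synapses per watt")
```

The "4 billion synapses" in the announcement is the rounded 4.096 billion total.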
TrueNorth was originally developed with funding from the Defense Advanced Research Projects Agency, so, in a sense, we’re taking a handoff from our sister agency.
The TrueNorth system has a number of important qualities, but none is more significant than its power efficiency. At the national labs, we’re running into the limits of classical physics in our high-performance computing systems. Today’s mainstream computers are built on the foundation of the Von Neumann computing architecture and the conventional, late 20th century integrated circuit. Yet, because of the laws of physics, the computer industry is no longer able to produce the performance gains that it delivered in past years. At the same time, the amount of energy consumed by today’s computer systems will be unaffordable and unsustainable in the coming years when we expect to run exascale computers—capable of executing a billion billion calculations per second.
It’s difficult to produce a detailed comparison of TrueNorth’s performance to that of today’s mainstream computers, but I expect that systems based on this architecture will be an order of magnitude more efficient than conventional computers for pattern recognition tasks.
It’s also possible that the architecture will produce superior results when performing certain computing tasks, including pattern recognition and deep machine learning. That’s what we aim to find out in our experiments.
In one experiment, an image analysis application, we’ll test the ability of the TrueNorth system to identify, count and keep track of vehicles captured in overhead aerial photographs. It’s a pattern recognition challenge. In a second experiment, we’ll test the system’s ability to keep track of objects that are described in a sequence of declarative statements involving a person’s activities. You can picture how these capabilities might be used to safeguard our national security.
I foresee neurosynaptic chips playing two key roles in the high-performance computing world of the future. First, because the chips are designed to be integrated with one another, and, essentially, operate like one large processor, computer scientists will be able to build massive computers by adding more processors—enabling them to take on very large computing tasks. Second, I believe that many of the high-performance systems of the future will possess a variety of computing capabilities so they can take on complex tasks. A neurosynaptic system focusing on pattern recognition and deep learning could be a component of a larger system on the road to exascale supercomputers.
Working with the TrueNorth technology is especially intriguing to me because my dad, David Van Essen, is a professor of neuroscience at Washington University in St. Louis. I have watched from the sidelines as he and his academic brethren made tremendous advances in understanding how the human brain works. Now the ball is in my court. I have the enviable task of helping to replicate in my own field the awesome success of the brain. That’s the kind of challenge that makes me proud to be a scientist.
I think it’s really not such a new direction but, in fact, goes back to the origin.

HPC at IBM started out of Deep Blue (a descendant of Deep Thought), the father of Blue Gene. It was a machine that won chess matches against a human: Garry Kasparov. It was hardly thought of or called a “cognitive computer” at the time, but the exhaustive searches and other techniques it used were what the technology could achieve then.

Jump to the present, and we have Watson, the Jeopardy! game winner, cognitive computing, AI and … TrueNorth chips! It looks to me like a perfect continuation, matched to today’s technology and innovations!
With 256M synapses and a ~2 kHz repetition rate, the equivalent operation rate of the chip is 1 Top/s. Current GPUs have performance 3x higher, at 3 Tflop/s. This assumes 100% utilization of synapses, binary weights and competitive accuracy, all assumptions very generous to TrueNorth.
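The commenter’s back-of-envelope figure can be reproduced if each synaptic event is counted as a multiply-accumulate, i.e. two operations — an assumption made here to match the ~1 Top/s claim, since 256M × 2 kHz alone gives only ~0.5 Top/s:

```python
# Back-of-envelope estimate of TrueNorth's equivalent operation rate,
# following the comment above. Counting each synaptic event as a
# multiply-accumulate (2 ops) is an assumption made here to reach
# the quoted ~1 Top/s.
synapses = 256e6    # synapses per TrueNorth chip
rate_hz = 2e3       # ~2 kHz repetition rate
ops_per_event = 2   # multiply + accumulate (assumption)

tops = synapses * rate_hz * ops_per_event / 1e12
print(f"Equivalent rate: {tops:.2f} Top/s")  # ~1.02 Top/s

gpu_tflops = 3.0    # the comment's GPU figure
print(f"GPU advantage: {gpu_tflops / tops:.1f}x")  # ~2.9x
```

The comparison is deliberately rough: as the comment notes, it also assumes full synapse utilization, binary weights, and competitive accuracy.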