I first learned about IBM’s TrueNorth technology back in 2014, but it wasn’t until colleagues and I participated in a TrueNorth application development bootcamp last year that I understood its true potential. I am optimistic that this new neurosynaptic computing architecture, which is modeled on the way the brain works, could play an important role in the future of high performance computing as applied to cognitive computing, machine learning, and artificial intelligence.
The TrueNorth system
At the bootcamp, my team mixed with scientists from IBM, other companies, universities, and federal research organizations. We all got our hands dirty building applications to run on TrueNorth. I’m an engineer at heart, so it was exciting to be able to write code, compile it, and run it on a radically new computing system—all in just a couple of weeks.
Now my lab mates and I are really stoked. We bought a complete TrueNorth system, combining hardware, software and development tools, to experiment with. It’s a first-of-a-kind computer: 16 TrueNorth processors integrated together on a single circuit board. The system possesses the equivalent of 16 million neurons and 4 billion synapses, but consumes the energy equivalent of a tablet computer—a mere 2.5 watts of power.
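For readers unfamiliar with spiking neurons, the behavior of the “neurons” and “synapses” counted above can be illustrated with the textbook leaky integrate-and-fire model, the basic abstraction behind neurosynaptic hardware. The sketch below is conceptual, with illustrative parameter values—it is not IBM’s actual TrueNorth programming model:

```python
# A minimal leaky integrate-and-fire (LIF) neuron: a conceptual
# sketch of how spiking neurons behave, NOT IBM's TrueNorth API.
# The leak and threshold values here are illustrative only.

def lif_run(input_current, leak=0.9, threshold=1.0):
    """Simulate one LIF neuron over discrete time steps.

    input_current: list of per-step inputs (weighted incoming spikes).
    Returns a list of 0/1 spike outputs, one per step.
    """
    potential = 0.0
    spikes = []
    for current in input_current:
        potential = potential * leak + current  # integrate with leak
        if potential >= threshold:              # fire and reset
            spikes.append(1)
            potential = 0.0
        else:
            spikes.append(0)
    return spikes

# A steady sub-threshold input makes the neuron fire periodically:
print(lif_run([0.4] * 10))  # [0, 0, 1, 0, 0, 1, 0, 0, 1, 0]
```

Each of a chip’s millions of neurons integrates weighted spikes from its synapses this way, communicating only when it fires—which is part of why the architecture draws so little power.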
TrueNorth was originally developed with funding from the Defense Advanced Research Projects Agency, so, in a sense, we’re taking a handoff from our sister agency.
The TrueNorth system has a number of important qualities, but none is more significant than its power efficiency. At the national labs, we’re running into the limits of classical physics in our high-performance computing systems. Today’s mainstream computers are built on the foundation of the von Neumann computing architecture and the conventional, late-20th-century integrated circuit. Yet, because of the laws of physics, the computer industry is no longer able to produce the performance gains it delivered in past years. At the same time, the amount of energy consumed by today’s computer systems will be unaffordable and unsustainable in the coming years when we expect to run exascale computers—capable of executing a billion billion calculations per second.
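To put that power figure in perspective, here is a quick back-of-the-envelope calculation using only the numbers quoted earlier for our 16-processor system (16 million neurons, 4 billion synapses, 2.5 watts):

```python
# Back-of-the-envelope efficiency figures, using only the numbers
# quoted above for the 16-processor TrueNorth system.
neurons = 16_000_000      # 16 million neurons
synapses = 4_000_000_000  # 4 billion synapses
power_watts = 2.5         # roughly a tablet's power draw

print(f"{neurons / power_watts:,.0f} neurons per watt")    # 6,400,000
print(f"{synapses / power_watts:,.0f} synapses per watt")  # 1,600,000,000
```

Over a billion synapses per watt is the kind of ratio that makes this architecture so interesting as conventional systems hit their power walls.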
It’s difficult to produce a detailed comparison of TrueNorth’s performance to that of today’s mainstream computers, but I expect that systems based on this architecture will be an order of magnitude more efficient than conventional computers for pattern recognition tasks.
It’s also possible that the architecture will produce superior results when performing certain computing tasks, including pattern recognition and deep machine learning. That’s what we aim to find out in our experiments.
In one experiment, an image analysis application, we’ll test the ability of the TrueNorth system to identify, count and keep track of vehicles captured in overhead aerial photographs. It’s a pattern recognition challenge. In a second experiment, we’ll test the system’s ability to keep track of objects that are described in a sequence of declarative statements involving a person’s activities. You can picture how these capabilities might be used to safeguard our national security.
I foresee neurosynaptic chips playing two key roles in the high-performance computing world of the future. First, because the chips are designed to be integrated with one another, and, essentially, operate like one large processor, computer scientists will be able to build massive computers by adding more processors—enabling them to take on very large computing tasks. Second, I believe that many of the high-performance systems of the future will possess a variety of computing capabilities so they can take on complex tasks. A neurosynaptic system focusing on pattern recognition and deep learning could be a component of a larger system on the road to exascale supercomputers.
Working with the TrueNorth technology is especially intriguing to me because my dad, David Van Essen, is a professor of neuroscience at Washington University in St. Louis. I have watched from the sidelines as he and his academic brethren made tremendous advances in understanding how the human brain works. Now the ball is in my court. I have the enviable task of helping to replicate in my own field the awesome success of the brain. That’s the kind of challenge that makes me proud to be a scientist.