A new supercomputing-powered weather model may ready us for Exascale

In the U.S. alone, extreme weather caused some 297 deaths and $53.5 billion in economic damage in 2016. Globally, natural disasters caused $175 billion in damage.

It’s essential for governments, businesses and people to receive advance warning of wild weather in order to minimize its impact, yet today the information we get is limited. Current global weather forecast models can predict weather patterns down to regional-scale events, such as snowstorms and hurricanes, but lack the ability to predict more local phenomena such as thunderstorms.

IBM Research, The Weather Company and NCAR aim to use supercomputing to enable short-term thunderstorm forecasts as well as more accurate long-range forecasts days, weeks and months in advance. (Credit: Shutterstock)

This is significant considering more than 1,000 thunderstorms occur on the Earth’s surface at any moment. Until now, the industry has lacked the necessary computing power to keep an eye on the whole globe and predict when a storm will turn dangerous.

That’s why IBM Research, The Weather Company, the University Corporation for Atmospheric Research (UCAR) and the National Center for Atmospheric Research (NCAR) are collaborating to develop the first rapidly-updating, storm-scale model that can help predict weather events such as thunderstorms at local scales.  The model also aims to be the first to cover the entire globe, bringing high-resolution forecasts to underserved areas so they too can benefit from advance knowledge of potentially damaging weather patterns.

The key to the model’s ground-breaking abilities: next-generation IBM supercomputing technology and the use of graphics processing units, or GPUs — processors with thousands of small, efficient cores designed to handle many tasks simultaneously. Today’s weather forecasting models typically run on CPUs. Yet GPUs have shown promise in scaling models to both small and large areas as well as delivering forecasts for local, regional and global weather. In our collaboration we plan to develop fully GPU-enabled weather code for use in actual forecasting.
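The kind of data parallelism GPUs exploit can be sketched in a few lines. This is a toy illustration, not the actual model code: every cell of a weather grid is updated by the same stencil arithmetic from its neighbors, and because each cell's update is independent, a GPU can assign one thread per cell and run thousands of them at once.

```python
import numpy as np

# Toy 2-D field standing in for one variable of a weather grid.
# Real models evolve dozens of coupled fields on 3-D global grids.
ny, nx = 256, 512
field = np.random.rand(ny, nx)

def diffuse(t, alpha=0.1):
    """One explicit diffusion step: every interior cell is updated from its
    four neighbors with identical arithmetic. Each update is independent of
    the others, which is exactly the pattern GPU cores execute in parallel."""
    out = t.copy()
    out[1:-1, 1:-1] = t[1:-1, 1:-1] + alpha * (
        t[:-2, 1:-1] + t[2:, 1:-1] + t[1:-1, :-2] + t[1:-1, 2:]
        - 4.0 * t[1:-1, 1:-1]
    )
    return out

field = diffuse(field)
print(field.shape)  # (256, 512)
```

Expressed this way, the same operation maps naturally onto GPU arrays; the per-cell independence, not the specific library, is what makes the workload GPU-friendly.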

IBM Research will help NCAR adapt this weather model to run on next-generation IBM supercomputers with GPUs

The challenge in moving to GPUs is that all of the weather code must be modified, rewritten and ported to the new architecture. Through IBM Research’s work with NVIDIA, Lawrence Livermore National Lab and Oak Ridge National Lab to build the GPU-accelerated “Sierra” and “Summit” supercomputers, we have developed new technology to more tightly couple CPUs and GPUs to better support the very complex computational tasks that a weather code must execute. We’ve also learned a great deal about the process of porting and optimizing code for GPUs that we will share with NCAR and UCAR.
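In practice, porting often means restructuring scalar loops, which GPUs execute poorly, into bulk array operations that map onto thousands of threads. A hypothetical before/after sketch of that restructuring (not NCAR's actual code; a simple 1-D upwind advection step stands in for a real weather kernel):

```python
import numpy as np

def advect_loop(q, u, dt, dx):
    """CPU-style scalar loop: cells updated one at a time."""
    out = q.copy()
    for i in range(1, q.size):
        out[i] = q[i] - u * dt / dx * (q[i] - q[i - 1])
    return out

def advect_kernel(q, u, dt, dx):
    """GPU-friendly form: the whole grid updated in one array operation.
    This is the shape of code that ports well to GPUs via CUDA, OpenACC,
    or array libraries such as CuPy."""
    out = q.copy()
    out[1:] = q[1:] - u * dt / dx * (q[1:] - q[:-1])
    return out

q = np.sin(np.linspace(0, 2 * np.pi, 64))
same = np.allclose(advect_loop(q, 1.0, 0.1, 0.5),
                   advect_kernel(q, 1.0, 0.1, 0.5))
print(same)  # True
```

The two forms compute identical results; the work of a port is proving that equivalence kernel by kernel across hundreds of thousands of lines, which is where shared porting experience pays off.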

Sierra and Summit will run on IBM’s next generation POWER9-based systems and are scheduled to be delivered by the end of the year. In our collaboration, NCAR will tap into a 100+ node prototype of these systems located in Poughkeepsie, N.Y. to begin the process of tuning their code to run on GPUs.

Because IBM houses both systems design and research under one roof, it can give government and commercial clients an advantage as they look to tap next-generation computing architectures. We can work with clients to anticipate their needs as far as six years out and influence the design of processors and systems to meet them.

And, within IBM Research, we are able to evaluate exciting new areas of computing such as neuromorphic — or brain-inspired — computing and quantum computing. We envision that both types may one day work in concert with our classical supercomputers to address problems that are impractical to solve on the latter alone.

Just today, IBM and the U.S. Air Force Research Laboratory (AFRL) announced they are collaborating on a first-of-a-kind brain-inspired supercomputing system powered by a 64-chip array of the IBM TrueNorth Neurosynaptic System.

The system’s advanced pattern recognition and sensory processing power will be the equivalent of 64 million neurons and 16 billion synapses, while the processor component will consume the energy equivalent of a dim light bulb – a mere 10 watts.

The IBM TrueNorth Neurosynaptic System can efficiently convert data (such as images, video, audio and text) from multiple, distributed sensors into symbols in real time. As a result, IBM researchers believe the brain-inspired, neural network design of TrueNorth will be more efficient for pattern recognition and integrated sensory processing than systems powered by conventional chips.

This type of collaboration will be key as we continue to push supercomputers to tackle ever more challenging problems and chase the goal of building an Exascale supercomputer, which would be capable of computing 1 million trillion floating-point operations per second.
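A quick back-of-envelope calculation puts that number in perspective. One exaflop per second is 10^18 floating-point operations per second; the laptop figure below is an assumed round number (~100 gigaflops), used only for scale:

```python
EXA = 10**18                # 1 exaflop/s: a million trillion operations per second
laptop_flops = 100 * 10**9  # assumed ballpark for a modern laptop (~100 GFLOPS)

# Work an exascale machine finishes in one second would occupy
# a 100-GFLOPS laptop for this many seconds:
seconds = EXA / laptop_flops
print(seconds)                      # 10000000.0 seconds
print(seconds / (3600 * 24 * 365))  # about 0.32 years, i.e. roughly 116 days
```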

The United States Department of Energy’s Exascale Computing Project recently announced that IBM is one of six companies being awarded a combined total of $258 million to research building the nation’s first Exascale supercomputer. Our work with NCAR and UCAR will help us get there by exposing new methodologies and capabilities to improve the performance of complex modeling and simulation codes. And as we support NCAR in their work to port their weather code to GPUs, we are also readying the code to eventually run on Exascale systems.

So as you can see, while we’re proud of our next-generation supercomputer and the breakthroughs it will fuel in weather forecasting and many other domains, we’re already looking to the future to see what barriers we can break next.


Jim Sexton

IBM Fellow and director, Data Centric Systems, IBM Research