Building a computational sciences infrastructure


Editor’s note: This article is by IBM Research – Austin Director Kevin Nowka.
The Texas A&M University System and IBM recently agreed to create one of the largest computational sciences infrastructures in the world, dedicated to advances in agriculture, geosciences and engineering. This new High Performance Computing (HPC) system will link 11 universities and seven state agencies, supporting projects as disparate as wind turbulence simulation and text analytics of medical documents.
In my role as director of IBM’s Austin lab, I will lead the team that develops these opportunities with Texas A&M and takes on these grand challenges.
The university system’s research computing power has always been distributed across its 11 campuses and state agencies. With three domain-optimized systems, we want to open access to all of its users – starting with engineering, expanding to geoscience and agriculture, and eventually growing to serve all of its HPC and analytics users.
The systems behind the research
One arm of the infrastructure is a 2,000-node Blue Gene/Q, whose modeling and simulation capabilities will be put to use in the life sciences, computational biology, and the geosciences. We’re already working with Texas A&M Research to optimize their code for climate modeling, computational materials science, and even wind turbulence analysis.
The POWER7-based portion of the infrastructure, on the other hand, is focused on big data analytics research. It will provide Texas A&M with cognitive computing capability equal to what IBM Watson used to play Jeopardy!, in this case using IBM BigInsights for text analytics and data mining. That text analytics capability is already being used to tease out promising technical literature on cancer treatment identification.
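To make that kind of literature triage concrete, here is a minimal, generic sketch of keyword-driven filtering of article abstracts. It is not BigInsights code or the team’s actual method; the sample abstracts and treatment terms are purely hypothetical placeholders.

```python
# Minimal, generic sketch of keyword-driven literature triage (illustration only;
# the actual work described above uses IBM BigInsights, and these abstracts and
# terms are hypothetical).

TREATMENT_TERMS = [
    "kinase inhibitor",
    "monoclonal antibody",
    "immunotherapy",
    "targeted therapy",
    "clinical trial",
]

def score(abstract: str) -> int:
    """Count how many treatment-related terms appear in an abstract."""
    text = abstract.lower()
    return sum(term in text for term in TREATMENT_TERMS)

abstracts = {
    "Paper A": "A phase II clinical trial of a novel kinase inhibitor ...",
    "Paper B": "A retrospective survey of regional screening practices ...",
    "Paper C": "Combining immunotherapy with targeted therapy in solid tumors ...",
}

# Surface the most treatment-relevant papers first.
for title, text in sorted(abstracts.items(), key=lambda kv: score(kv[1]), reverse=True):
    print(score(text), title)
```

Real systems rank documents with far richer linguistic features than simple term counts, but the basic workflow – extract, score, and surface promising candidates for experts to review – is the same.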
The third system – our NeXtScale System x – is the basis for much of Texas A&M’s overall HPC update, adding capabilities such as shared compute services across its campuses and state agencies.
Other projects underway include SmartGulf, a geoscience data management effort for the Gulf of Mexico, and atomic-level modeling to develop new materials for energy, aerospace, structural, and defense applications.

I’m looking forward to writing more about the results of this unique – and massive – infrastructure, its projects, and our collaboration in the coming years.

For more: read a Q&A with Jon Mogford, Vice Chancellor for Research, Texas A&M University System.
