Painting a Clearer Picture of the Heart with Machine Learning


Coronary artery disease (CAD) is a condition in which plaque builds up on the walls of the coronary arteries, causing them to narrow. Eventually, this can lead to a heart attack or death. CAD is now the single largest health problem in the world, with over one million people in the US each year undergoing cardiac catheterization, a procedure in which a stent is placed in the artery to prevent blockage.

To help improve the efficiency of diagnosis, clinicians are exploring new ways to measure artery blockage using virtual Fractional Flow Reserve (vFFR). vFFR combines X-ray angiograms with Computational Fluid Dynamics (CFD), a modelling method that uses mathematics and data to understand the movement of fluids and simulate blood flow in coronary arteries. This simulation completely replaces the pressure wire catheter required for traditional FFR, meaning patients no longer need to undergo hyperemic agent injections.
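At its core, FFR (whether measured with a pressure wire or estimated virtually from a CFD simulation) is the ratio of mean blood pressure distal to a stenosis to mean aortic pressure. A minimal sketch of that final calculation, using hypothetical pressure values standing in for simulation output (not the team's actual pipeline):

```python
def vffr(p_distal_mmhg: float, p_aortic_mmhg: float) -> float:
    """Virtual FFR: mean pressure distal to the stenosis divided by
    mean aortic pressure, both in mmHg (values here are illustrative)."""
    return p_distal_mmhg / p_aortic_mmhg

# An FFR below roughly 0.80 is commonly used as the clinical threshold
# for a hemodynamically significant stenosis.
ratio = vffr(72.0, 95.0)
print(f"vFFR = {ratio:.3f}")  # prints "vFFR = 0.758"
```

The hard part, of course, is not this ratio but producing the distal pressure field itself, which is what the CFD simulation computes from the angiogram-derived artery geometry.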

Current applications of vFFR are limited, however, because a CFD simulation can take anywhere from several hours to days to complete. To use vFFR effectively for patients, CFD algorithms need to handle a broader range of potential arterial blockages and compute a complete simulation in a matter of minutes, without compromising diagnostic accuracy.

In research presented at the Computing in Cardiology Conference in September 2018, our team outlined a new approach to improving vFFR simulations using high-performance computing, mathematics and data.

These simulations need to run on systems designed for machine learning and deep learning acceleration. To meet that demand, IBM researchers in Australia are using POWER9 systems, with Nvidia Tesla V100 Graphics Processing Units (GPUs), to perform hemodynamic simulations for vFFR-based diagnosis within one to two minutes. To our knowledge, this is the first application of its kind to be completed in near real time.

The speed of these model simulations, supported by IBM’s partnership with Nvidia, could translate into considerable savings in manual labor, infrastructure and power for clinicians and hospitals. It also means clinicians could analyze the pressure loss caused by stenosis in CAD patients more quickly, helping ease the mental burden on patients waiting for test results.
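To see why a stenosis causes such a sharp pressure loss, it helps to look at a far simpler model than the full CFD simulation described above: Poiseuille's law for laminar flow in a cylindrical vessel, where the pressure drop scales with the inverse fourth power of the radius. A sketch with illustrative parameter values:

```python
import math

def poiseuille_drop(mu: float, length: float, flow: float, radius: float) -> float:
    """Pressure drop (Pa) across a cylindrical vessel segment under
    steady laminar flow, per Poiseuille's law: dP = 8*mu*L*Q / (pi*r^4)."""
    return 8.0 * mu * length * flow / (math.pi * radius ** 4)

mu = 3.5e-3      # blood viscosity, Pa*s (typical value)
length = 0.01    # segment length, m (1 cm)
flow = 1.0e-6    # volumetric flow, m^3/s (~60 mL/min)

healthy = poiseuille_drop(mu, length, flow, 1.5e-3)    # 1.5 mm radius
stenosed = poiseuille_drop(mu, length, flow, 0.75e-3)  # radius halved
# Halving the radius multiplies the pressure loss by 2**4 = 16.
print(stenosed / healthy)
```

This 1/r⁴ sensitivity is exactly why small changes in lumen geometry matter so much clinically, and why the full 3D hemodynamic simulation, which captures effects Poiseuille's law ignores, is worth the computational cost.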

This research is the latest step in our ongoing work to build a more accurate and complete picture of the inner workings of the heart with biophysical models and AI. Our cardiac research team has several ongoing initiatives to enhance heart monitoring in non-invasive ways. Recently, we published research on new ways to build and parameterize more accurate models of cardiac biomechanics, allowing us to better explore what’s happening within the heart at the anatomical and cellular level.

Within the past year, we’ve also published a study that points to the potential of combining biophysical models and machine learning to help predict whether a drug might lead to adverse side effects within the heart, such as cardiac arrhythmia. Ideally, one day all of these modelling techniques could be applied together to give clinicians a clear, minimally invasive assessment of a patient’s cardiac state and help better determine treatment options.
