Power Systems

Deep learning & HPC applications get even faster

Data is coming at us from every direction, and it’s up to data scientists and IT leaders to make sense of it by cleaning, processing and extracting actionable insights. It can be challenging enough to prepare your data pipeline, and once you complete this task, you don’t want hardware components to limit your ability to implement AI workloads.

Accelerated deep learning, machine learning and AI algorithms benefit when both CPU and GPU memory can be brought to bear. As a rule, the more complex the algorithm and the larger the AI model, the more memory you need. Without a sufficient memory footprint, you may have to limit your workloads to small data sets and low-resolution image classification. To drive an effective data science strategy, you need to tap into all of your data, including your largest data sets, and both train on and analyze full-resolution images and videos.

Increased GPU memory positions data scientists and AI researchers to create larger models, enabling them to work with and extract actionable insights from larger data sets. In short, more memory enables larger models, which in turn lead to better insights. For example, instead of analyzing compressed, pixelated images, one can work with high-resolution images in vivid detail and color. Instead of analyzing postage-stamp-sized video streams, one can apply image classification to 4K video.
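To see why resolution drives memory requirements, a back-of-envelope sketch helps. The batch size, channel count and fp32 precision below are illustrative assumptions, not figures from the announcement:

```python
# Illustrative arithmetic: GPU memory needed just to hold one batch of
# images as float32 tensors, at thumbnail vs. 4K resolution.
# Batch size, channels and precision are assumptions for illustration.

def batch_bytes(height, width, channels=3, batch_size=32, bytes_per_value=4):
    """Bytes to hold one batch of images as float32 (4 bytes per value)."""
    return height * width * channels * batch_size * bytes_per_value

thumbnail = batch_bytes(224, 224)    # typical ImageNet-style input
ultra_hd = batch_bytes(2160, 3840)   # one 4K frame per sample

print(f"224x224 batch: {thumbnail / 2**20:.1f} MiB")  # ~18.4 MiB
print(f"4K batch:      {ultra_hd / 2**30:.2f} GiB")   # ~2.97 GiB
```

A single fp32 batch of 4K frames already approaches 3 GiB before counting model weights, activations and gradients, which is why a 32GB GPU opens up workloads that a 16GB card cannot hold.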

Bigger models can also equate to larger data sets, which may yield unexpected discoveries. One can find interesting correlations in data by applying deep learning to data sets that might seem otherwise unrelated. Some of the most interesting prospects for AI revolve around getting answers to questions we didn’t know to ask – and that points to using bigger models with more data.

Today, we are announcing that IBM Power Systems is adding the NVIDIA Tesla V100 32GB GPU to the POWER9-based IBM Power System AC922 server. The larger GPU memory allows bigger data sets to fit in the GPU for acceleration. With the direct, high-speed NVIDIA NVLink connection between the IBM POWER9 CPU and the NVIDIA Tesla V100 GPU, we can deliver 5.6 times the data throughput of the PCIe Gen 3 interface found in comparable x86-based servers[1]. We demonstrated this recently with both deep learning in PowerAI and machine learning in Snap ML, using our large model support feature.

Visit the IBM Power Systems website to learn more about the best server for enterprise AI, or click here to learn more about high-performance computing.

[1] The 5.6x I/O bandwidth claim is based on the CUDA H2D Bandwidth Test conducted on a Xeon E5-2640 v4 + P100 versus POWER9 + V100 (12 GB/s vs. 68 GB/s rated).
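The footnote's rated figures imply the headline ratio directly; a quick check using the values quoted in [1]:

```python
# Sanity-check the footnote arithmetic using the rated figures from [1].
pcie_gen3 = 12  # GB/s, CUDA H2D bandwidth on the x86 + P100 configuration
nvlink = 68     # GB/s, CUDA H2D bandwidth on the POWER9 + V100 configuration

speedup = nvlink / pcie_gen3
print(f"{speedup:.2f}x")  # ~5.67x, quoted conservatively as 5.6x
```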
