
Bringing the Power of Deep Learning to More Data Scientists


New AI technologies like machine learning and deep learning are taking an ever larger place in the shifting enterprise landscape. Deep learning in particular is being adopted by a growing number of enterprises to expand their insights and better serve their clients. Thanks to more powerful systems and graphics processing units (GPUs), we are able to train the complex AI models that enable these insights.

IBM has long been a leader in analytics, and over the last year or two we introduced two key new products, Data Science Experience and IBM PowerAI, designed to help enterprises more easily start using advanced AI technologies.

Today we are announcing that we are bringing these two key software tools for data scientists together: we are integrating the PowerAI deep learning enterprise software distribution into the Data Science Experience. With this integration, data scientists will have the tools to develop AI models with leading open-source deep learning frameworks, such as TensorFlow, to unlock new analytical insights.

The Data Science Experience (DSX) is a collaborative workspace designed for data scientists to develop machine learning models and manage their data and trained models. PowerAI adds to it a rich set of deep learning libraries, algorithms and capabilities from popular open-source frameworks. These deep learning frameworks sort through all types of data, whether sound, text or visual, to create and improve learning models on the Data Science Experience.
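To make that concrete, here is a minimal sketch of the kind of notebook cell a data scientist might run against the TensorFlow build that PowerAI provides: a small Keras classifier trained on a toy dataset. The dataset, model size and training settings are illustrative assumptions, not part of today's announcement.

import tensorflow as tf

# Toy dataset (MNIST ships with Keras) standing in for enterprise data.
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

# A deliberately small feed-forward network; real projects would go deeper.
model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(10, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

model.fit(x_train, y_train, epochs=5, validation_data=(x_test, y_test))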

As an example, banks today can leverage deep learning to make more informed predictions about which clients might default on credit, to better detect credit card fraud, or to offer clients other products they are likely to value.

In manufacturing, deep learning models can be trained to identify potential failures before they happen by analyzing historical data from the operation of equipment. These learning models continuously evolve and get smarter over time, becoming more sophisticated at identifying anomalies.
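As a hedged sketch of that idea, the snippet below trains a small autoencoder on readings from normally functioning equipment and flags records with unusually high reconstruction error as potential anomalies. The sensor data is synthetic, and the model shape and threshold are assumptions chosen purely for illustration.

import numpy as np
import tensorflow as tf

# Synthetic stand-in for historical sensor data (e.g. temperature, vibration).
rng = np.random.default_rng(0)
healthy = rng.normal(0.0, 1.0, size=(5000, 8)).astype("float32")

# Small autoencoder: compress 8 sensor channels to 3 latent features and back.
autoencoder = tf.keras.Sequential([
    tf.keras.layers.Dense(3, activation="relu", input_shape=(8,)),
    tf.keras.layers.Dense(8),
])
autoencoder.compile(optimizer="adam", loss="mse")
autoencoder.fit(healthy, healthy, epochs=10, batch_size=64, verbose=0)

# Score new readings: large reconstruction error suggests abnormal behavior.
new_readings = rng.normal(0.0, 1.0, size=(100, 8)).astype("float32")
errors = np.mean((autoencoder.predict(new_readings) - new_readings) ** 2, axis=1)
threshold = np.percentile(errors, 95)  # illustrative cut-off
anomalies = np.where(errors > threshold)[0]
print("flagged readings:", anomalies)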

The growth of deep learning and machine learning is fueled, at least in part, by a rapid rise in computing capability via accelerators like NVIDIA Tesla GPUs. We optimize deep learning frameworks like TensorFlow in PowerAI for IBM Power Systems. For example, we take advantage of the industry’s only CPU-to-GPU implementation of the NVIDIA NVLink high-speed interconnect, which can act as a communications superhighway of sorts.
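A quick check a data scientist might run before training, assuming a recent TensorFlow environment, is to confirm that the framework actually sees the system's GPUs:

import tensorflow as tf

# List the GPU devices visible to this TensorFlow build.
gpus = tf.config.list_physical_devices("GPU")
print(f"TensorFlow sees {len(gpus)} GPU(s)")
for gpu in gpus:
    print(" ", gpu.name)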

We recently introduced the Distributed Deep Learning library in PowerAI from IBM Research that reduces[1] deep learning training times from weeks to hours. Enabling such capabilities through the Data Science Experience brings accelerated deep learning to DSX’s collaborative workspace environment.
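PowerAI DDL has its own launcher and framework integrations, which we will not reproduce here; as a generic illustration of the data-parallel idea behind distributed training, the sketch below uses TensorFlow's built-in MirroredStrategy to replicate a model across the local GPUs. It shows the concept only, not the PowerAI DDL API, and the model and data are placeholders.

import numpy as np
import tensorflow as tf

# One replica per visible GPU; gradients are aggregated across replicas.
strategy = tf.distribute.MirroredStrategy()
print("replicas in sync:", strategy.num_replicas_in_sync)

with strategy.scope():
    # Variables created inside the scope are mirrored across devices.
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(256, activation="relu", input_shape=(32,)),
        tf.keras.layers.Dense(1),
    ])
    model.compile(optimizer="adam", loss="mse")

# Dummy data; the global batch is split across the replicas automatically.
x = np.random.rand(1024, 32).astype("float32")
y = np.random.rand(1024, 1).astype("float32")
model.fit(x, y, epochs=3, batch_size=256)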

Today’s news builds on IBM’s leadership and commitment to bringing better machine and deep learning tools to the best and brightest analytical minds, and these tools will improve rapidly over time. Join us in this journey to better insights using these advanced AI techniques.

__________________________________________________


[1] A PowerAI DDL-enabled version of Torch completed 90 epochs of training on ResNet-50 for 1K classes in 50 minutes using 64 IBM Power8 S822LC servers (256 GPUs). PowerAI DDL, IBM Research.

Vice President, HPC, AI and Analytics, IBM Systems

Dinesh Nirmal

Vice President, Analytics Development, IBM


Young Lee

Excellent news, and a good win for data scientist users!


kantam srikanth

It sounds really awesome, and I’m proud to be an IBMer.


Calzia Barry

Science has changed the world completely and nobody can deny it…..

