
Why data science experimentation drags out production optimization programs

Production optimization programs aim to increase production throughput and eliminate waste by leveraging data insights. But what happens when data identification – and creating analytical models relevant to use cases – becomes a job in its own right? Can out-of-the-box templates both support production optimization and reduce the need for data science experimentation? This blog explores the value of customizable, ready-to-go use case templates for manufacturing plants.

The challenges facing production optimization programs

Production optimization programs generally have two key objectives: to increase production throughput, and to eliminate waste. To achieve these objectives, we must identify, predict and pinpoint the production losses that contribute to waste and lower throughput. Here, machine learning and Artificial Intelligence (AI) have an important role to play – predicting production losses and recommending optimized action to mitigate them.

However, the task of identifying the data and creating analytical models relevant to use cases makes for an involved job – and an approach that is hard to scale.

One solution is to consider using out-of-the-box use case templates. Because many use cases in manufacturing plants are recurring, templates can be an efficient production optimization tool. Let’s look at two in particular: failure prediction and anomaly detection.

Use case #1: failure prediction

One frequently occurring use case is failure prediction, whereby historic data about known failures and a technique known as ‘auto-classification’ are used to predict possible faults in machines, quality or processes.

Creating a template for a failure prediction use case requires:

(a) An auto-classification analytics model pipeline – to flexibly choose the best-fit algorithms based on the input data

(b) A notebook template to configure the pipeline for specific use cases

(c) UX widgets to show the results.

With these three components, a process engineer can apply the failure prediction template to specific machines and processes, and realize the use case in their particular plant.
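To make the idea of an auto-classification pipeline concrete, here is a minimal sketch of the selection step: several candidate classifiers are fitted and scored on held-out data, and the best fit is chosen automatically. The candidate models, the toy sensor data and the function names are illustrative assumptions for this sketch, not part of IBM Production Optimization.

```python
# Sketch: pick the best-fit classifier for the input data automatically.
# Candidate models and sample data are illustrative, not the real product.
from collections import Counter

def majority_class(train_X, train_y):
    """Baseline: always predict the most common label seen in training."""
    label = Counter(train_y).most_common(1)[0][0]
    return lambda x: label

def one_nearest_neighbour(train_X, train_y):
    """Predict the label of the closest training point (squared Euclidean)."""
    def predict(x):
        dist = lambda a: sum((ai - xi) ** 2 for ai, xi in zip(a, x))
        best = min(range(len(train_X)), key=lambda i: dist(train_X[i]))
        return train_y[best]
    return predict

def accuracy(model, X, y):
    return sum(model(x) == label for x, label in zip(X, y)) / len(y)

def auto_classify(train_X, train_y, val_X, val_y):
    """Fit every candidate and return the (name, model) that scores best."""
    candidates = {
        "majority": majority_class,
        "1-nn": one_nearest_neighbour,
    }
    fitted = {name: fit(train_X, train_y) for name, fit in candidates.items()}
    best = max(fitted, key=lambda name: accuracy(fitted[name], val_X, val_y))
    return best, fitted[best]

# Toy "failure" data: (vibration, temperature) readings labelled ok/fail.
train_X = [(0.1, 40), (0.2, 42), (0.9, 80), (1.0, 85)]
train_y = ["ok", "ok", "fail", "fail"]
val_X = [(0.15, 41), (0.95, 82)]
val_y = ["ok", "fail"]

name, model = auto_classify(train_X, train_y, val_X, val_y)
```

A production pipeline would evaluate a much larger algorithm catalogue with cross-validation and feature preprocessing, but the principle is the same: the data, not the engineer, determines which algorithm the template runs.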

 

Figure 1. Production Optimization delivers out-of-the-box Industry 4.0 use cases to business users on IBM technology

Use case #2: anomaly detection

The other common use case on the plant floor is anomaly detection. Anomaly detection generates an early warning when one or more dependent variables are trending towards anomalous conditions that will result in a failure. As with failure prediction, this use case can also be ‘templated’ with the following tools:

(a) An anomaly detection analytical model pipeline able to choose the best-fit algorithms from the available input data

(b) A notebook to configure the pipeline for a specific use case

(c) UX widgets to show the results
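The early-warning behaviour described above can be sketched in a few lines: each new reading is compared against a rolling baseline of recent readings, and a large z-score flags the variable as trending anomalous. The window size, threshold, and bearing-temperature data here are illustrative assumptions, not values from the IBM offering.

```python
# Sketch: flag readings that deviate strongly from a rolling baseline.
# Window, threshold and sample data are illustrative assumptions.
from statistics import mean, stdev

def detect_anomalies(readings, window=5, threshold=3.0):
    """Return the indices of readings whose z-score against the rolling
    baseline of the previous `window` readings exceeds `threshold`."""
    warnings = []
    for i in range(window, len(readings)):
        baseline = readings[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma == 0:  # flat baseline: no meaningful z-score
            continue
        if abs(readings[i] - mu) / sigma > threshold:
            warnings.append(i)
    return warnings

# Bearing temperature drifting upward ahead of a failure condition.
temps = [70.0, 70.2, 69.9, 70.1, 70.0, 70.1, 70.2, 74.5, 78.0, 82.3]
alerts = detect_anomalies(temps)
```

In this toy series the alerts fire on the first readings of the upward drift, before the temperature peaks, which is exactly the early-warning window a maintenance team needs. A templated pipeline would swap in more robust detectors (multivariate, seasonal-aware) chosen from the input data, with the notebook exposing parameters like the window and threshold.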

Customizable templates for further use cases

In response to the need for scalable, tried-and-tested production optimization tools, IBM has templated a set of standard plant floor use cases. The templates, part of the IBM Production Optimization Industry Solution offering, can be configured to apply each use case to a plant’s individual processes and assets.

In this way, the templates offer a swift, robust remedy to common issues, while remaining flexible enough to adapt to a plant’s particular infrastructure. There will always be exceptions, but we believe these templates can be used with minimal customization effort in about 70% of cases.

This approach reduces the time and effort spent on data science ‘experimentation’ and accelerates time to value. It also puts process engineers in the driver’s seat, allowing them to interactively configure and tune the use cases. Finally, the out-of-the-box approach allows the use cases to scale to tens or hundreds of processes and assets on the plant floor.

Discover more about IBM Production Optimization

Take a look at our blog to discover how IBM Production Optimization can drive down equipment and process-related losses, and explore this offering in further depth on our website.

Don’t forget to look out for the next installment of our ‘Manufacturing Mondays’ series!

 

Binny Samuel is the Offering Leader and the brain behind the IBM Plant Performance Analytics offering.

By Binny Samuel on October 22, 2018
