Manufacturing

What does a manufacturing plant of the future look like? (Part 1)

Technology is moving at a rapid pace and your options are to transform with it or fall behind. This digital transformation looks different in every industry but the underlying goals are the same: reduce costs, improve efficiency, and increase revenues. While the future is unpredictable, the performance of your plant doesn’t need to be.

As a plant manager, your primary goals are to stick to production schedules while keeping costs down. Optimizing production is difficult when there are many variables to consider, such as power consumption, quality, pressure, temperature, and vibration. Results become challenging to predict, which inevitably leads to poor performance. What can be done?

In the first part of this two-part series, we will explore the factors driving manufacturing plants toward the future: the cognitive plant. Cognitive plants can combat the challenges mentioned above and improve overall performance.

What is a cognitive plant?

Cognitive plants capture data from the plant floor and analyze it with cognitive computing and advanced machine learning algorithms. The resulting insights make it possible to maximize throughput, optimize quality and minimize energy costs.

The challenges manufacturing plants face today

Continuous process manufacturing plants – cement, steel, aluminum, pulp and paper – are built with large capital investments and run with significant operating expenses. Most of these processes are energy-intensive, so energy can account for a considerable share of the operating expense.

The performance of a plant changes from day to day based on inconsistencies in raw materials, the condition of the equipment and environmental factors. This variability can be as high as 40% in some plants. Most of the time, inefficiencies are observed too late, after the window of opportunity to make corrections has closed. This can have severe impacts on profitability.

Plants depend on the skill of operators to minimize variability in performance and maximize throughput. However, most manufacturing processes involve complex multivariable dependencies, and it takes considerable experience and knowledge to operate a plant efficiently through uncertain conditions. Often, this precious knowledge is lost when experienced operators leave the plant. Therefore, knowledge retention, training and skill development are critical to a plant’s success.

Using plant-floor data to drive new insights

Data is only as valuable as the insights you can derive from it. A cognitive plant captures and records several process variables. There are three typical forms of variables:

  1. “Control variables” – variables that operators can manipulate to impact the outcome of the process. For example, in a cement mill, the operator manipulates control variables such as separator speed and clinker feed rate to impact the outcome.
  2. “Target variables” – variables that define the outcome of the process. For the same cement mill, energy consumption and cement fineness are targets that define the outcome.
  3. “Observed variables” – variables that are indicative of process health. Clinker temperature is an example of an observed variable in cement mills, though a plant may track tens of such variables.
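The three variable types above can be captured in a simple tagging scheme. The sketch below is purely illustrative – the tag names are hypothetical and not drawn from any real plant historian:

```python
# Hypothetical cement-mill example: tagging process variables by role.
# All tag names are illustrative, not from a real plant data model.
process_variables = {
    "control": ["separator_speed", "clinker_feed_rate"],                # operator-adjustable
    "target": ["energy_consumption_kwh_t", "cement_fineness_blaine"],   # process outcomes
    "observed": ["clinker_temperature", "mill_vibration"],              # process-health indicators
}

def role_of(tag: str) -> str:
    """Return the role ('control', 'target' or 'observed') of a variable tag."""
    for role, tags in process_variables.items():
        if tag in tags:
            return role
    raise KeyError(f"unknown variable tag: {tag}")
```

Keeping roles explicit like this matters downstream: models predict the targets, while only the control variables are candidates for optimization.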

Several other pieces of unstructured data, such as operator actions, quality logs and KPI graphs, can serve as additional inputs to the system. This data is then analyzed to determine next steps.

Machine learning helps predict plant behavior

An operator does not program cognitive systems, but rather trains them with relevant data sets. There are two steps in the machine learning process: first, model training; second, model scoring, where the trained models predict plant behavior at runtime.

There are tools available to help with this training, such as IBM Data Science Experience, which data scientists use to build and train machine learning pipelines. Its highly automated capabilities for feature engineering, feature selection, model selection and training help data scientists choose the right machine learning pipeline for a given set of data. The chosen pipeline is then trained on historical plant data.
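As a rough sketch of what such an automated pipeline search does – this is generic scikit-learn, not IBM Data Science Experience itself, and the data is synthetic – one might write:

```python
# Minimal sketch of automated feature selection plus model selection for
# predicting a plant target variable. Synthetic data stands in for the
# historical plant data a real deployment would ingest.
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.feature_selection import SelectKBest, f_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import GridSearchCV

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 6))                                   # six process variables
y = 2.0 * X[:, 0] + X[:, 1] + rng.normal(scale=0.1, size=500)   # target variable

pipeline = Pipeline([
    ("scale", StandardScaler()),
    ("select", SelectKBest(f_regression)),   # automated feature selection
    ("model", GradientBoostingRegressor()),  # candidate model
])

# Model selection: search over pipeline hyperparameters, as an automated
# training tool would, and keep the best-scoring configuration.
search = GridSearchCV(
    pipeline,
    {"select__k": [2, 4, 6], "model__n_estimators": [50, 100]},
    cv=3,
)
search.fit(X, y)
print(round(search.best_score_, 3))  # cross-validated R^2 of the best pipeline
```

The grid search plays the role of the "choose the right fit" step: it evaluates each candidate pipeline configuration on held-out folds before the winner is trained on the full history.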

We then deploy the trained machine learning pipeline for runtime prediction and optimization. The pipelines first predict the target variables based on all the input data. When there is variance in a target variable, we can trigger the optimization function to calculate the ideal control set points.
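A minimal sketch of that runtime loop follows. The quadratic cost surface standing in for the trained model and the grid search over set points are assumptions for illustration, not a description of any product's actual method:

```python
# Hedged sketch of runtime scoring and optimization: score a (stand-in)
# trained model over candidate control set points and recommend the pair
# that minimizes the predicted target. All numbers are illustrative.
from itertools import product

def predict_energy(separator_speed, feed_rate):
    """Stand-in for a trained model's scoring function (kWh per tonne)."""
    return (30.0
            + 0.05 * (separator_speed - 80.0) ** 2
            + 0.02 * (feed_rate - 200.0) ** 2)

def recommend_set_points(target_max):
    """Grid-search candidate control set points that minimize predicted energy."""
    candidates = product(range(60, 101, 5), range(150, 251, 10))
    best = min(candidates, key=lambda c: predict_energy(*c))
    if predict_energy(*best) > target_max:
        raise RuntimeError("no set points meet the energy target")
    return best

speed, feed = recommend_set_points(target_max=35.0)
print(speed, feed)  # → 80 200, the minimum of this stand-in cost surface
```

In practice the "model" here would be the trained pipeline from the previous step, and a real optimizer would respect process constraints rather than sweep a naive grid; the structure – predict, compare against the target, recommend new set points – is the point.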

How to begin the transformation to a cognitive plant

To begin your transformation, it is important to choose a solution that can enable the transition to cognitive. IBM Plant Advisor optimizes throughput, energy cost and quality to maximize the return on invested capital (ROIC) of process manufacturing plants. It learns optimal conditions from past plant performance data, process data and operator actions. It then uses runtime data from plant machines and processes to predict inefficiencies, and advises operators on the control set points that would optimize production KPIs. In doing so, Plant Advisor helps retain the collective expertise needed to operate the plant at maximum performance.

Additionally, insights derived from plant-floor data can be used to analyze process efficiency and compare similar processes across different plants, leading to process and performance improvements.

The future looks bright

Check out part two of this series where we will explore specific use cases for a cognitive plant.

Learn more about IBM’s IoT for manufacturing solutions.

View a demo of IBM Plant Advisor.

Visit the IBM Marketplace to learn more about IBM Plant Advisor.
