August 25, 2016 | Written by: Vijay Sankaran
Categorized: Blog | Factories | Manufacturing
In the previous blog we focused on capabilities and benefits available to manufacturing organizations via the combination of IoT and analytic technologies – each a fundamental component of cognitive manufacturing. Intelligent, instrumented, and connected equipment now enables lines of business closely associated with manufacturing processes to gain a far more detailed and accurate understanding of equipment usage and performance. As a result, we see improved reliability of critical production assets. This blog focuses on process control, product quality, and yield in high-volume manufacturing environments.
The basis of Statistical Process Control
Quality Control techniques moved to the forefront of manufacturing in the latter half of the 20th century and crystallized around statistical approaches. Manufacturing engineers decomposed complex processes into a series of smaller, measurable, and repeatable steps, each of which adds material, removes it, reshapes the part using force, or changes the metallurgical, chemical, or physical properties of the part. Data about the procedures followed for each step could then be collected and analyzed, on the assumption that performing every activity precisely would yield a final product that conforms closely to its functional requirements. Specifications called Control Limits were imposed on the data collected at the beginning and end of each step. The drift of each process step could then be monitored by plotting the measurements in a time-ordered sequence and applying rules for out-of-control detection. This approach, loosely called Shewhart charts, forms the basis of Statistical Process Control (SPC).
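To make this concrete, here is a minimal sketch of the Shewhart approach described above: estimate a center line and three-sigma control limits from an in-control baseline run, then flag any later measurement that falls outside those limits. The function names, the three-sigma multiplier, and the sample data are illustrative assumptions, not part of any specific SPC product.

```python
import statistics

def shewhart_limits(baseline, sigma_mult=3.0):
    """Estimate center line and control limits from in-control baseline data."""
    center = statistics.mean(baseline)
    sd = statistics.stdev(baseline)
    return center - sigma_mult * sd, center, center + sigma_mult * sd

def out_of_control(measurements, lcl, ucl):
    """Return indices of measurements falling outside the control limits."""
    return [i for i, x in enumerate(measurements) if x < lcl or x > ucl]

# Baseline run while the step is known to be in control (e.g. a thickness in nm).
baseline = [10.1, 9.9, 10.0, 10.2, 9.8, 10.1, 9.9, 10.0, 10.1, 9.9]
lcl, cl, ucl = shewhart_limits(baseline)

# A later production run: the fourth point violates the upper control limit.
new_run = [10.0, 10.1, 9.9, 10.8, 10.0]
flags = out_of_control(new_run, lcl, ucl)
```

In practice a real SPC system charts subgroup means and ranges rather than raw points, but the detection principle is the same.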
The emergence of programmable automation
Fast-forward to the late 20th century and the emergence of programmable automation. Now, every step could be instrumented with sensors that measure temperature, pressure, gas flows, electric current, voltage, viscosity, and other physical or chemical properties. Industries moved from manual charting to computer-generated charts over hundreds of process parameters. SPC systems also use automation scripts to draw attention to the most important errant steps. However, the rules for discerning out-of-control events were unable to catch small drifts in the process steps. There is a better way to augment traditional SPC – a different mathematical technique that estimates the likelihood of drifting out of control.
Built upon the CUSUM algorithm (in statistical quality control, the CUSUM, or cumulative sum control chart, is a sequential analysis technique typically used for monitoring change detection), this approach uses the same data as the SPC charts, but is mildly predictive, flagging situations several process cycles ahead. This algorithm is implemented within IBM’s Watson Predictive Quality solution as the Quality Early Warning System (QEWS) and has generated significant reductions in scrap as well as savings in Process Engineer labor hours for our clients. The difference is in the cognitive DNA of the QEWS solution – automating CUSUM requires probabilistic reasoning that is just beyond the realm of statistical systems. For an in-depth look at CUSUM, refer to this Slideshare of lecture notes courtesy of Prof. Spanos, Chair of Electrical Engineering and Computer Sciences at UC Berkeley.
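The tabular form of the CUSUM chart can be sketched in a few lines. It accumulates deviations from the target beyond a slack value k and raises an alarm when either cumulative sum crosses a threshold h, which lets it catch small sustained drifts that stay inside Shewhart limits. The parameter values and data below are illustrative assumptions; QEWS itself layers additional reasoning on top of this basic mechanism.

```python
def cusum(xs, target, k, h):
    """Tabular CUSUM: accumulate deviations beyond slack k from the target;
    record each index where either cumulative sum crosses threshold h."""
    c_pos = c_neg = 0.0
    alarms = []
    for i, x in enumerate(xs):
        c_pos = max(0.0, c_pos + (x - target) - k)  # upward drift accumulator
        c_neg = max(0.0, c_neg + (target - x) - k)  # downward drift accumulator
        if c_pos > h or c_neg > h:
            alarms.append(i)
            c_pos = c_neg = 0.0  # reset after an alarm
    return alarms

# A small sustained upward drift: every point sits well inside typical
# three-sigma limits, yet the accumulated deviation triggers an alarm.
readings = [10.0, 10.05, 10.2, 10.25, 10.3, 10.25, 10.3]
alarms = cusum(readings, target=10.0, k=0.1, h=0.5)
```

Because the sums accumulate over cycles, the chart signals while the drift is still small – the "mildly predictive" behavior described above.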
Identifying missing measurements
Expanding our focus to product quality: after the process engineer’s job is done, the QA engineer tests the output of the multi-step process, checking the product’s functionality, appearance, and other perceived attributes. Ultimately, Quality Control is tightly linked with Process Control, but the occurrence of defects in a well-controlled process indicates missing measurements (due to inadequate sampling or previously unknown process drivers) or extraneous factors. Some examples of such incidents are:
- SPC looked at film thickness of a deposited layer but the defect was caused by non-uniformity of that layer
- Temperature of an annealing furnace was being measured but the air flow pattern within the furnace caused parts stacked in the lower right quadrant to be hotter than the rest
- A dust particle, a metal shaving, or a scratch on the underlying surface caused a bump in the finished product
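The first example above can be sketched in code: a chart that tracks only the mean film thickness can pass a part whose layer varies strongly across measurement sites. A hedged illustration, with hypothetical site readings and limits, is to summarize each part with both its mean and a within-part uniformity metric such as the range.

```python
def thickness_summary(site_readings, mean_limits, range_limit):
    """Summarize one part from multi-site thickness readings.
    Charting only the mean can hide non-uniformity, so also track the range."""
    mean = sum(site_readings) / len(site_readings)
    spread = max(site_readings) - min(site_readings)
    return {
        "mean": mean,
        "range": spread,
        "mean_ok": mean_limits[0] <= mean <= mean_limits[1],
        "uniform_ok": spread <= range_limit,
    }

# The mean is exactly on target, yet the layer is far from uniform:
# a mean-only SPC chart would call this part good.
sites = [98.0, 99.0, 100.0, 101.0, 102.0]
report = thickness_summary(sites, mean_limits=(99.0, 101.0), range_limit=3.0)
```

Adding the second metric is exactly the kind of "missing measurement" the paragraph describes: the data needed to catch the defect was available, but was not being charted.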
Satisfying the insatiable curiosity for the manufacturing process
Not everything can be measured and monitored. That is where the manufacturing sleuth, aka the Yield Engineer, steps in. Yield incidents were the original data science triggers in complex manufacturing facilities. Improving yield requires insatiable curiosity about the underlying physics, chemistry, metallurgy, and human factors involved in the manufacturing process. It also involves combining data from multiple sources, recognizing patterns that span space and time, looking for visual or aural cues, and most of all arriving at factually credible explanations, in very limited time, for what went wrong. These are all supremely human traits demanding high focus and cognitive capabilities.
Increasing quality and yield with cognitive computing
Advanced manufacturing enterprises have already excelled at numerical analysis and in some domains can even predict outcomes within highly confined bounds. We can now go further with Cognitive Computing applied to manufacturing. The knowledge embedded within manufacturing processes and their human experts can be unpacked by cognitive solutions in two ways:
- Semantics: establishing evidence based relationships between entities and observations when the manufacturing processes are in control and when they are not
- Datametrics: sensing and measuring in a manner that integrates with the evidence layer needed for Semantics. Such solutions will enable the capture of knowledge instead of just data.
They could enable autonomous decisions for simpler tasks instead of requiring human action everywhere. In today’s world, it is the difference between a self-driving car and a highly instrumented one. These are the areas where IBM Watson can help.
A cognitive manufacturing analysis solution built using Watson could encapsulate the knowledge of an entire process and provide sophisticated data analytics tools to support yield engineers in their work. By serving as a platform to collect data from underlying systems – databases, manuals, journals, logs, maintenance records, and directly from sensors – the Watson IoT Predictive Quality solution can create the repository.
Using Watson Knowledge Studio, the solution can collaborate with process and equipment engineers and plant floor technicians to understand the semantics of the process. Reasoning skills available through the Conversation service could be applied iteratively to eliminate lower-probability causes for each event. New types of data, including visual and aural signals, can be processed through Recognition services that classify them. Language Translation and Speech-to-Text services can handle cultural contexts to grasp human factors in distinct geographies. With adequate exposure to the problem-solving domain, such a solution can learn continuously from every incident, applying previously gathered knowledge and helping manufacturing engineers solve problems faster.