Quality Management Software Updates | AlisQI Blog

When Quality Talks… Carlos Escobar

Written by AlisQI Team | 11/21/2024

How to quick-start AI in Manufacturing

‘Welding robot C-3PO has entered the chat.’ Do not be surprised if you soon find yourself chatting with your manufacturing equipment. ChatGPT-like Artificial Intelligence tools are coming to manufacturing, and Machine Learning will help you create your own domain-specific data models and algorithms. The concept of Manufacturing 4.0 is becoming reality. Where do we stand right now, and where are we going? And where do you start if you want to embrace AI? Carlos Escobar wrote an important book that teaches you how to quick-start AI-driven manufacturing.

5 Highlights of "How to Quick-Start AI in Manufacturing" with Carlos Escobar

  1. AI is Revolutionizing Manufacturing
    Artificial Intelligence and Machine Learning are transforming the manufacturing industry, leading to increased efficiency, quality, and innovation.
  2. Data is the Foundation
    High-quality data, both structured and unstructured, is crucial for successful AI implementation. This data can be used to train models for tasks like quality inspection and predictive maintenance.
  3. AI Solutions for Detection and Prediction
    AI can be used to detect defects immediately after production and predict potential issues further down the value chain.
  4. The Shift to Innovation Engineering
    As AI technology advances, plant engineers are evolving into innovation engineers, capable of applying AI to drive manufacturing innovation.
  5. Overcoming Challenges
    Key challenges in AI implementation include data quality, model reliability, and organizational readiness. Addressing these challenges requires a strategic approach and a focus on continuous learning and adaptation.

 

Industry Veteran Carlos Escobar

In various roles, Carlos Escobar has gathered a wealth of experience in applying AI (Artificial Intelligence) and ML (Machine Learning) in the manufacturing and supply chain context. He has held positions at General Motors, Amazon, and Harvard, and he obtained a PhD in Business Engineering with a focus on AI. Recently, Carlos published a book named Machine Learning in Manufacturing: Quality 4.0 and the Zero Defects Vision.


It is a must-read for leadership and management who, from a strategic and development perspective, want to learn the state of the art in machine learning in the context of manufacturing and quality management. You will get an overview of the technologies and methodologies that we as quality professionals will soon use day-to-day. The book also presents interesting use cases and projects in manufacturing.

The foundation of all things AI is data. In his book, Escobar formulates practical binary classifications.

The data: structured versus unstructured

Escobar breaks his experience and thinking down into projects using structured data versus unstructured data. Unstructured data in manufacturing are mainly images. Consider quality inspection, in which humans typically take measurements and evaluate the quality and fitness of the process to make sure it complies with predefined quality standards. These visual inspections can readily be handed over to AI. The machine learning part is done by feeding the decisions of humans to the system; convolutional neural networks have the ability to learn from those decisions and recognize the patterns.
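The core idea — human pass/fail decisions become the training data for the model — can be sketched in a few lines. This toy uses a simple perceptron on tiny invented "images" rather than a real convolutional network, purely to illustrate the supervised-learning loop:

```python
# Toy supervised learner: human pass/fail labels on tiny "images"
# stand in for the labeled inspection data a real CNN would train on.

def extract_features(image):
    """Flatten a 3x3 grayscale 'image' (list of rows) into a feature vector."""
    return [pixel for row in image for pixel in row]

def train_perceptron(samples, labels, epochs=20, lr=0.1):
    """Learn weights from human inspection decisions (1 = defect, 0 = OK)."""
    n = len(extract_features(samples[0]))
    weights, bias = [0.0] * n, 0.0
    for _ in range(epochs):
        for image, label in zip(samples, labels):
            x = extract_features(image)
            pred = 1 if sum(w * xi for w, xi in zip(weights, x)) + bias > 0 else 0
            error = label - pred
            weights = [w + lr * error * xi for w, xi in zip(weights, x)]
            bias += lr * error
    return weights, bias

def predict(weights, bias, image):
    x = extract_features(image)
    return 1 if sum(w * xi for w, xi in zip(weights, x)) + bias > 0 else 0

# A bright centre pixel marks a "defect" (purely illustrative data).
ok     = [[0.1, 0.1, 0.1], [0.1, 0.2, 0.1], [0.1, 0.1, 0.1]]
defect = [[0.1, 0.1, 0.1], [0.1, 0.9, 0.1], [0.1, 0.1, 0.1]]
w, b = train_perceptron([ok, defect], [0, 1])
print(predict(w, b, defect))  # flags the defective part
```

A production system would replace this with a deep network trained on thousands of labeled photographs, but the pattern is the same: inspectors label, the model learns.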

Then you have structured, or tabular, data. Here we are very familiar with statistical models: for instance, we monitor pressure or temperature as basic process measurements in order to tweak and adjust the production process. AI will help predict what is going to happen during the execution of the process. In a welding process, for instance, you generate a lot of information signals. These signals are data that can run through a neural network to determine whether the welding process was successfully executed.
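As a minimal sketch of the structured-data case (not Escobar's actual method — a simple statistical check standing in for the neural network, with invented sensor values): summarize each weld's signal trace into features, learn control limits from known-good welds, and flag new welds that fall outside them.

```python
import statistics

# Hypothetical welding-signal check: learn control limits from the signals
# of known-good welds, then flag welds whose summary features fall outside.

def signal_features(signal):
    """Summarise a raw sensor trace (e.g. current over time) as (mean, stdev)."""
    return statistics.mean(signal), statistics.stdev(signal)

def fit_limits(good_signals, k=3.0):
    """Derive mean +/- k*sigma limits per feature from known-good welds."""
    features = [signal_features(s) for s in good_signals]
    limits = []
    for values in zip(*features):
        mu, sigma = statistics.mean(values), statistics.stdev(values)
        limits.append((mu - k * sigma, mu + k * sigma))
    return limits

def weld_ok(signal, limits):
    """True when every feature of the new weld lies within its limits."""
    return all(lo <= v <= hi
               for v, (lo, hi) in zip(signal_features(signal), limits))

good = [[10.0, 10.2, 9.9, 10.1],
        [10.1, 9.8, 10.0, 10.2],
        [9.9, 10.0, 10.1, 9.8]]
limits = fit_limits(good)
print(weld_ok([10.0, 10.1, 9.9, 10.0], limits))   # in control
print(weld_ok([14.0, 15.0, 13.5, 14.2], limits))  # out of control
```

A neural network would learn far richer patterns from the raw signals, but the workflow — good historical data in, pass/fail judgment out — is the same.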

The AI solutions: detection versus prediction

Escobar also divides AI projects in manufacturing into two categories. The first one is detection: immediately after the process is executed, detect those items that were not successfully produced. The second category is prediction: use data from early in your process so the algorithm can learn and develop the ability to predict potential non-conformances further downstream in the value-added process.

The examples in Escobar’s book are real-world projects implemented on the shop floor. Most of these customized applications are then developed into solutions that abstract away from the specific use case to become more generic and applicable to other use cases across the industry.

From specific to generic

The overall process of formulating and addressing the problem, applying the algorithms, and solving the problem turns out to be fairly generic. But you need to study the particular problem and solve it first before you can generalize. That is why most of these applications are developed at the R&D level, in collaboration with plant engineers who actually have the domain knowledge. As the technology moves forward and becomes more generally available, Escobar foresees that plant engineers will become innovation engineers themselves, as they learn to apply these technologies to drive manufacturing innovation.

A complication is that the characteristics of an industry can hamper the development of generic AI solutions. Production devices and processes may differ between industries, or even between companies in the same industry. Some companies offer products that they only produce once or twice a year. How do you apply AI/ML in such an environment, and how do you know you have enough data to yield accurate predictions? If you have just one or two historical measurements per year, it is better to study the entire chain and identify where and how it deviates, in order to find possible approaches to a solution. You can also move further upstream in the process and find a point where raw materials or intermediate products branch off into various different items. There may be a bigger volume of data in those earlier, upstream steps of the production process that can be used to validate the outcome of the final product.

Speeding up

Gartner's hype cycle puts Artificial Intelligence in manufacturing at five to ten years away from moving out of the current Peak of Inflated Expectations and onto the Plateau of Productivity of general acceptance and implementation. Escobar thinks it will go faster, more in the five-to-seven-year timeframe. This acceleration is specifically driven by Generative AI (GenAI), which will make problem-solving much more efficient. GenAI accelerates the whole process of writing queries, preprocessing the data for training, and addressing research and methodological questions. Writing software code is reduced from days to seconds. Escobar even expects that GenAI will be able to provide synthetic data for training models when the volume of real-life data is too small.

Buy or Build?

As a mid-sized manufacturer, do you need those AI/ML skills in-house, or are you better off relying on external suppliers? Is it a make-or-buy decision? If you want to build in-house expertise, you will need a couple of experts, which is hardly economically viable for small to medium-sized companies. According to Escobar, one AI expert may not be enough to drive manufacturing innovation, but that person is needed as a liaison between the manufacturing problem and the outside expertise you are going to bring in. So you need at least one person to make sure the right questions are asked of the right service provider or vendor.

Pitfalls

Many manufacturers are unsure where to start. They try to identify high-value, high-feasibility use cases. One of the pitfalls Escobar describes in his book is that project selection criteria are often based on expected business value alone, without evaluating the feasibility of actually solving the problem. Another pitfall is working along the typical two-dimensional framework of four quadrants: High/Low Value versus High/Low Feasibility. Escobar warns that these kinds of tools are not enough for selecting the winning project. To begin with, he believes you need a portfolio in which you analyze multiple projects and see how each of them aligns with your strategy. In his book, he defines 18 questions that help rank the projects, from ‘Is the data available or not?’ to ‘Does the data include all the sources that are needed to solve the problem?’. This results in a weighted matrix that guides you in selecting the winning projects and prioritizing the portfolio.
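The mechanics of such a weighted matrix are straightforward to sketch. The three questions, weights, and project scores below are invented for illustration — Escobar's actual framework uses 18 questions:

```python
# Sketch of a weighted project-ranking matrix in the spirit of Escobar's
# 18-question approach. Questions, weights, and scores are illustrative.

QUESTIONS = [
    ("Is the required data available?",         0.4),
    ("Does the data cover all needed sources?", 0.3),
    ("Does the project align with strategy?",   0.3),
]

def rank_projects(portfolio):
    """Score each project (answers on a 0-5 scale) and sort best-first."""
    weights = [w for _, w in QUESTIONS]
    scored = [(sum(a * w for a, w in zip(answers, weights)), name)
              for name, answers in portfolio.items()]
    return sorted(scored, reverse=True)

portfolio = {
    "Visual weld inspection": [5, 4, 5],
    "Predictive maintenance": [3, 2, 4],
    "Energy optimization":    [2, 1, 3],
}
for score, name in rank_projects(portfolio):
    print(f"{score:.1f}  {name}")
```

The point of the exercise is not the arithmetic but the discipline: every candidate project gets scored on the same questions before any budget is committed.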

Required Data

What data exactly is needed? Is there a minimum volume, and how do you handle poor data quality? What do you do when not enough data is available?

Escobar states that the data needs to contain the patterns and the features referred to in his binary classifications of Structured/Unstructured and Detection/Prediction. The variables in the data also need to carry discriminatory information. Most importantly, the data must make sense in a physical or engineering context; marketing or financial data are of no use here. The data volumes you need might be on the order of hundreds of observations. Compared to consumer industries, where you see many millions of data points, that may seem low, but in R&D you usually have to deal with much smaller volumes. As long as you have patterns, you are able to start the initial analysis. That puts you on the right track for moving forward to pilot runs, to generate a little more data in a realistic environment, and eventually to the beta phase of the project.

In due time, GenAI might come to the rescue for projects where little data is available. Escobar thinks that in the near future we will be able to artificially generate useful data. He envisions that eventually we can train models that generate the data needed to solve the problem, even without real data from the physical production system.
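In its most basic form, the idea already works today: fit a distribution to the scarce real measurements and sample from it. This toy sketch (invented measurements, a plain normal distribution rather than a generative model) shows the principle:

```python
import random
import statistics

# Toy "synthetic data" generator: fit a normal distribution to a handful
# of real measurements, then sample as many synthetic points as needed.
# Real generative models are far richer; this only illustrates the idea.

real = [10.1, 9.8, 10.0, 10.3, 9.9]  # scarce real measurements
mu, sigma = statistics.mean(real), statistics.stdev(real)

random.seed(42)  # reproducible sampling
synthetic = [random.gauss(mu, sigma) for _ in range(500)]

# The synthetic sample approximately reproduces the fitted statistics.
print(round(statistics.mean(synthetic), 2))
```

What GenAI promises goes well beyond this — learning the full joint structure of a process instead of a single variable — but the contract is the same: a model trained on little data produces as much plausible data as you need.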

Building Trust

Generative AI produces varied responses that are not always consistent, while in manufacturing you need consistency and reliability. From that point of view, applying GenAI may introduce additional risks. Manufacturing systems are also non-stationary, so that challenge needs to be addressed. Escobar explains that AI is an ongoing, iterative process: AI solutions are not forever, because they are data-driven. When a process is exposed to new, temporary sources of variation, the quality of your AI predictions will decay. Therefore, you need to develop and maintain a learning-and-relearning ML scheme.
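A learning-and-relearning scheme needs, at minimum, a trigger that notices when live data has drifted away from the data the model was trained on. A minimal sketch, with an invented drift test and illustrative numbers:

```python
import statistics

# Sketch of a relearning trigger: flag the model for retraining when
# recent process data drifts away from the training data's distribution.

def needs_retraining(training_data, recent_data, z_threshold=3.0):
    """Flag drift when the recent mean sits far outside the training mean
    (a z-test on the mean of the recent window)."""
    mu = statistics.mean(training_data)
    sigma = statistics.stdev(training_data)
    recent_mu = statistics.mean(recent_data)
    standard_error = sigma / len(recent_data) ** 0.5
    return abs(recent_mu - mu) > z_threshold * standard_error

training = [20.0, 20.5, 19.8, 20.2, 19.9, 20.1]
stable   = [20.0, 20.3, 19.9, 20.1]
drifted  = [22.5, 22.8, 23.1, 22.9]   # a new source of variation appeared

print(needs_retraining(training, stable))   # model still valid
print(needs_retraining(training, drifted))  # trigger relearning
```

Production monitoring would watch many variables and use more robust drift statistics, but the loop is the same: monitor, detect decay, retrain, redeploy.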

Low-hanging fruit

Implementing solutions that can replace human-based inspection offers a quick win: for instance, leveraging AI for visual inspections and human-based monitoring systems. Such solutions have already shown 80 percent reliability. That is higher than human reliability, because humans get tired and distracted while doing these monotonous tasks.

Egg or chicken

In his book, Escobar also presents an overview of the evolution of the industry, from the Industrial Revolution to where we are today in Industry 4.0, with the integration of the physical and digital domains and Digital Twins. At the moment, though, many manufacturers still depend on Excel or even paper-based systems. For them, there seems to be a chicken-and-egg problem: start with AI and move to Industry 4.0, or reach a certain degree of maturity in Industry 4.0 first and then start implementing AI? What are the prerequisites before you start looking at AI?

The most basic prerequisite is digitalization. Identify the most critical operations: the ones for which you have data and which generate significant non-conformances or shortcomings. Digitalize those operations as soon as possible and get the data into the format that you need. In the meantime, start building your data science team or engage third-party expertise.