1 Introduction

Competitiveness in the composite production market depends on robust processes that manufacture components of consistent quality in short cycle times. The required process understanding is currently retained as operator experience and other domain expert knowledge. Such expertise is typically insular, focused on individual steps along the production chain from raw material to finished part, and not trivially available to inform data-driven analysis [1, 2]. This hinders the discovery of latent variables and their complex interactions along the process chain. Data-driven business models such as predictive maintenance, remote support services for production equipment and advanced part quality assurance approaches require a more stringent collection, storage and evaluation of process information [3, 4, 5].

1.1 Composite Manufacturing by RTM

High-pressure resin transfer molding (HP-RTM) is a method of combining fibers and polymers into fiber-reinforced polymer composites [6]. As shown in Fig. 1, the process begins with cutting and assembling a textile reinforcement into a stack. This material is then draped close to the final part shape. During this step, the stack is also consolidated, which stabilizes it for further handling. After shaping, the preform is transferred to the RTM tool, where resin is injected under controlled pressure and temperature and polymerizes into the desired composite matrix material. Once curing is complete, the part is removed from the tool and machined, typically milled, to meet the target dimensions.

Fig. 1

RTM process chain for composite production, consisting of (1) textile raw material from fibers, (2) cutting and stacking of textile reinforcement, (3) draping and consolidation, (4) net shaping of the preform, (5) transfer of the preform into the mold, (6) infiltration and polymerization of the polymer matrix and (7) post-processing of the composite part

The RTM process is affected by multiple factors such as effective fiber volume content, textile permeability, resin viscosity, tool gap and pressure. In a classical approach, each influencing variable is controlled by fine-tuning the process during tool commissioning and by maintaining a conservative safety margin in all process parameters where possible.

Similarly, in the final post-processing step of milling the composites, tool wear is a critical factor. Tool damage accumulates over time, and tool life is typically estimated conservatively to avoid reduced milling quality.

1.2 Knowledge Extraction in a Complex Network of Cyber-Physical Systems

Bridging the steps of composite production to create a unified, digital representation of the part production requires knowledge extraction at each step of the process.

Extracting non-trivial information from a complex network of digitized and connected assets along the value stream of a producing organization requires experts in the specific domain, IT, data science and management. The underlying reference process is widely studied in the literature [7, 8, 9], and the following steps were incorporated in the project:

  1. Business understanding

  2. Data understanding

  3. Data preparation

  4. Data modelling

  5. Evaluation

  6. Deployment

2 Implemented Approach

2.1 Data Management and Analysis

To allow a meaningful discussion of process data, three required stages of activities can be formulated [10]:

  1. Identification of data sources, boundary conditions and use-cases

  2. Definition of data transfer, storage and analysis methods

  3. Implementation of the software and hardware modules on the manufacturing equipment

A successful implementation of these steps allows for the derivation of data analysis models for a process, in this case HP-RTM and milling.

As part of the first step, data was grouped by process step (RTM, milling) and origin (tool, press, RTM equipment, milling fixture, other). A list of possible product or process faults, together with the relevant equipment components and their data sources, was compiled with the respective domain experts for all covered process steps. Identifying these use-cases allowed requirements on the data collection modules to be derived. These cover the RTM process from infiltration to post-processing (see Fig. 1, steps 3–7).

Stage one showed disparate requirements between the process steps RTM and milling, leading to the implementation of two tailored data acquisition solutions. In the case of RTM, a centralized data merging node must communicate with the tool, press and RTM equipment, which in turn were enabled to share their sensor data. Additional sensors were included where use-cases suggested a probable benefit. In the case of milling, an equipment-borne sensor was combined with offline measurements of tool wear.

To enable unique identification of experiments across both process steps, bar code identifiers were embedded into the parts. This enabled merging both data sets into the project database for an integrated analysis (see Fig. 2 for a high-level overview of the resulting data acquisition structure).
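A minimal sketch of this merging step is shown below. The column names, values and schema are hypothetical illustrations, not the project's actual database layout; the key idea is that the embedded bar code identifier serves as the join key between the molding and milling data sets.

```python
import pandas as pd

# Hypothetical molding-side records, keyed by the embedded bar code identifier
molding = pd.DataFrame({
    "part_id": ["P001", "P002", "P003"],
    "injection_pressure_bar": [80.1, 82.4, 79.8],
})

# Hypothetical post-processing (milling) records for the same parts
milling = pd.DataFrame({
    "part_id": ["P001", "P002", "P003"],
    "tool_wear_um": [12.0, 13.5, 15.1],
})

# An inner join on the shared part identifier yields one integrated
# record per part, ready to be written to a common project database
integrated = molding.merge(milling, on="part_id", how="inner")
print(integrated.shape)  # → (3, 3)
```

An inner join keeps only parts present in both data sets, which mirrors the requirement that a part must have passed both molding and post-processing to be analyzed end to end.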

Fig. 2

Schematic view of the developed data acquisition structure. During part molding, information from the RTM equipment, tooling and press is captured together with quality control data. During part processing, milling and quality control data are captured and merged. All data is referenced to individual part identifiers and stored in a common data store

2.2 Process Monitoring and Predictive Maintenance for Serial HP-RTM Production

Process monitoring and maintenance support were identified as desired use-cases for RTM. This informed the tool design, which incorporates technological features that benefit most from these solutions, including complex sealing concepts and tool functionalities. The size and setup of the experimental campaigns were chosen to approach serial production conditions as closely as possible within the constraints of a research project. For this purpose, material handling was automated once the process had been defined, implemented and tested.

2.3 Process Monitoring and Quality Assurance in Post-Processing

Process monitoring and quality assurance (QA) use-cases were identified for post-processing (milling). The variability of milling tool designs required a large number of experiments to develop a comprehensive test plan and also necessitated a customized milling fixture.

Typically, milling operations in composite manufacture by RTM can be parallelized cost-effectively, in contrast to the infusion step. The availability of only a single research fixture thus limited the size of the experimental campaigns in the project.

3 Preliminary Results

As of the writing of this paper, two experimental campaigns have been conducted, encompassing a total of 991 individually identifiable parts. Both data analysis and part processing activities are still ongoing. This section is therefore intended to showcase one exemplary finding enabled by the developed approach. The first use case was prioritized using a criticality-centered approach focused primarily on quality and availability losses. The importance of each loss was determined by its impact and frequency in the production domain. In the first experimental campaign, the seal of the press was not maintained regularly. The maintenance strategy was reactive, leading to the full degradation of its condition. This allowed important information about the failure behavior to be tracked. In real-world scenarios, it is critical to collect data on negative incidents, yet this is typically at odds with the key objective of maintenance organizations: ensuring reliability and availability.

3.1 Comparison of Physical to Data-Centric Modelling

The identification of rules that lead to the loss of quality or function is a crucial responsibility on the shopfloor. The task is usually performed with deep domain and engineering knowledge. To show the benefits of the presented methodology, the following figure and table compare a conventional approach based on physical modelling with a data-centric approach. The goal was to model the condition of the press seal.

The following models were trained on data from the first experimental campaign and tested on the second experimental campaign. The RMSE was calculated on the test data.

Linear Regression:

Despite its static nature, univariate linear regression is widely used in industry. The data understanding and preparation effort is low, the modelling is simple and the deployment straightforward. The disadvantage lies in the accuracy and robustness of the model. R² was 0.845 on the training data.
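A minimal sketch of such a univariate fit is shown below. The data is synthetic and invented purely for illustration (the 0.845 figure above comes from the actual campaign data, which is not reproduced here); variable names such as `progress` and `condition` are assumptions.

```python
import numpy as np

# Synthetic stand-in for the real sensor data: normalized seal
# condition declining linearly with production progress, plus noise
rng = np.random.default_rng(0)
progress = np.linspace(0.0, 1.0, 50)                       # production progress, 0..1
condition = 1.0 - 0.6 * progress + rng.normal(0.0, 0.02, 50)

# Univariate linear fit: condition ≈ a * progress + b
a, b = np.polyfit(progress, condition, deg=1)
predicted = a * progress + b

# Coefficient of determination (R²) on the training data
ss_res = np.sum((condition - predicted) ** 2)
ss_tot = np.sum((condition - condition.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot
```

A high R² on the training data alone, as here, says nothing about robustness on a new campaign, which is exactly the weakness discussed below.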

Non-Linear Regression:

A second frequently used technique for quickly modelling and predicting metric variables is non-linear regression. Its advantages and disadvantages are similar to those of the linear regression model, with the additional benefit that many degradation curves follow a non-linear trend (compare Fig. 3). R² was 0.96 on the training data.
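A logarithmic degradation model of the kind plotted in Fig. 3 can be sketched as below. The data and coefficients are again synthetic assumptions for illustration; the key point is that a logarithmic model is linear in log(progress), so an ordinary least-squares fit on the transformed regressor suffices.

```python
import numpy as np

# Synthetic data following a logarithmic degradation trend
# (the true curve from the campaign is not reproduced here)
rng = np.random.default_rng(1)
progress = np.linspace(0.05, 1.0, 50)
condition = 1.0 + 0.15 * np.log(progress) + rng.normal(0.0, 0.01, 50)

# Fit condition ≈ c0 + c1 * log(progress) via OLS on log(progress)
c1, c0 = np.polyfit(np.log(progress), condition, deg=1)
predicted = c0 + c1 * np.log(progress)

# R² on the training data
ss_res = np.sum((condition - predicted) ** 2)
ss_tot = np.sum((condition - condition.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot
```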

Fig. 3

The derived linear and logarithmic models show a reasonable fit to the training data (normalized seal condition plotted over production progress)

Multiple Regression:

In contrast to the first two static regressions, multiple regression includes several variables to determine and explain the behavior of the seal condition. Variables that capture influencing effects were identified and used to model the seal condition as measured by in-tool sensors. The accuracy increased significantly, as factors such as the specific experimental setup were included. R² was 0.97 on the training data.
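A multiple regression of this kind can be sketched as ordinary least squares with several regressors. Three synthetic variables stand in for the roughly 15 in-tool variables mentioned below; all names, coefficients and data are assumptions for illustration.

```python
import numpy as np

# Synthetic regressors standing in for in-tool sensor variables
# (e.g. pressure, temperature, cycle count); values are invented
rng = np.random.default_rng(2)
n = 200
X = rng.normal(size=(n, 3))
true_coef = np.array([0.5, -0.3, 0.2])
y = X @ true_coef + 0.9 + rng.normal(0.0, 0.05, n)     # seal condition proxy

# Ordinary least squares with an explicit intercept column
A = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
predicted = A @ coef

# R² on the training data
ss_res = np.sum((y - predicted) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot
```

Because each regressor explains part of the variance the univariate models attribute to noise, the multivariate fit is both more accurate and more transferable to new operating conditions.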

The comparison of the three models is summarized in Table 1. The evaluation was performed on the second campaign. The first two models initially show a good fit but lose accuracy on new data; their RMSE values on the test data are high, indicating clear overfitting. From a user's perspective, the acceptance of such models is very low and they would never reach the shopfloor. The multiple regression incorporated approximately 15 variables, allowing a better prediction of the seal condition. The data collection, preparation and modelling effort was significantly higher for the multiple regression, as more factors were included. Nevertheless, the cost-benefit ratio was in favor of the data-centric approach, as robustness and applicability are needed to generate benefits from the effort invested.
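The train-on-campaign-one, test-on-campaign-two evaluation can be sketched as follows. The synthetic data deliberately shifts between "campaigns" to mimic changed operating conditions; the shift, the curve and all values are assumptions for illustration, not the measured campaign data.

```python
import numpy as np

def rmse(y_true, y_pred):
    """Root mean squared error between observations and predictions."""
    return np.sqrt(np.mean((y_true - y_pred) ** 2))

# Synthetic campaign 1 (training) and campaign 2 (test) data; the
# offset shift between campaigns mimics changed operating conditions
rng = np.random.default_rng(3)
x_train = np.linspace(0.05, 1.0, 60)
y_train = 1.00 + 0.15 * np.log(x_train) + rng.normal(0.0, 0.01, 60)
x_test = np.linspace(0.05, 1.0, 60)
y_test = 0.95 + 0.15 * np.log(x_test) + rng.normal(0.0, 0.01, 60)

# Fit once on campaign 1, then evaluate on both campaigns
c1, c0 = np.polyfit(np.log(x_train), y_train, deg=1)
rmse_train = rmse(y_train, c0 + c1 * np.log(x_train))
rmse_test = rmse(y_test, c0 + c1 * np.log(x_test))
```

The gap between training and test RMSE is the quantity summarized in Table 1: a model that only fits its own campaign is of little use on the shopfloor.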

Table 1 Comparison of different regression models for the seal degradation behavior

4 Conclusions & Outlook

Applying machine learning to the RTM process requires integration of disparate systems across the process chain. Capturing domain expert knowledge is the key to tailoring data collection and analysis activities. This involves iterative feedback both in system layout and in data collection. A high degree of automation is beneficial both to avoid operator variability and to formalize the process steps in preparation for data analysis. The high quantity of experiments required to approach industrial settings and relevant machine learning approaches is aided by persistent part identification via embedded bar codes.

Overcoming these challenges allows machine learning to be applied successfully, as shown, to discover complex interactions of process parameters and to enable robust data analysis that is deployable in an industrial setting. More complex approaches such as multiple regression become feasible and achieve high acceptance.

Continuing work in this research project will focus on increasing the number of analyses based on the captured data to cover more process features and derive analysis modules with meaningful output to end users.

Supplementing the presented approach, future work would benefit from including additional automated quality control methods, adding depth to the captured process data.