1 Introduction

Reduced and real-time modelling techniques are part of hybrid and data-driven modelling schemes, which play a crucial role in the digitalisation of industrial manufacturing within the twinning and shadowing framework. They can be combined with machine learning for generating process predictor-corrector modules and for data training, data fitting, and data sharing. They enable industries to optimise their manufacturing processes and sub-processes, improve production efficiency, and enhance product quality by supporting fast decisions in real time. These real-time models play an important role in the performance of a digital twin framework by accurately representing the physical process and its evolution in real time. Hence a process digital twin, a virtual replica of a physical process, can be built from real-time data and numerical simulations to resemble the characteristics of the actual process.

The implementation of real-time prediction-correction models within the digital twin framework enables the continuous update of the process state so that it mirrors the state of the physical process. Hence, the digital twin can remain fully synchronised with real-world changes in the transient state of the process, while corrective measures can be applied to reach a more optimised state. These predictions and corrections can be made based on data generated using verified numerical simulations, historical recorded data, and live sensor data from the physical process. In addition, the real-time models can give operators and end users a visualised state of the process for monitoring the performance of the digital twin. This enables operators and end users to run different process scenarios virtually and obtain valuable insights into the process conditions for better optimisation.

In recent years, much research has been conducted on data-driven and fast reduced order models to provide real-time modelling capabilities for material processing [1]. The role of machine learning (ML) schemes and reduced order modelling (ROM) technology in material modelling is discussed in [2], and fast models for solidification processes have been developed in [3]. For sub-processes such as cooling and heat transfer coefficients, hybrid analytical-data-driven fast models have been developed in [4, 5], while the performance of real-time modelling schemes for additive manufacturing (AM) and extrusion processes is reported in [6, 7].

2 Computational Models for Processing

With the increase of computational power from the second half of the last century onwards, numerical simulations have become a vital part of the design and control of industrial material processing, including processes such as additive manufacturing, extrusion, and casting. These simulations are based on sound mathematical models, and their computational counterparts, with prediction and assessment capabilities for multi-physical and multi-scale material processing, are used to design and optimise greener and more efficient processes. The thermal and mechanical aspects of material processing, along with melt flow and cooling/solidification sub-processes, have already been simulated using conventional and dynamic discretisation techniques [8, 9]. While the conventional techniques are based on a fixed-size numerical domain for the simulation of transient material processes, the newer dynamic mesh and evolving domain techniques can simulate a continuously growing domain using appending/extending mesh techniques [10].

2.1 AM Process Simulation

Multi-physical numerical simulations of AM processes have been conducted using both conventional and dynamic mesh techniques [9] to predict the behaviour of materials and deposited parts. These numerical simulation techniques can take into account heat transfer (e.g., using the Goldak heat source technique [10]), melt pool dynamics, partial solidification, and even the generation of residual stresses during the transient AM process. Layer-by-layer deposition simulations can be employed to optimise AM process parameters (e.g., deposition speed, initial base temperature, torch power, cooling rates, and more) and predict the final part's quality and fitness for purpose. These numerical simulations can also be used to create optimal final part geometries that avoid excessive thermal warping and deformation in thin-walled and lightweight parts by iteratively adjusting the process conditions based on simulated performance criteria.

The dynamic mesh and evolving domain techniques are novel numerical schemes in which the size/shape, boundaries, or characteristics of the process domain can be updated over the process time [8]. This can include changes in process parameters and sub-processes such as melt pool dynamics, solidification/cooling, and the mechanical stress/strain state. Another main challenge is how to represent changes in domain size as the transient process progresses and new layers are deposited during an AM process. This includes changes in geometry, mesh, boundary, and thermal energy content within the dynamic domain, which can have profound effects on computational accuracy and time. In the dynamic mesh and evolving domain techniques, instead of generating the domain for the final size/shape of the part from the start (with de-activation blocks), new subdomains/layers are appended to the domain at specific time steps to resemble the layer deposition of the AM part. These newly generated layers are attached to the main numerical domain after proper mapping and energy-balancing exercises at discrete time points.
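
The layer-appending idea can be illustrated with a deliberately simplified sketch: the Python snippet below assumes a hypothetical 1-D explicit heat-conduction model with uncalibrated material constants and illustrative deposition times, and appends a block of nodes at the deposition temperature at prescribed times; in a full 3-D FE implementation this is where the mapping and energy-balancing steps mentioned above would be carried out.

```python
import numpy as np

# Illustrative material/process constants (not calibrated to any real alloy)
alpha = 1e-5        # thermal diffusivity [m^2/s]
dx    = 1e-3        # node spacing [m]
dt    = 0.02        # time step [s]; satisfies the explicit stability limit dt < dx^2/(2*alpha)
T_dep  = 1900.0     # temperature of a freshly deposited layer [K]
T_base = 300.0      # base plate temperature [K]

# Start with the base plate only; layers are appended at discrete times
T = np.full(10, T_base)             # temperatures of the existing nodes
deposit_times = [1.0, 2.0, 3.0]     # times at which a new layer is appended [s]
nodes_per_layer = 5

t, t_end = 0.0, 4.0
while t < t_end:
    # Append a new layer: extend the domain and initialise it at T_dep.
    # In a 3-D FE code this is where mapping and energy balancing are performed.
    if deposit_times and t >= deposit_times[0]:
        T = np.concatenate([T, np.full(nodes_per_layer, T_dep)])
        deposit_times.pop(0)

    # Explicit finite-difference update of the interior nodes
    T_new = T.copy()
    T_new[1:-1] = T[1:-1] + alpha * dt / dx**2 * (T[2:] - 2.0 * T[1:-1] + T[:-2])
    T_new[0] = T_base        # clamped base plate temperature
    T_new[-1] = T_new[-2]    # insulated free surface (zero flux)
    T = T_new
    t += dt

print(f"final domain size: {T.size} nodes, peak temperature: {T.max():.1f} K")
```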

For the conventional simulation techniques, different solution schemes can be employed to solve the discretised numerical domain, including Eulerian, Lagrangian, arbitrary Lagrangian-Eulerian (ALE), mixed Lagrangian-Eulerian (MiLE), and even some hybrid methods [8]. One of the most efficient solution schemes for conventional simulations of AM processing is ALE, where the domain evolution and its spatial configuration are referenced to three sub-domains, namely an Eulerian, a Lagrangian, and a third overlapping sub-domain, through the use of mapping functions. For the overlapping sub-domain within the numerical domain, consider two interfacing grids overlapping at the boundaries of the numerical domains Ω1 and Ω2. For linear finite element (FE) meshes with basis functions {vi}i∈I and {wj}j∈J, the overlapping set of functions can be written as [8]:

$$\left\{{\phi}_{1}{v}_{i}\,,\,{\phi}_{2}{w}_{j}\right\}_{i\in I,j\in J}$$
(1)

where the overlapping sub-domain can be represented with a linear independent relation as

$$\phi_{1}\sum_{i}\alpha_{i} v_{i}+\phi_{2}\sum_{j}\beta_{j} w_{j}=0$$
(2)

and if it is assumed that φ2 = 1 − φ1, then it can be shown that [8]:

$$\phi_{1}\left(\sum_{i}\alpha_{i} v_{i}-\sum_{j}\beta_{j} w_{j}\right)=-\sum_{j}\beta_{j} w_{j}\;\rightarrow\;\sum_{i}\alpha_{i} v_{i}-\sum_{j}\beta_{j} w_{j}=\mathrm{const.}$$
(3)

For the stiffness matrix associated with the interfacing/overlapping sub-domain, the assembly procedure can start with the non-overlapping elements (as in a normal mesh) and then continue with the overlapping elements, which contribute multiple entries. Figure 1 shows an AM deposited wall (in an aluminium alloy), along with its layered geometry, FE mesh, and contour results, which are used for the numerical simulation of the deposition process.

Fig. 1

a Experimental WAAM deposition process; b wall layered geometry and meshing; c simulation contours for temperature and strain
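
As a small numerical check of Eqs. (1)-(3), the sketch below constructs the blended basis {φ1 vi, φ2 wj} for two hypothetical overlapping 1-D meshes and evaluates its numerical rank; the meshes, overlap region, and blending function are arbitrary choices made for illustration only.

```python
import numpy as np

# Two overlapping 1-D meshes on [0, 1]; the overlap region is [0.4, 0.6]
nodes1 = np.linspace(0.0, 0.6, 7)    # mesh of Omega_1 (hat functions v_i)
nodes2 = np.linspace(0.4, 1.0, 7)    # mesh of Omega_2 (hat functions w_j)

def hat(x, nodes, i):
    """Piecewise-linear FE basis (hat) function of node i on a 1-D mesh."""
    xi = nodes[i]
    y = np.zeros_like(x)
    if i > 0:                                   # rising part on [nodes[i-1], nodes[i]]
        m = (x >= nodes[i - 1]) & (x <= xi)
        y[m] = (x[m] - nodes[i - 1]) / (xi - nodes[i - 1])
    if i < len(nodes) - 1:                      # falling part on [nodes[i], nodes[i+1]]
        m = (x >= xi) & (x <= nodes[i + 1])
        y[m] = (nodes[i + 1] - x[m]) / (nodes[i + 1] - xi)
    return y

# Partition of unity: phi1 = 1 left of the overlap, 0 right of it, linear in between
x = np.linspace(0.0, 1.0, 2001)
phi1 = np.clip((0.6 - x) / 0.2, 0.0, 1.0)
phi2 = 1.0 - phi1

# Columns of B are the blended functions {phi1*v_i, phi2*w_j} of Eq. (1)
cols = [phi1 * hat(x, nodes1, i) for i in range(len(nodes1))]
cols += [phi2 * hat(x, nodes2, j) for j in range(len(nodes2))]
B = np.column_stack(cols)

# A rank lower than the number of functions signals the linear dependence of Eqs. (2)-(3)
print("functions:", B.shape[1], " numerical rank:", np.linalg.matrix_rank(B, tol=1e-10))
```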

Although ALE is a popular method for the numerical simulation of dynamic processes, other solution techniques including level set (LS), moving mesh, front tracking, and adaptive mesh refinement have also been used for these processes.

2.2 Validation and Experimental Studies

An important part of the AM modelling and simulation framework is proper validation and experimental trials to ensure the reliability and accuracy of the modelling schemes. These experimental studies are commonly conducted in parallel with the numerical simulation exercises, where specific design-of-experiments cases can be performed to validate the numerical simulation results. The experimental work can involve material testing and characterisation, material microstructure/texture, process parameter variations, and thermal-mechanical defects and imperfections. Different validation techniques, including deterministic, statistical, and/or hybrid schemes, can be employed for the comparative analyses, and iterative loops may be utilised to improve the calibration of the numerical models.

Wire-based additive manufacturing (WAAM), which can be classified as a directed energy deposition scheme, is one of the popular AM processes in which layers of material are deposited and welded together using arc, laser, or electron beam energy sources. During the WAAM process, a specific wire material is fed into a moving nozzle system, where the feed material is melted using an electric arc, plasma beam, or electron beam. The deposition process starts at a base plate with a pre-defined initial temperature and proceeds layer by layer to create a part with complex geometry [10]. In this research work, ongoing work on parallel experimental-numerical simulation, reduced, and real-time modelling strategies for WAAM processes is described. Although single-wire feeding is common practice for metal WAAM processes, owing to the limited alloy compositions available in commercial wires, multiple wire feeding with varying speeds is used as an alternative for rapid and cost-effective experimentation with alloy compositions.

Although different lightweight alloys, including aluminium EN AW-2319, EN AW-5087, and some 4xxx-series alloys, are commonly used for deposition, the use of other high-performance aluminium alloys such as EN AW-6061 and EN AW-7075 is limited by their hot-tearing susceptibility. Recent research on lightweight alloy development has shown that there is considerable potential for high-performance alloy deposition using the WAAM technique [11].

For the case study in this research work, a simple WAAM process consisting of a single layer deposited over a thermally preconditioned wall is simulated. This resembles the typical deposition of a layer during the manufacturing process. The layer is deposited over a pre-built wall measuring 100 mm × 50 mm × 6 mm (single-wall deposition), where the thickness of the layer is 1.6 mm with the same width as the pre-built wall. Aluminium EN AW-6061 wire is used for the deposition, with a single layer deposited over a wall of the same material.

3 Digitalisation Framework

Various digitalisation frameworks have been developed for industrial processes, where process technologies, manufacturing strategies, and process data are used to create a data-driven manufacturing environment that improves energy consumption, efficiency, quality, and the innovative design of manufacturing chains. Digitalisation involves production strategies and manufacturing technologies for acquiring, analysing, and employing digital data to improve and optimise various aspects of the manufacturing process. It generally covers data collection/handling, data management/filtering, digital twinning/shadowing, real-time modelling/monitoring, and automation, through which human-machine interfacing and continuous improvement are achieved.

3.1 Digital Twin and Data Driven Models

Structured and semantic representations of data and knowledge are popular concepts which have been proposed as formal ontological frameworks for industrial material processing. Ontology and its data classification features play a vital role in organising, communicating, interfacing, and sharing information for industrial processes and their sub-processes. Since metadata is becoming increasingly important in the context of industrial processes, both ontology and taxonomy are essential components for managing and organising data and its communication effectively. A hierarchical taxonomy classification scheme, which can be used to categorise and organise process data based on their characteristics/attributes, is the first step towards practical metadata handling and processing. The ontology framework goes one step further and defines relationships between data on the basis of proper logical axiom propositions.

The ontological framework for industrial material processing can provide logical, structured, and systematic representations of different aspects of these processes, including material properties/characteristics, equipment, process routines, and the relationships between different process components in a standard way. Since the digitalisation of material processing involves the transformation and translation of real-world/physical processes and their essential information into digital formats, the role of ontology and data science in converting these processes into digital data and twinning/shadowing schemes is vital. Hence, the combination of a proper ontological framework, an effective digitalisation/twinning scheme, fast real-time predictive-corrective models, and an adequate data handling/processing infrastructure is required to achieve process optimisation, reduced costs, improved efficiency, and enhanced contributions to the circular economy.
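
As a deliberately simplified illustration of the taxonomy/ontology distinction, the sketch below encodes a hypothetical WAAM data taxonomy as a nested dictionary and a few hand-written subject-predicate-object relations standing in for logical axioms; a production ontology would instead use dedicated standards such as OWL/RDF, and all names here are assumptions for illustration.

```python
from dataclasses import dataclass

# Hypothetical, hand-written taxonomy of WAAM process data (illustrative only)
TAXONOMY = {
    "Material": {"Alloy": ["EN AW-6061", "EN AW-2319"], "Wire": ["diameter", "feed_rate"]},
    "Process": {"Thermal": ["torch_power", "base_temperature"], "Kinematic": ["deposition_speed"]},
    "Quality": {"Defects": ["hot_tearing", "warping"], "Microstructure": ["grain_size"]},
}

@dataclass
class OntologyRelation:
    """A simple subject-predicate-object triple standing in for a logical axiom."""
    subject: str
    predicate: str
    obj: str

# Example relationships an ontology could encode between taxonomy entries
RELATIONS = [
    OntologyRelation("EN AW-6061", "is_susceptible_to", "hot_tearing"),
    OntologyRelation("torch_power", "influences", "grain_size"),
    OntologyRelation("deposition_speed", "influences", "warping"),
]

def related_to(entity: str):
    """Return all relations that link the given entity to another (either direction)."""
    return [r for r in RELATIONS if r.subject == entity or r.obj == entity]

for r in related_to("grain_size"):
    print(f"{r.subject} --{r.predicate}--> {r.obj}")
```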

To start developing a general outline for greener and smarter additive manufacturing, with more innovation in terms of accelerated design and optimisation, a general framework can be proposed. The overall concepts of ontology and digital material, along with digital process twinning/shadowing, can be defined within this framework, where the application of real-time models for optimisation and goal-seeking procedures is included. The cornerstone of such a development is to integrate tailored data science techniques into material modelling and characterisation to boost the speed and accuracy of material and product development. Hence, the framework can be designed to deliver a balanced and efficient parallel scheme of physical and data-driven experiments and simulations, where integrated ML and reduced modelling are employed to enhance the framework. Figure 2 shows the proposed framework for the smarter and greener AM production scheme, where the physical-data-driven (PDD) concept is employed to define standard data search spaces for material modelling/characterisation and processing based on chemical, physical, and life-cycle parameters.

Fig. 2

Framework for greener WAAM production scheme with PDD definition of material and process data

3.2 ML and Real-time Modelling

One of the new approaches in the manufacturing industries for increasing the generality and accuracy of physics-based models is to enhance them with the power of data science techniques. This can start from the simple fitting of parameters during processing, or it can be expanded further by generating hybrid physical-data-driven models (so-called auxiliary and augmented models) to broaden the generality of process modelling. The further introduction of ML and smart data processing techniques can enhance process modelling by generating smarter and more general models, where trends and rules can be defined for the available scattered data. Ultimately, this can lead to fast real-time predictor-corrector models for the optimisation and control of AM processes, through which different concepts of a digital process twin/shadow can be established. In this sense, multi-dimensional data search spaces are defined to represent the variation of process and material parameters for the goal-seeking exercise.
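
As a minimal sketch of such a search space, the snippet below defines a handful of hypothetical WAAM parameters with assumed ranges and draws scenario samples with Latin hypercube sampling, one possible sampling scheme; the parameter names and bounds are illustrative, not machine limits from this study.

```python
import numpy as np
from scipy.stats import qmc

# Hypothetical process/material parameters and their ranges (illustrative only)
parameters = {
    "torch_power_W":        (1500.0, 3500.0),
    "deposition_speed_mms": (5.0, 15.0),
    "base_temperature_C":   (25.0, 200.0),
    "wire_feed_rate_mmin":  (4.0, 10.0),
}

lower = np.array([lo for lo, hi in parameters.values()])
upper = np.array([hi for lo, hi in parameters.values()])

# Latin hypercube sampling of the multi-dimensional search space
sampler = qmc.LatinHypercube(d=len(parameters), seed=0)
unit_samples = sampler.random(n=50)                 # 50 process scenarios in [0, 1]^d
scenarios = qmc.scale(unit_samples, lower, upper)   # scaled to the physical ranges

print(scenarios.shape)   # (50, 4): one row per scenario to be simulated
```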

Manufacturing processes are often considered multi-physical and multi-scale, since changes of material phase (e.g., melting, solidification), thermal evolution (heating and cooling), mechanical stress/strain occurrences (e.g., residual stresses/strains), and intrinsic microstructure evolution are natural events in most of these processes [12]. For the multi-scale aspects of these processes, starting from the ab initio state upwards, material characteristics can be developed based on the chain of process parameters, boundary conditions, and properties of alloy components, while for the multi-physical aspects, fluid-thermal-mechanical modelling and simulation are popular practices for the design and optimisation of these processes. Data-driven reduced and real-time models for multi-scale and multi-physical modelling, which involve complex data models considering various physical and scale-related process phenomena, are alternatives that are faster and cheaper than their conventional counterparts.

In this research work, the concept of creating fast and reliable data models for AM processes was pursued, and a robust framework has been set up to create sufficient data for the real-time models of AM processes. It involves several key steps to guarantee the accuracy, reliability, and relevance of the models:

  • The first step is to define the objectives and goals for the real-time models, establishing the targets of these data-driven models and the key parameters related to them (e.g., thermal-mechanical predictions, grain size estimation, cracking/warping).

  • In the next step, the sources of reliable and verified data are identified, and a parallel numerical simulation-experimental framework is set up for data generation and verification.

  • Subsequently, multi-dimensional search spaces and proper sampling schemes are defined based on the key parameters which involve process and material parameters.

  • Furthermore, snapshot matrices are generated based on the sampling method to represent the process characteristics in normal, near-limit, and extreme cases.

  • Based on the snapshot matrices, process scenarios are simulated using verified numerical simulations (e.g., thermal-mechanical)

  • Real-time models are then generated for the WAAM processes using a proper combination of data solvers and interpolators, which involves data decomposition, projection, and interpolation.

  • In the next step, some design-of-experiments (DOEs) cases are carried out to generate validation data for the WAAM real-time models.

  • In the last step, the accuracy and reliability of these fast models are examined through a comparative study between the results of the DOEs and the real-time models for normal, near-limit, and extreme WAAM processes.

For the process databases used for the real-time models, data generation, collection/pre-processing, storage/labelling, and training/validation have been conducted to create effective models for the WAAM process. Figure 3a shows a schematic view of the general multi-scale and multi-physical framework for the digitalisation of the WAAM process, while Fig. 3b presents the graphical workflow for the real-time modelling of WAAM processes using material and process data.

Fig. 3

a Schematic view of the multi-scale and multi-physical framework for WAAM digitalisation; b graphical representation of the real-time modelling workflow for WAAM processes

4 WAAM Case Study

WAAM is a popular direct deposition process which has some advantages over other AM methods, namely a higher productivity rate, the deposition of larger parts, and material availability (e.g., metallic wire). Lightweight metallic alloys are among the common deposition materials for WAAM processing and include aluminium, titanium, and even magnesium alloys in various wire shapes and sizes. Recently, the numerical simulation of WAAM processes (e.g., thermal-mechanical FE simulations) has promoted new routines for optimising and controlling these processes, where higher performance can be achieved by off-line and/or live simulations. However, these numerical simulations are often time consuming and require substantial computing resources for an accurate representation of the physical process. Hence, attention has turned to reduced data and real-time models, with which reasonable predictions/corrections and continuous process improvements can be achieved in much shorter times with fewer computational resources.

4.1 Data Solvers/Interpolators

Different data solvers and interpolators are commonly used in engineering applications for the decomposition and projection of data to reduce dimensionality. However, due to complexities in manufacturing process data (e.g., high heating/cooling gradients, complex boundary and initial values), only a few of these methods can realistically be used to build accurate real-time models for these processes. In this research work, after a comprehensive performance study of various data solvers and interpolators, a few model-building techniques were selected for building the WAAM real-time models.

SVD—The singular value decomposition (SVD) data solver is a more generalised adaptation of the eigen-based proper orthogonal decomposition (POD) technique, which is a dimensionality-reduction scheme based on linear algebra [1, 4]. It works by decomposing the data matrix into three other matrices to capture the important characteristics of the data (i.e., through eigenvalue analyses). The mathematical representation of the SVD technique can be written as [1]:

$$\mathrm{A}\mathrm{x}=\mathrm{F}(\mathrm{x},\mathrm{t})\;\rightarrow\;\mathrm{F}(\mathrm{x},\mathrm{t})=\mathrm{U}\Sigma V^{T}$$
(4)

where x is the eigenvector of A, the decomposed matrices U and VT are orthogonal matrices related to the spatial and temporal decomposition, and Σ is the diagonal matrix of singular values.
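
As an illustration of Eq. (4), the following sketch decomposes a synthetic snapshot matrix with numpy and truncates it to the dominant modes; the data, the 99.9% energy criterion, and the matrix sizes are illustrative assumptions only.

```python
import numpy as np

# Synthetic snapshot matrix F(x, t): rows = spatial points, columns = time steps
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 200)
t = np.linspace(0.0, 1.0, 60)
F = np.outer(np.sin(np.pi * x), np.exp(-t)) + 0.01 * rng.standard_normal((200, 60))

# Economy-size SVD: F = U * Sigma * V^T, as in Eq. (4)
U, S, Vt = np.linalg.svd(F, full_matrices=False)

# Keep only the k dominant modes that capture most of the data "energy"
k = np.searchsorted(np.cumsum(S**2) / np.sum(S**2), 0.999) + 1
F_k = U[:, :k] @ np.diag(S[:k]) @ Vt[:k, :]

print(f"kept {k} of {S.size} modes, "
      f"relative reconstruction error = {np.linalg.norm(F - F_k) / np.linalg.norm(F):.2e}")
```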

Regression—Conventional and symbolic regression techniques are popular data solver techniques for modelling the relationship between variables (e.g., process parameters), with the goal of predicting or estimating the variation of system responses based on one or more other variables (the independent or predictor variables). Different regression techniques such as linear, Ridge and Lasso, polynomial, support vector regression (SVR), random forest regression (RFR), gradient boosting regression (GBR), and time-series regression are widely used in various branches of science and engineering [1].
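
A brief example of one of the listed techniques (gradient boosting regression, via scikit-learn) applied to a synthetic process response is sketched below; the feature names, ranges, and response model are hypothetical and serve only to show the fit-and-score pattern.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

# Synthetic data: predict a peak temperature from two process parameters (illustrative)
rng = np.random.default_rng(0)
X = rng.uniform([1500.0, 5.0], [3500.0, 15.0], size=(300, 2))   # power [W], speed [mm/s]
y = 400.0 + 0.5 * X[:, 0] - 30.0 * X[:, 1] + rng.normal(0.0, 10.0, 300)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = GradientBoostingRegressor(random_state=0).fit(X_train, y_train)
print(f"R^2 on held-out scenarios: {model.score(X_test, y_test):.3f}")
```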

Kriging—One of the popular data solvers/interpolators for engineering applications is Kriging, a statistical scheme capable of modelling the spatial or temporal variation of parameters and providing accurate estimates at unsampled times/locations. It uses the values observed at sampled locations to make predictions, while considering the spatial correlation with nearby points and minimising the uncertainty associated with those predictions [1]. Different estimation methods are used in the Kriging technique, including ordinary, simple, and universal Kriging, where constant and non-constant mean values along with variogram spatial dependency and covariances are used to improve the predictions.
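
As a minimal illustration, Kriging-style prediction with uncertainty can be sketched using scikit-learn's Gaussian process regressor (a Gaussian process with a constant mean behaves like ordinary Kriging); the 1-D sample data below are synthetic and the kernel choice is an assumption.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

# Sampled locations and observed values (synthetic 1-D example)
X_obs = np.array([[0.0], [0.2], [0.45], [0.7], [1.0]])
y_obs = np.sin(2.0 * np.pi * X_obs).ravel()

# Gaussian process surrogate ~ ordinary Kriging with an RBF covariance model
kernel = ConstantKernel(1.0) * RBF(length_scale=0.2)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X_obs, y_obs)

# Prediction and uncertainty at unsampled locations
X_new = np.linspace(0.0, 1.0, 11).reshape(-1, 1)
mean, std = gp.predict(X_new, return_std=True)
print(np.round(mean, 3))
print(np.round(std, 3))    # larger std away from the sampled points
```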

4.2 WAAM Real-time Models

Different real-time model building schemes have been employed for WAAM processes, considering the thermal and mechanical stress/strain evolution within the deposited part along with the dynamic development of the process. These models are intended to provide insight, predictions/corrections, and process control routines during the actual manufacturing process. In this research work, a real-time predictive-corrective modelling scheme has been implemented which employs the predictive power of reduced models and proper ML schemes. Consider the decomposition of the temperature field for WAAM deposition as

$$\left[\mathrm{T}(\mathrm{x},\mathrm{t})\right]=\mathrm{U}\Sigma V^{T}\;\rightarrow\;\mathrm{T}_{k}(\mathrm{x},\mathrm{y},\mathrm{z},\mathrm{t})=\mathrm{U}_{k}\Sigma_{k} V_{k}^{T}$$
(5)

where Tk is the reduced (rank-k) representation of the temperature field at process time t. After the decomposition of the temperature data, the interpolation for unsampled points can be performed using a radial basis function (RBF) or adaptive radial basis function as

$$\mathrm{f}(\mathrm{T})=\sum_{i=1}^{k} a_{i}\,\varphi\left(\left\Vert \mathrm{T}-\mathrm{T}_{i}\right\Vert\right)$$
(6)

where ai are the weighting coefficients and φ is the radial basis function. To start generating enough data for a WAAM real-time model, the following steps were carried out (a minimal sketch of the combined decomposition and interpolation steps follows the list):

  • Using the verification described in Sect. 2.2, the FE simulation of the WAAM process has been calibrated for further scenarios.

  • Using material and process parameters, the sizes and boundaries of the search spaces are defined (e.g., deposition machine limits).

  • A snapshot matrix has then been defined within the multi-dimensional search space to carry out FE scenarios with varying parameters.

  • FE results for these scenarios are post-processed to form a process database for model building exercises.

  • Real-time models were built using different combinations of data solvers-interpolators based on the process database.

  • Some design of experiments (DOEs) cases have also been performed using FE simulations and their results are used to validate these real-time models at normal, near-boundary and extreme cases.
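
Following the steps above, a minimal offline/online sketch is given below, assuming a synthetic snapshot matrix and hypothetical process parameters (torch power and deposition speed); the decomposition corresponds to Eq. (5) and the coefficient interpolation to Eq. (6), here implemented with scipy's RBFInterpolator as one possible choice of interpolator.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# --- Offline stage -----------------------------------------------------------
# Hypothetical snapshot matrix: each column is the nodal temperature field T(x)
# of one simulated scenario; `params` holds the process parameters per scenario.
rng = np.random.default_rng(0)
n_nodes, n_scenarios = 500, 40
params = rng.uniform([1500.0, 5.0], [3500.0, 15.0], size=(n_scenarios, 2))  # power, speed
x = np.linspace(0.0, 1.0, n_nodes)
snapshots = np.column_stack([
    p[0] / 3500.0 * np.exp(-((x - 0.5) ** 2) / (0.01 * p[1])) for p in params
])

# Decomposition of the temperature data, cf. Eq. (5): T = U * Sigma * V^T
U, S, Vt = np.linalg.svd(snapshots, full_matrices=False)
k = 5                                  # truncation rank (illustrative)
coeffs = np.diag(S[:k]) @ Vt[:k, :]    # modal coefficients per scenario, shape (k, n_scenarios)

# RBF interpolation of each modal coefficient over the parameter space, cf. Eq. (6)
interpolators = [RBFInterpolator(params, coeffs[i]) for i in range(k)]

# --- Online (real-time) stage ------------------------------------------------
def predict_field(power_W: float, speed_mms: float) -> np.ndarray:
    """Estimate the full temperature field for an unsampled parameter set."""
    p_new = np.array([[power_W, speed_mms]])
    a = np.array([interp(p_new)[0] for interp in interpolators])   # interpolated coefficients
    return U[:, :k] @ a                                            # reconstructed field T_k(x)

T_est = predict_field(2500.0, 9.0)
print(T_est.shape, float(T_est.max()))
```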

5 Discussion

To assess the accuracy, efficiency, and reliability of the fast real-time models for WAAM processes, a comparative study has been carried out, firstly to compare the DOE results with the real-time models and secondly to compare the results of the different solver-interpolator techniques. Comparing the results of detailed numerical simulations with fast real-time models involves extracting data at key points within the model for comparison purposes. For real-time model building, it is crucial to have a thorough understanding of the process, its multi-physical state, and the source of the data for model building (e.g., live sensor data, simulation and/or experimental data, mined data). The data preparation, including handling and filtering schemes, along with the selection of an appropriate data solver and interpolator, contributes to the success of the model building exercise.

Figure 4a, b show the transient temperature estimates at a key point along the deposited wall for the DOE simulations and the real-time models (SVD and Kriging techniques). The computational time for the DOE simulations, including data input/output (IO), is recorded as 2665 s (wall clock time) per scenario on a cluster computing unit with four cores, while the recorded time for the reduced models on a single-core processor is about 0.7 s for the estimation of the whole transient response. Figure 4c shows the comparison of the calculated normalised errors for both the SVD and Kriging models with respect to the DOE simulation results, while Fig. 4d shows the Pearson linear correlation index for the temperature predictions of the real-time models when the temperature gradients are changing.

Fig. 4

a, b Transient temperature results for FE and real-time reduced order models (ROMs); c their calculated normalised errors; and d correlation index for temperature rate predictions

During the WAAM process, when the torch passes over the measuring points, there is a sudden temperature rise in the material (with partial re-melting). Due to this very high temperature gradient, the predictions made by the real-time data models might diverge significantly (rate dependency) from the physical measurements.

To investigate the performance of these real-time models at high heating/cooling rates, a comprehensive study has been conducted to examine the accuracy of the different model building techniques. The Pearson linear correlation index between the calculated normalised error and the temperature gradient shows that some data solvers perform more efficiently for processes with high data gradients, as shown in Fig. 4d.
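
For reference, the two comparison quantities can be computed as in the minimal sketch below, which assumes synthetic placeholder histories at a single key point rather than the actual DOE and real-time model results of this study.

```python
import numpy as np
from scipy.stats import pearsonr

# Synthetic placeholder histories at one key point (DOE simulation vs. real-time model)
t = np.linspace(0.0, 60.0, 600)
T_doe = 300.0 + 1500.0 * np.exp(-((t - 20.0) ** 2) / 8.0)   # reference FE/DOE result
T_rom = T_doe + 20.0 * np.sin(0.5 * t)                      # real-time model estimate

# Normalised error with respect to the DOE temperature range
norm_error = np.abs(T_rom - T_doe) / (T_doe.max() - T_doe.min())

# Pearson linear correlation between the error and the temperature gradient,
# used here to expose the rate dependency of the real-time model (cf. Fig. 4d)
dTdt = np.gradient(T_doe, t)
r, _ = pearsonr(np.abs(dTdt), norm_error)
print(f"max normalised error: {norm_error.max():.3f}, correlation with |dT/dt|: {r:.2f}")
```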

6 Conclusion

Data-driven reduced modelling concepts can notionally be used to build real-time models for manufacturing processes and sub-process applications. However, due to the intrinsic complexities of these processes, including multi-physical and multi-scale phenomena, their applications need to be scrutinised for performance, reliability, and accuracy. In comparison with detailed numerical simulations, real-time data models can provide fast insights into the process conditions and operational control, while sophisticated multi-physical numerical simulations, with their coupled mathematical models, can offer accurate analyses and understanding of complex processes at the cost of vast computational resources and long CPU times. In this research work, a brief description of the numerical simulation techniques for AM processes was presented at the start of the contribution. In the following sections, a short introduction to digitalisation frameworks and their popular reduced-model building techniques was given. In the last parts of the contribution, real-time models for a real-world case study (i.e., the WAAM process) were elaborated, and the demanding aspects of accuracy and heating-rate dependency were examined. The detailed study has shown that the adoption of appropriate combinations of solvers and interpolators can produce reliable results even for processes with high heating rates. As a final comment, it should be mentioned that these data models are not envisioned to replace numerical simulations and/or experimental work, yet they can be a valuable tool for the digitalisation of manufacturing processes.