Abstract
Decision making for competitive production in high-wage countries is a daily challenge in which rational and irrational methods are used. The design of decision-making processes is an intriguing, discipline-spanning science. However, there are gaps in understanding the impact of the known mathematical and procedural methods on the usage of rational choice theory. Following Benjamin Franklin’s rule for decision making, formulated in London in 1772 and called by him “Prudential Algebra” in the sense of prudential reasons, one of the major ingredients of MetaModelling can be identified, finally leading to one algebraic value labelling the results (criteria settings) of alternative decisions (parameter settings). This work describes advances in MetaModelling techniques applied to multidimensional and multicriterial optimization in laser processing, e.g. sheet metal cutting, including the generation of fast and frugal MetaModels with controlled error based on mathematical-physical or numerical model reduction. Reduced models are derived to avoid any unnecessary complexity. The advances of the MetaModelling technique are based on three main concepts: (i) classification methods that decompose the space of process parameters into feasible and non-feasible regions, or monotone regions, facilitating optimization, (ii) smart sampling methods for faster generation of a MetaModel, and (iii) a method for multidimensional interpolation using a radial basis function network that continuously maps the discrete, multidimensional sampling set containing the process parameters as well as the quality criteria. Both model reduction and optimization on a multidimensional parameter space are improved by exploring the data mapping within an advancing “Cockpit” for Virtual Production Intelligence.
1 Introduction
Routes of Application
At least two routes of direct application are currently enabled by MetaModelling, namely decision making and evaluation of description models. While calculating multiobjective weighted criteria resulting in one algebraic value applies to decision making, multiparameter exploration of the values of one selected criterion is used to evaluate the mathematical model from which the MetaModel was generated.
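As a minimal illustration of the first route, the following sketch collapses the normalized criteria of two alternative parameter settings into one algebraic value each, in the spirit of Franklin's Prudential Algebra; the criteria names, values and weights are illustrative assumptions, not values from this chapter.

```python
# Hypothetical sketch: collapse several quality criteria of alternative
# parameter settings into one algebraic value via a weighted sum.
# Criteria and weights are illustrative assumptions.

def prudential_score(criteria, weights):
    """Return a single algebraic value for one alternative.

    criteria -- dict mapping criterion name to a normalized value in [0, 1],
                where larger is better
    weights  -- dict mapping criterion name to its relative importance
    """
    total_weight = sum(weights.values())
    return sum(weights[name] * criteria[name] for name in weights) / total_weight

# two alternative parameter settings, rated on normalized criteria
alternative_a = {"roughness": 0.8, "speed": 0.4, "gas_consumption": 0.9}
alternative_b = {"roughness": 0.6, "speed": 0.7, "gas_consumption": 0.5}
weights = {"roughness": 3.0, "speed": 2.0, "gas_consumption": 1.0}

score_a = prudential_score(alternative_a, weights)
score_b = prudential_score(alternative_b, weights)
best = "A" if score_a > score_b else "B"
```

The single score per alternative is exactly the “one algebraic value” that makes alternatives directly comparable.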
Visual exploration and Dimensionality Reduction
More sophisticated usage of MetaModelling deals with visual exploration and data manipulation such as dimensionality reduction. Tools for viewing multidimensional data (Asimov 2011) are well known from the literature. Visual exploration of high-dimensional scalar functions (Gerber 2010) today focuses on steepest-gradient representation on a global support, also called the Morse–Smale complex. The scalar function represents the value of the criterion as a function of the different parameters. As a result, at least one trace of steepest gradient is visualized connecting an optimum with a minimum of the scalar function. Typically, the global optimum performance of the system, which is represented by a specific point in the parameter space, can be traced back along different traces corresponding to the different minima. These traces can be followed visually through the high-dimensional parameter space, revealing the technical parameters or physical reasons for any deviation from the optimum performance.
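A steepest-gradient trace of the kind such tools visualize can be sketched numerically; the scalar test function below is an assumption chosen only for demonstration and is not the cited Morse–Smale implementation.

```python
import numpy as np

# Illustrative sketch: follow the steepest-descent trace of a scalar
# criterion f over a 2D parameter space from a seed point to a local
# minimum. The test function is an assumption for demonstration.

def f(p):
    x, y = p
    return (x - 1.0) ** 2 + 2.0 * (y + 0.5) ** 2  # minimum at (1, -0.5)

def grad(p, h=1e-6):
    # central finite differences
    g = np.zeros_like(p)
    for i in range(p.size):
        e = np.zeros_like(p)
        e[i] = h
        g[i] = (f(p + e) - f(p - e)) / (2.0 * h)
    return g

def descent_trace(seed, step=0.05, tol=1e-8, max_iter=10000):
    trace = [np.asarray(seed, dtype=float)]
    for _ in range(max_iter):
        g = grad(trace[-1])
        if np.dot(g, g) < tol:
            break
        trace.append(trace[-1] - step * g)
    return np.array(trace)

trace = descent_trace([-2.0, 2.0])
minimum = trace[-1]  # ends near the local minimum (1, -0.5)
```

Each intermediate point of `trace` corresponds to one step along the steepest-gradient path that such visualizations draw through the parameter space.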
Analytical methods for dimensionality reduction, e.g. the well-known Buckingham Π-theorem (Buckingham 1914), have been applied for 100 years to determine the dimensionality as well as the possible dimensionless groups of parameters. Buckingham’s ideas can be transferred to data models. As a result, methods for estimating the dimension of data models (Schwarz 1978), for dimensionality reduction of data models, and for identification of suitable data representations (Belkin 2003) have been developed.
Value chain and discrete to continuous support
The value chain of MetaModelling related to decision making enables the benefit rating of alternative decisions based on improvements resulting from iterative design optimization, including model prediction and experimental trial. One essential contribution of MetaModelling is to overcome the drawback of experimental trials generating only sparse data in a high-dimensional parameter space. Models from mathematical physics are intended to provide criterion data for parameter values dense enough for successful MetaModelling. Interpolation in MetaModelling changes the discrete support of parameters (sampling data) to a continuous support. As a result of the continuous support, rigorous mathematical methods for data manipulation become applicable, generating virtual propositions.
Resolve the dichotomy of cybernetic and deterministic approaches
MetaModelling can be seen as a route to resolving the separation between cybernetic/empirical and deterministic/rigorous approaches, bringing them together and making use of the advantages of both. The rigorous methods involved in MetaModelling may introduce heuristic elements into empirical approaches, and the analysis of data, e.g. via sensitivity measures, may reveal the sound basis of empirical findings and give hints to reduce the dimension of MetaModels, or at least to partially estimate the structure of the solution not obvious from the underlying experimental/numerical data or mathematical equations.
2 MetaModelling Methods
In order to gain better insight and improve the quality of the process, the procedure of conceptual design is applied. Conceptual design is defined as creating new innovative concepts from simulation data (Currie 2005). It allows creating and extracting specific rules that potentially explain complex processes depending on industrial needs.
Before applying this concept, the developers validate their model by performing one single simulation run (see Sect. 3.3, Sheet Metal Drilling), fitting one model parameter to experimental evidence. This approach requires sound phenomenological insight. Instead of fitting the model parameter to experimental evidence, complex multiphysics numerical calculations can be used to fit the empirical parameters of the reduced model. This requires considerable scientific modelling effort in order to achieve good results comparable to real-life experimental investigation.
Once the model is validated and good results are achieved, conceptual design analysis becomes possible, either to understand the complexity of the process, to optimize it, or to detect dependencies. The conceptual design analysis is based on simulations that are performed for different parameter settings within the full design space. This allows for a complete overview of the solution properties and contributes well to the design optimization process (Auerbach et al. 2011). However, the challenge arises when either the number of parameters increases or the time required for each single simulation grows.
These drawbacks can be overcome by the development of fast approximation models called metamodels. These metamodels mimic the real behavior of the simulation model by considering only the input-output relationship, in a simpler manner than the full simulation (Reinhard 2014). Although the metamodel is not as accurate as the simulation model, it is still possible to analyze the process with decreased time constraints, since the developer is looking for tendencies or patterns rather than exact values. This allows analyzing the simulation model much faster with controlled accuracy.
Metamodelling techniques rely on generating and selecting the appropriate model for different processes. They basically consist of three fundamental steps:

1. Sampling: the creation and extraction of simulation data,

2. Interpolation: the mapping of the discrete sampling points into a continuous relationship,

3. Exploration: visualization of and user interaction with this continuous mapping.
2.1 Sampling
Sampling is concerned with the selection of discrete data sets that contain both input and output of a process in order to estimate or extract characteristics or dependencies. The procedure for efficiently sampling the parameter space is addressed by many Design of Experiments (DOE) techniques. A survey of DOE methods focusing on likelihood methods can be found in the contribution of Ferrari and Borrotti (2014). The basic form is the factorial design (FD), where data is collected for all possible combinations of different predefined sampling levels spanning the full parameter space (Box and Hunter 1978).
However, for a high-dimensional parameter space, the size of the FD data set increases exponentially with the number of parameters considered. This leads to the well-known “curse of dimensionality” coined by Bellman (Bellman 1957): an unmanageable number of runs would have to be conducted to sample the parameter space adequately. When the simulation runs are time consuming, these FD designs can be inefficient or even inappropriate for simulation models (Kleijnen et al. 2005).
The techniques suited to simulation DOE are those whose sampling points are spread over the entire design space. They are known as space-filling designs (Box and Hunter 1978); the two well-known methods are orthogonal arrays and the Latin Hypercube design.
The appropriate sample size depends not only on the dimension of the parameter space but also on the computational time for a simulation run; a complex nonlinear function requires more sampling points. A proper way to use these DOE techniques in simulation is to maximize the minimum Euclidean distance between the sampling points, so that the developer guarantees that the sampling points are spread over all regions of the parameter space (Jurecka 2007).
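The maximin idea can be sketched as follows: generate several random Latin Hypercube candidates and keep the one with the largest minimum pairwise distance. This is an assumption-level illustration, not the chapter's actual DOE tool.

```python
import numpy as np

# Sketch of a maximin space-filling design: draw random Latin Hypercube
# candidates and keep the one maximizing the minimum pairwise Euclidean
# distance between sampling points.

def latin_hypercube(n_points, n_dims, rng):
    # one stratified, permuted coordinate per dimension
    u = (rng.random((n_points, n_dims)) + np.arange(n_points)[:, None]) / n_points
    for d in range(n_dims):
        u[:, d] = u[rng.permutation(n_points), d]
    return u

def min_pairwise_distance(x):
    diff = x[:, None, :] - x[None, :, :]
    dist = np.sqrt((diff ** 2).sum(-1))
    return dist[np.triu_indices(len(x), k=1)].min()

def maximin_lhs(n_points, n_dims, n_candidates=50, seed=0):
    rng = np.random.default_rng(seed)
    candidates = [latin_hypercube(n_points, n_dims, rng) for _ in range(n_candidates)]
    return max(candidates, key=min_pairwise_distance)

design = maximin_lhs(n_points=20, n_dims=5)  # 20 samples in a 5D unit cube
```

Each column of `design` keeps the Latin Hypercube property (exactly one point per stratum), while the candidate selection pushes the points apart.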
2.2 Interpolation
The process in which deterministic discrete points are transformed into a connected continuous function is called interpolation. One important aspect of Virtual Production Intelligence (VPI) systems is the availability of interpolating models that represent the process behavior (Reinhard 2013): the metamodels. In VPI, metamodelling techniques offer excellent possibilities for describing the process behavior of technical systems (Jurecka 2007; Chen 2001), since MetaModelling defines a procedure to analyze and simulate involved physical systems using fast mathematical models (Sacks 1989). These mathematical models create cheap numeric surrogates that describe the cause-effect relationship between setting parameters as input and product quality variables as output for manufacturing processes. Among the available MetaModelling techniques are artificial neural networks (Haykin 2009), linear regression/Taylor expansion (Montgomery et al. 2012), Kriging (Jones 1998; Sacks 1989; Lophaven 2002), and radial basis function networks (RBFNs). The RBFN is well known for its accuracy and its ability to generate multidimensional interpolations for complex nonlinear problems (Rippa 1999; Mongillo 2010; Orr 1996). A radial basis function interpolation, represented in Fig. 6.1 below, is similar to a three-layer feed-forward neural network. It consists of an input layer which is modeled as a vector of real numbers, a hidden layer that contains nonlinear basis functions, and an output layer which is a scalar function of the input vector.
The output of the network f(x) is given by:

\( f(x) = \sum_{i=1}^{n} w_{i} \, h_{i}(x) \)

where \( n, h_{i}, w_{i} \) correspond to the number of sampling points of the training set, the ith basis function, and the ith weight, respectively. The RBF methodology was introduced in 1971 by Rolland Hardy, who originally presented the method for the multiquadric (MQ) radial function (Hardy 1971). The method emerged from a cartography problem, where a bivariate interpolant of sparse and scattered data was needed to represent topography and produce contours. However, none of the existing interpolation methods (Fourier, polynomial, bivariate splines) were satisfactory, because they were either too smooth or too oscillatory (Hardy 1990). Furthermore, the nonsingularity of their interpolation matrices was not guaranteed. In fact, Haar’s theorem states that, in two or higher dimensions, there exist distinct nodes for which the interpolation matrix associated with node-independent basis functions is singular (McLeod 1998). In 1982, Richard Franke popularized the MQ method with his report on 32 of the most commonly used interpolation methods (Franke 1982). Franke also conjectured the unconditional nonsingularity of the interpolation matrix associated with the multiquadric radial function, which was later proved by Micchelli (Micchelli 1986). The multiquadric function is used for the basis functions \( h_{i} \):

\( h_{i}(x) = \sqrt{\left\| x - x_{i} \right\|^{2} + r^{2}} \)
where \( x_{i} \) and \( r \) represent the ith sampling point and the width of the basis function, respectively. The shape parameter \( r \) controls the width of the basis function: the smaller the parameter, the narrower the basis function becomes, while larger values yield wider, flatter functions. This is illustrated in Fig. 6.2 below.
The learning of the network is performed by applying the method of least squares, with the aim of minimizing the sum squared error with respect to the weights \( w_{i} \) of the model (Orr 1996). Thus, the learning/training is done by minimizing the cost function

\( C = \sum_{i=1}^{n} \left( y_{i} - f(x_{i}) \right)^{2} + \lambda \sum_{i=1}^{n} w_{i}^{2} \)

where \( \lambda \) is the usual regularization parameter and \( y_{i} \) are the criterion values at the points \( x_{i} \). Solving the minimization problem above yields the weight vector

\( \mathbf{w} = \left( H^{T} H + \lambda I \right)^{-1} H^{T} \mathbf{y} \)

with

\( H_{ij} = h_{j}(x_{i}), \quad i, j = 1, \ldots, n \)

and

\( \mathbf{y} = \left( y_{1}, \ldots, y_{n} \right)^{T} \)
The chosen width of the radial basis function plays an important role in obtaining a good approximation. The following selection of the r value was proposed by Hardy (1971) and adopted for this study:

\( r = 0.815 \, d, \quad d = \frac{1}{n} \sum_{i=1}^{n} d_{i} \)

where \( d_{i} \) is the distance between the ith data point and its nearest neighbor.
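Putting the pieces above together, a minimal multiquadric RBFN can be sketched as follows, assuming the regularized least-squares weights and Hardy's width rule stated above; all function names are illustrative.

```python
import numpy as np

# Minimal multiquadric RBFN sketch: basis h_i(x) = sqrt(||x - x_i||^2 + r^2),
# weights from regularized least squares, and Hardy's rule r = 0.815 * d
# with d the mean nearest-neighbor distance.

def hardy_width(x):
    dist = np.sqrt(((x[:, None, :] - x[None, :, :]) ** 2).sum(-1))
    np.fill_diagonal(dist, np.inf)
    return 0.815 * dist.min(axis=1).mean()  # r = 0.815 * mean of d_i

def rbf_matrix(x_eval, x_train, r):
    d2 = ((x_eval[:, None, :] - x_train[None, :, :]) ** 2).sum(-1)
    return np.sqrt(d2 + r ** 2)             # multiquadric basis values

def train_rbfn(x, y, lam=1e-8):
    r = hardy_width(x)
    H = rbf_matrix(x, x, r)
    # w = (H^T H + lam * I)^{-1} H^T y
    w = np.linalg.solve(H.T @ H + lam * np.eye(len(x)), H.T @ y)
    return lambda q: rbf_matrix(np.atleast_2d(q), x, r) @ w

# usage: interpolate a smooth 2D criterion from scattered samples
rng = np.random.default_rng(1)
x_train = rng.random((60, 2))
y_train = np.sin(3 * x_train[:, 0]) + x_train[:, 1] ** 2
model = train_rbfn(x_train, y_train)
```

With a small regularization parameter the surrogate nearly interpolates the training data, while larger `lam` trades accuracy for smoothness.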
2.3 Exploration
Visualization is very important for analyzing huge sets of data, as it allows efficient decision making. Therefore, multidimensional exploration or visualization tools are needed. 2D contour plots or 3D cube plots can easily be generated by any conventional mathematical software, but visualization of high-dimensional simulation data remains a core field of interest. An innovative method was developed by Gebhardt (2013) for the Virtual Production Intelligence (VPI) in the second phase of the Cluster of Excellence “Integrative Production Technology for High-Wage Countries”. It relies on a hyperslice-based visualization approach that uses hyperslices in combination with direct volume rendering. The tool not only allows visualizing the metamodel with the training points and the gradient trajectory, but also assures fast navigation that helps in extracting rules from the metamodel, hence offering a user interface. The tool was developed on a virtual reality platform of RWTH Aachen known as the aixCAVE. Another interesting method, the Morse–Smale complex, can also be used. It captures the behavior of the gradient of a scalar function on a high-dimensional manifold (Gerber 2010) and thus can give a quick overview of high-dimensional relationships.
3 Applications
In this section, the metamodelling techniques are applied to different laser manufacturing processes. In the first two applications (laser sheet metal cutting and laser epoxy cutting), a data-driven metamodelling process was used, where the models were considered as black boxes and a learning process was applied directly to the data. In the last two applications (drilling and glass ablation), a model-driven metamodelling process was applied.
The goal of this section is to highlight the importance of using the proper metamodelling technique in order to generate a specific metamodel for every process. The developer should realize that generating a metamodel is a demanding procedure that involves compromises between many criteria, and that the metamodel with the greatest accuracy is not necessarily the best choice. The proper metamodel is the one which best fits the developer’s needs. These needs have to be prioritized according to characteristics or criteria as defined by Franke (1982). The major criteria are accuracy, speed, storage, visual aspects, sensitivity to parameters, and ease of implementation.
3.1 Sheet Metal Cutting with Laser Radiation
The major quality criterion in laser cutting applications is the formation of adherent dross and ripple structures on the cutting kerf surface, accompanied by a set of properties like gas consumption, robustness with respect to the most sensitive parameters, nozzle stand-off distance and others. The ripples, measured via the cut surface roughness, are generated by fluctuations of the melt flow during the process. One of the main research demands is to choose parameter settings for the beam-shaping optics that minimize the ripple height and changes of the ripple structure on the cut surface. A simulation tool called QuCut reveals the occurrence of ripple formation at the cutting front and defines a measure for the roughness on the cutting kerf surface. QuCut was developed at Fraunhofer ILT and the department Nonlinear Dynamics of Laser Processing (NLD) at RWTH Aachen as a numerical simulation tool for CW laser cutting taking into account spatially distributed laser radiation. The goal of this use case was to find the optimal parameters of certain laser optics that result in a minimal ripple structure (i.e. roughness). The 5 design parameters of the laser optics (i.e. the dimensions of the vector in formulas (6.1–6.5)) investigated here are the beam quality, the astigmatism, the focal position, and the beam radii in x and y directions of the elliptical laser beam under consideration. The properties of the fractional factorial design are listed in Table 6.1.
The selected criterion (i.e. the \( y \)-vector in formulas (6.3–6.5)) was the surface roughness (Rz in µm), simulated at 7 mm depth of an 8 mm workpiece. The full data set comprised 24948 samples in total. In order to assess the quality of the mathematical interpolation, 5 different RBFN metamodels were generated from 5 randomly selected sample sets of size 1100, 3300, 5500, 11100 and 24948 data points from the total dataset. As shown in Fig. 6.3, the metamodels are denoted Metamodel A–E. Metamodel F, a 2D metamodel with finer sampling points denoted by the blue points, is used as a reference for comparison.
A 2-fold cross-validation method was then used to assess the quality of the metamodelling, where 10 % of the training sample was randomly left out of the interpolation step and used for validation purposes. The Mean Absolute Error (MAE) of the criterion surface roughness and the coefficient of determination (R²) were then calculated and compared. The results are listed in Table 6.2.
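The hold-out step described above can be sketched as follows; the placeholder metamodel is a hypothetical stand-in for the trained RBFN.

```python
import numpy as np

# Sketch of hold-out validation: randomly withhold 10 % of the training
# points, predict them with the metamodel, then compute the MAE and the
# coefficient of determination R^2.

def validate(predict, x, y, holdout=0.1, seed=0):
    rng = np.random.default_rng(seed)
    n_val = max(1, int(holdout * len(x)))
    idx = rng.permutation(len(x))[:n_val]
    y_true, y_pred = y[idx], predict(x[idx])
    mae = np.abs(y_true - y_pred).mean()
    ss_res = ((y_true - y_pred) ** 2).sum()
    ss_tot = ((y_true - y_true.mean()) ** 2).sum()
    return mae, 1.0 - ss_res / ss_tot

# placeholder metamodel: the true criterion plus a constant offset
x = np.linspace(0.0, 1.0, 100)[:, None]
y = (4.0 * x[:, 0] - 2.0) ** 2
mae, r2 = validate(lambda q: (4.0 * q[:, 0] - 2.0) ** 2 + 0.05, x, y)
```

A constant prediction offset shows up directly in the MAE, while R² stays close to 1 because the offset barely affects the explained variance.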
The results show that the quality of the metamodel depends on the number of sampling points; the quality improves as the number of training points increases. Contour plots, which in their entirety form the process map, were used as visualization technique. The star-shaped marker, denoting the seed point of the investigation, represents the current cutting parameter settings, and the arrow trajectory shows how an improvement in cut quality is achieved. The results show that, in order to minimize the cutting surface roughness in the vicinity of the seed point, the beam radius in the feed direction x should be decreased and the focal position should be increased (Eppelt and Al Khawli 2014). In the special application case studied here, the minimum number of sampling points with an RBFN model is already a good choice for giving an optimized working point for the laser cutting process. These metamodels have different accuracy values, but having an overview of the generated tendency can support the developer in his decision-making step.
3.2 Laser Epoxy Cut
One of the challenges in cutting glass-fiber reinforced plastic with a pulsed laser beam is to estimate the achievable cutting quality. An important factor for process improvement is first to detect the physical cutting limits and then to minimize the damage thickness of the epoxy-glass material. EpoxyCut, also developed at Fraunhofer ILT and the department Nonlinear Dynamics of Laser Processing (NLD) at RWTH Aachen, is a global reduced model that calculates the upper and lower cutting width, in addition to other criteria like the melting threshold, the time required to cut through, and the damage thickness. The goal of this test case was to generate a metamodel of the process in order to: (i) minimize the lower cutting width; (ii) detect the cutting limits; and (iii) efficiently generate an accurate metamodel using a minimal number of simulation runs. The process parameters are the pulse duration, the laser power, the focal position, the beam diameter and the Rayleigh length. In order to better illustrate the idea of the smart sampling technique, the focal position, the beam diameter, and the Rayleigh length were fixed. To generate a fine metamodel, a 20-level full factorial design was selected; this leads to a training data set containing 400 simulation runs in total, illustrated as small white round points in Fig. 6.4.
The metamodel takes the discrete training data set as an input, and provides the operator with a continuous relationship of the pulse duration and laser power (parameters) and cutting width (quality) as an output.
To address the first goal, minimizing the lower cutting width, the metamodel above allows a general overview of the 2D process model.
It can be clearly seen that one should decrease either the pulse duration or the laser power so that the cutting limits (the blue region marks the no-cut region) are not reached. The second goal was to include the cutting limits in the metamodel generation. When performing global interpolation, the mathematical value that represents the no-cut region (set to 0 by the user in this case) distorts the global interpolation.
To demonstrate this, a Latin Hypercube design with 49 training points (big white circles) was used with an RBFN interpolation. The results are shown in Fig. 6.5.
From the results in Fig. 6.5, the developer can see that a process containing discontinuities, i.e. feasible and non-feasible points (in this case cut and no cut), should first be classified into a feasible metamodel and a dichotomy metamodel.
The feasible metamodel improves the prediction accuracy in the cut region, and the dichotomy metamodel states whether a prediction lies in the feasible or the non-feasible domain. This is one of the development fields on which the authors of this paper are focusing.
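A minimal sketch of this two-model split, assuming a 1-nearest-neighbor classifier for the dichotomy model and an inverse-distance interpolant for the feasible model (both simplifying assumptions, not the authors' method):

```python
import numpy as np

# Two-model split: a "dichotomy" classifier predicts cut / no-cut, and an
# interpolating "feasible" model is trained on the cut region only, so the
# zero values of the no-cut region cannot distort the interpolation.

def nearest_neighbor_class(x_query, x_train, feasible):
    d = ((x_train - x_query) ** 2).sum(-1)
    return bool(feasible[np.argmin(d)])

def idw_predict(x_query, x_train, y_train, eps=1e-12):
    d = np.sqrt(((x_train - x_query) ** 2).sum(-1))
    w = 1.0 / (d + eps)
    return (w * y_train).sum() / w.sum()

def predict(x_query, x_train, y_train, feasible):
    if not nearest_neighbor_class(x_query, x_train, feasible):
        return None                         # non-feasible: no cut predicted
    return idw_predict(x_query, x_train[feasible], y_train[feasible])

# toy data: no cut below a power threshold, cut width grows above it
x = np.array([[0.1], [0.2], [0.3], [0.6], [0.8], [1.0]])  # laser power
feasible = np.array([False, False, False, True, True, True])
width = np.array([0.0, 0.0, 0.0, 0.30, 0.40, 0.50])
```

Queries in the no-cut region return no width at all instead of an interpolated artifact, which is exactly what the dichotomy model contributes.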
To address the third goal, efficiently generating an accurate metamodel with a minimal number of simulation runs, a smart sampling method is currently being developed at the department Nonlinear Dynamics of Laser Processing (NLD) at RWTH Aachen and will be published soon. The method is based on a classification technique with sequential approximation optimization, where training points are iteratively sampled based on defined statistical measures. The results are shown in Fig. 6.6.
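Since the smart sampling method itself is not yet published, the following is only a generic sketch of sequential sampling in the same spirit: new runs are placed where the current samples are sparse or where neighboring responses disagree, e.g. near a cutting limit. The scoring rule is an assumption.

```python
import numpy as np

# Generic sequential-sampling sketch: iteratively place new simulation
# runs where the current data are least informative. The score combines
# distance to existing samples (exploration) with the disagreement of the
# two nearest responses (refinement near steep or discontinuous regions).

def next_sample(x, y, rng, n_candidates=200):
    cand = rng.random((n_candidates, x.shape[1]))
    d = np.sqrt(((cand[:, None, :] - x[None, :, :]) ** 2).sum(-1))
    order = np.argsort(d, axis=1)
    nearest = d[np.arange(n_candidates), order[:, 0]]
    disagreement = np.abs(y[order[:, 0]] - y[order[:, 1]])
    score = nearest * (1.0 + disagreement)   # explore and refine
    return cand[np.argmax(score)]

def simulate(p):                             # stand-in for a solver run
    return float(p[0] > 0.5) * (p[0] + p[1])  # discontinuous at p0 = 0.5

rng = np.random.default_rng(2)
x = rng.random((8, 2))                       # initial space-filling design
y = np.array([simulate(p) for p in x])
for _ in range(20):                          # sequential enrichment
    p_new = next_sample(x, y, rng)
    x = np.vstack([x, p_new])
    y = np.append(y, simulate(p_new))
```

The budget is spent one run at a time, so sampling can stop as soon as the metamodel built from `(x, y)` reaches the desired accuracy.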
3.3 Sheet Metal Drilling
As an example of heuristic approaches, a reduced model for sheet metal drilling has been implemented based on the heuristic concept of an ablation threshold. The calculated hole shapes have been compared with experimental observations. Finally, by exploring the parameter space, the limits of applicability are found and the relation to an earlier model derived from mathematical physics is revealed. Let Θ denote the angle between the local surface normal of the sheet metal surface and the incident direction of the laser beam. The asymptotic hole shape is characterized by a local angle of incidence Θ which approaches its asymptotic value Θ_Th. The reduced model assumes that there exists an ablation threshold, characterized by the threshold fluence F_th, which is material specific and has to be determined in order to apply the model.
One single simulation run is used to estimate the threshold fluence F_th by fitting the width of the drill hole at the bottom. As a consequence, the whole asymptotic shape of the drilled hole can be calculated, as illustrated in Fig. 6.7.
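The fitting step can be sketched under the additional assumption of a Gaussian fluence profile F(x) = F0 exp(−2x²/w²), with material removed wherever the local fluence exceeds F_th; this paraphrases the heuristic, not the full reduced model.

```python
import math

# Hedged sketch of the threshold-fluence heuristic: for a Gaussian beam,
# the asymptotic hole radius at the bottom follows from F(x) = F_th, and
# inverting that relation fits F_th from one measured bottom width.

def hole_radius(f0, f_th, w):
    """Asymptotic ablation radius for a Gaussian beam of 1/e^2 radius w."""
    if f0 <= f_th:
        return 0.0                      # below threshold: no drilling
    return w * math.sqrt(math.log(f0 / f_th) / 2.0)

def fit_threshold(f0, w, measured_radius):
    """Invert the relation to estimate F_th from one measured bottom width."""
    return f0 * math.exp(-2.0 * (measured_radius / w) ** 2)

# illustrative numbers: peak fluence in J/cm^2, lengths in metres
f_th = fit_threshold(f0=10.0, w=50e-6, measured_radius=40e-6)
radius = hole_radius(10.0, f_th, 50e-6)  # recovers the measured 40 µm
```

Once F_th is fixed by this single fit, the same relation predicts the bottom width for any other peak fluence, which is the sense in which one run calibrates the model.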
Finally, classification of sheet metal drilling can be performed by identifying the parameter region where the drill hole achieves its asymptotic shape. It is worth mentioning that for the limiting case of large fluence \( F \gg F_{th} \), the reduced model is well known from the literature (Schulz 1986, 1987) and takes an explicit form for the depth z(x) of the drilled wall, where x is the lateral coordinate with respect to the laser beam axis.
3.4 Ablation of Glass
As an example of classification of a parameter space, we consider laser ablation of glass with ultrashort pulses, a promising solution for cutting thin glass sheets in the display industry. A numerical model describes laser ablation and laser damage in glass based on beam propagation and nonlinear absorption as well as generation of free electrons (Sun 2013). The free-electron density increases until it reaches the critical electron density \( \rho_{\text{crit}} = \omega^{2} m_{e} \varepsilon_{0} / e^{2} = 3.95 \times 10^{21}\,\text{cm}^{-3} \), which yields the ablation threshold.
The material near the ablated-crater wall is modified by the energy released by high-density free electrons. The threshold electron density ρ_damage for laser damage is a material-dependent quantity, which typically has the value ρ_damage = 0.025 ρ_crit and is used as the damage criterion in the model. Classification of the parameter region where damage and ablation take place reveals that the threshold, as shown in Fig. 6.8 below, changes from an intensity threshold for ns-pulses to a fluence threshold for ps-pulses.
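The quoted critical density can be checked numerically from the formula above; the wavelength of 532 nm is an assumption chosen because it approximately reproduces the quoted value.

```python
import math

# Numerical check of rho_crit = omega^2 * m_e * eps_0 / e^2 in SI units,
# converted to cm^-3. The 532 nm wavelength is an assumption.

C = 2.99792458e8          # speed of light, m/s
M_E = 9.1093837015e-31    # electron mass, kg
EPS_0 = 8.8541878128e-12  # vacuum permittivity, F/m
E = 1.602176634e-19       # elementary charge, C

def critical_density_cm3(wavelength_m):
    omega = 2.0 * math.pi * C / wavelength_m  # angular laser frequency
    rho_m3 = omega ** 2 * M_E * EPS_0 / E ** 2
    return rho_m3 * 1e-6                      # m^-3 -> cm^-3

rho_crit = critical_density_cm3(532e-9)  # close to the quoted 3.95e21 cm^-3
rho_damage = 0.025 * rho_crit            # damage threshold density
```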
4 Conclusion and Outlook
This contribution has focused on the application of MetaModelling techniques towards Virtual Production Intelligence. The concept of MetaModelling is applied to laser processing, e.g. sheet metal cutting, sheet metal drilling, glass cutting, and cutting of glass-fiber reinforced plastic. The goal is to convince simulation analysts to use metamodelling techniques in order to generate process maps that support their decision making. The techniques can be applied to almost any economical, ecological, or technical process, where the process itself is described by a reduced model. Such a reduced model is the object of MetaModelling and can be seen as a data-generating black box which operates fast and frugal. Once an initial reduced model is set, data manipulation is used to evaluate and improve the reduced model until the desired model quality is achieved by iteration. Hence, one aim of MetaModelling is to provide a concept and tools which guide and facilitate the design of a reduced model with the desired model quality. Evaluation of the reduced model is carried out by comparison with rare and expensive data from more comprehensive numerical simulation and experimental evidence. Finally, a MetaModel serves as a user-friendly lookup table for the criteria with a large extent of continuous support in parameter space, enabling fast exploration and optimization.
The concept of MetaModelling plays an important role in improving the quality of the process since: (i) it provides a fast prediction tool for new parameter settings, offering mathematical methods to carry out involved tasks like global optimization, sensitivity analysis, parameter reduction, etc.; (ii) it allows fast user-interface exploration where tendencies or patterns are visualized, supporting intuition; (iii) it replaces the discrete data of conventional technology tables or catalogues, which are delivered with almost all manufacturing machines for good operation, by continuous maps.
It turns out that a reduced model or even a MetaModel with the greatest accuracy is not necessarily the “best” MetaModel, since choosing a MetaModel is a decision-making procedure that involves compromises between many criteria (speed, accuracy, visualization, complexity, storage, etc.) of MetaModel quality. In the special application case studied here, the minimum number of sampling points with a linear regression model is already a good choice for giving an optimized working point for sheet metal cutting, if speed, storage and fast visualization are of dominant interest. On the other hand, when dealing with high accuracy goals, especially when detecting physical limits, smart sampling techniques, nonlinear interpolation models and more complex metamodels (e.g. with classification techniques) are suitable.
Further progress will focus on improving the performance of metamodel generation, especially developing the smart sampling algorithm and verifying it on other industrial applications. Additional progress will focus on allowing the creation of metamodels that handle distributed quantities and not only scalar quantities. Last but not least, these metamodels will be interfaced to global sensitivity analysis techniques that help to extract knowledge or rules from data.
References
Asimov D (2011) The grand tour: a tool for viewing multidimensional data. SIAM J Sci Stat Comput (1985) 6(1):128–143
Auerbach T, Beckers M, Buchholz G, Eppelt U, Gloy Y, Fritz P, Al Khawli T, Kratz S, Lose J, Molitor T, Reßmann A, Thombansen U, Veselovac D, Willms K, Gries T, Michaeli W, Hopmann C, Schmitt R, Klocke F (2011) Metamodeling for manufacturing. In: ICIRA 2011, Part II, LNAI 7102, pp 199–209
Belkin M, Niyogi P (2003) Laplacian eigenmaps for dimensionality reduction and data representation, Neural Comput;15(6):1373–1396.
Bellman R (1957) Dynamic programming. Number ISBN 9780691079516. Princeton University Press
Box P, Hunter G (1978). Statistics for Experimenters: An Introduction to Design, Data Analysis, and Model Building. John Wiley and Sons
Buckingham E (1914) On physically similar systems; illustrations of the use of dimensional equations, Physical Review 4, 345–376
Chen W, Jin R, Simpson T (2001) Comparative Studies of Metamodelling Techniques under Multiple Modeling Criteria
Currie N (2005) Conceptual Design: Building a Social Conscience, AIGA, November 1
Eppelt U, Al Khawli T (2014) Metamodeling of Laser Cutting, Presentation and Proceedings paper In: ICNAAM—12th International Conference of Numerical Analysis and Applied Mathematics, September 22–28, 2014, Rodos Palace Hotel, Rhodes, Greece, Preprint Fraunhofer ILT
Ferrari D, Borrotti M (2014), Response improvement in complex experiments by coinformation composite likelihood optimization. Statistics and Computing, Volume 24, Issue 3, pp 351–363
Franke R (1982), Smooth interpolation of scattered data by local thin plate splines, Comp. & Maths. with Appls. 8, 273
Gebhardt S, Al Khawli T, Hentschel B, Kuhlen T, Schulz W. (2013) Hyperslice Visualization of Metamodels for Manufacturing Processes, IEEE Visualization Conference (VIS): Atlanta, GA, USA, 13 Oct—18 Oct 2013
Gerber S, Bremer T, Pascucci V, Whitaker R (2010) Visual Exploration of High Dimensional Scalar Functions, IEEE Trans Vis Comput Graph. 16(6): 1271–1280
Hardy R (1990) Theory and applications of the multiquadric-biharmonic method: 20 years of discovery 1968–1988. Comput Math Appl 19(8–9):163–208
Hardy R (1971) Multiquadric equations of topography and other irregular surfaces. Journal of Geophysical Research, 76(8):1905–1915
Haykin S (2009) Neural networks and learning machines, 3rd edn. Prentice Hall
Jones D, Schonlau M, Welch W, (1998) Efficient Global Optimization of Expensive BlackBox Functions. Journal of Global Optimization Vol. 13, 455–492
Jurecka F. (2007) Robust Design Optimization Based on Metamodeling Techniques, Shaker Verlag
Kleijnen J P, Sanchez S M, Lucas T W, Cioppa T M (2005): State of the art review: A user’s guide to the brave new world of designing simulation experiments. INFORMS Journal on Computing 17(3), 263–289
Lophaven S, Nielsen H B, Søndergaard J (2002) Dace a MATLAB Kriging Toolbox Version 2.0
Micchelli C A (1986) Interpolation of scattered data: distance matrices and conditionally positive definite functions, Constr. Approx.2, 11
McLeod G (1998). Linking Business Object Analysis to a Model View Controller Based Design Architecture, Proceedings of the Third CAiSE/IFIP 8.1 International Workshop on Evaluation of Modeling Methods in Systems Analysis and Design EMMSAD’98, Pisa, Italy.
Mongillo M (2010) Choosing basis functions and shape parameters for radial basis function methods, Comput. Math. Appl. 24, pp. 99–120
Montgomery D C, Peck E,Vining G (2012). Introduction to linear regression analysis (Vol. 821). Wiley.
Orr M (1996): Introduction to radial basis function networks
Reinhard R, Al Khawli T, Eppelt U, Meisen T, Schilberg D, Schulz W, Jeschke S (2013) How Virtual Production Intelligence Can Improve LaserCutting Planning Processes, In: ICPR 22—Systems Modeling and Simulation (p. 122)
Reinhard R, Al Khawli T, Eppelt U, Meisen T, Schilberg D, Schulz W, Jeschke S (2014) The Contribution of Virtual Production Intelligence to Laser Cutting Planning Processes, In: Enabling Manufacturing Competitiveness and Economic Sustainability (pp. 117–123). Springer International Publishing. 2014
Rippa S (1999): An algorithm for selecting a good value for the parameter c in radial basis function interpolation, Adv Comput Math 11(2–3), 193–210
Sacks J, Welch W J, Mitchell T J, Wynn H (1989) Design and Analysis of Computer Experiments, Statistical Science, Vol. 4, No. 4, pp. 409–423
Schulz W, Simon G, Vicanek M (1986) Ablation of opaque surfaces due to laser radiation, J.Phys.D: Appl. Phys. 19 173–177
Schulz W, Simon G, Urbassek H, Decker I (1987) On laser fusion cutting of metals, J.Phys.D:Appl.Phys. 20 481–488
Schwarz G A (1978) Estimating the dimension of a model. Ann Statist 6(2):461–464
Sun M, Eppelt U, Russ S, Hartmann C, Siebert C, Zhu J, Schulz W (2013) Numerical analysis of laser ablation and damage in glass with multiple picosecond laser pulses. Optics express, 21(7), pp. 7858–7867
Acknowledgments
The investigations are partly supported by the German Research Foundation (DFG) within the Cluster of Excellence “Integrative Production Technology for High-Wage Countries” at RWTH Aachen University as well as by the European Commission within the EU FP7 framework (project HALO, see http://www.haloproject.eu/).
Rights and permissions
Open Access This chapter is distributed under the terms of the Creative Commons Attribution Noncommercial License, which permits any noncommercial use, distribution, and reproduction in any medium, provided the original author(s) and source are credited.
Copyright information
© 2015 The Author(s)
Schulz, W., Khawli, T.A. (2015). MetaModelling Techniques Towards Virtual Production Intelligence. In: Brecher, C. (eds) Advances in Production Technology. Lecture Notes in Production Engineering. Springer, Cham. https://doi.org/10.1007/978-3-319-12304-2_6
Print ISBN: 978-3-319-12303-5
Online ISBN: 978-3-319-12304-2