Abstract
Because of their long computation times, complex simulation models can be used only to a limited extent for the direct analysis of technical systems. For this reason, so-called metamodels (also known as transfer functions or as surrogate, approximation, or replacement models) are used instead; they approximate the complex simulation model with substantially shorter computation times and sufficiently accurate results. The computation times of metamodels lie in the range of milliseconds, whereas the original models can require hours or even days to compute a single result.
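The idea behind a metamodel can be illustrated with a minimal sketch (Python with NumPy). Here, `expensive_simulation` is a hypothetical stand-in for a long-running solver, and the Gaussian kernel width is chosen by hand; both are illustrative assumptions, not taken from the chapter. A radial-basis-function surrogate is fitted to a handful of expensive evaluations and can then be queried in microseconds:

```python
import numpy as np

def expensive_simulation(x):
    # Stand-in for a solver that would normally run for hours per call.
    return np.sin(3 * x) + 0.5 * x

# A few "expensive" training evaluations of the true model.
X = np.linspace(0.0, 2.0, 8)
y = expensive_simulation(X)

# Gaussian RBF interpolation, a simple Kriging-like surrogate.
width = 0.5  # kernel width, chosen by hand for this sketch
K = np.exp(-((X[:, None] - X[None, :]) ** 2) / (2 * width**2))
weights = np.linalg.solve(K, y)

def metamodel(x):
    # Cheap prediction: one kernel evaluation per training point.
    k = np.exp(-((x - X) ** 2) / (2 * width**2))
    return k @ weights
```

The surrogate reproduces the training points exactly and interpolates smoothly between them, so subsequent analyses (optimization, sensitivity studies) can run against `metamodel` instead of the expensive solver.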
© 2010 Springer-Verlag Berlin Heidelberg
Siebertz, K., van Bebber, D., Hochkirchen, T. (2010). Metamodelle. In: Statistische Versuchsplanung. VDI-Buch(). Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-05493-8_8
Print ISBN: 978-3-642-05492-1
Online ISBN: 978-3-642-05493-8