
Metamodelle


Part of the book series: VDI-Buch

Abstract

The direct use of complex simulation models for analysing technical systems is limited by their long computation times. For this reason, so-called metamodels (also known as transfer functions, surrogate, approximation, or replacement models) are used instead; they approximate the complex simulation model with considerably shorter computation times and sufficiently accurate results. The computation times of metamodels lie in the range of milliseconds, whereas the original models may require hours or days to compute a single result.
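The basic workflow described above — sample an expensive simulation at a few design points, then fit a cheap approximation to those samples — can be sketched in a few lines of Python. This is an illustrative toy example, not the chapter's method: the "simulation" is a stand-in analytic function, and a simple polynomial least-squares fit plays the role of the metamodel (the chapter's references cover richer choices such as Kriging, radial basis functions, and MARS).

```python
import numpy as np

def simulate(x):
    """Stand-in for an expensive simulation model.
    In practice each call might take hours; here it is a cheap
    analytic function so the example runs instantly."""
    return np.sin(3.0 * x) + 0.5 * x**2

# Evaluate the "simulation" at a small number of design points,
# as one would in a designed computer experiment.
x_train = np.linspace(0.0, 2.0, 8)
y_train = simulate(x_train)

# Fit a polynomial metamodel (degree 5, least squares) to the samples.
coeffs = np.polyfit(x_train, y_train, deg=5)
metamodel = np.poly1d(coeffs)

# The metamodel now evaluates in microseconds and approximates the
# simulation between the design points.
x_test = np.linspace(0.0, 2.0, 100)
max_err = np.max(np.abs(metamodel(x_test) - simulate(x_test)))
print(f"max |metamodel - simulation| on test grid: {max_err:.3f}")
```

Once fitted, `metamodel(x)` can be called thousands of times (e.g. inside an optimisation or robustness study) at negligible cost, which is exactly the trade-off the abstract describes.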



Author information

Correspondence to Karl Siebertz.


Copyright information

© 2010 Springer-Verlag Berlin Heidelberg

Cite this chapter

Siebertz, K.; van Bebber, D.; Hochkirchen, T. (2010). Metamodelle. In: Statistische Versuchsplanung. VDI-Buch. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-05493-8_8

