
Adaptive Design of Experiments for Sobol Indices Estimation Based on Quadratic Metamodel

  • Evgeny Burnaev
  • Ivan Panin
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9047)

Abstract

Sensitivity analysis aims to identify which input parameters of a given mathematical model are the most important. One of the best-known sensitivity measures is the Sobol sensitivity index. There are a number of approaches to the estimation of Sobol indices; in general, they fall into two groups: Monte Carlo methods and methods based on metamodeling. Monte Carlo methods have a well-established mathematical apparatus and well-studied statistical properties, but they require a large number of model runs. Methods based on metamodeling reduce the required number of model runs but may be harder to analyze. In this work we focus on the metamodeling approach to Sobol indices estimation and, in particular, on its initial step: the design of experiments. Based on the concept of D-optimality, we propose a method for constructing an adaptive experimental design that is effective for the calculation of Sobol indices from a quadratic metamodel. We compare the proposed design of experiments with other methods.
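To make the pipeline described in the abstract concrete, the sketch below combines its two ingredients: a design of experiments chosen for D-optimality of the quadratic regression matrix, and Sobol first-order indices estimated from the fitted quadratic metamodel. This is an illustrative sketch only, not the authors' adaptive algorithm: the test function, the candidate-pool greedy heuristic, and all names are hypothetical, and a plain pick-and-freeze Monte Carlo estimator is run on the cheap surrogate in place of the expensive model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "expensive" model (illustration only, not from the paper).
def model(x):
    return x[:, 0] + 2.0 * x[:, 1] + 0.5 * x[:, 0] * x[:, 1]

# Full quadratic basis in d = 2 dimensions: 1, x1, x2, x1^2, x2^2, x1*x2.
def quad_features(x):
    x1, x2 = x[:, 0], x[:, 1]
    return np.column_stack([np.ones(len(x)), x1, x2, x1**2, x2**2, x1 * x2])

# Step 1: greedy D-optimality heuristic -- from a random candidate pool,
# repeatedly add the point that maximizes det(F^T F) of the quadratic
# regression matrix (a crude stand-in for the paper's adaptive construction;
# the tiny ridge term keeps the determinant finite before full rank).
pool = rng.uniform(size=(300, 2))
selected = []
for _ in range(15):
    F_sel = quad_features(pool[selected])
    gains = []
    for j in range(len(pool)):
        F = np.vstack([F_sel, quad_features(pool[j:j + 1])])
        _, logdet = np.linalg.slogdet(F.T @ F + 1e-9 * np.eye(6))
        gains.append(logdet)
    selected.append(int(np.argmax(gains)))
X_train = pool[selected]

# Step 2: fit the quadratic metamodel by least squares on the selected design.
coef, *_ = np.linalg.lstsq(quad_features(X_train), model(X_train), rcond=None)
surrogate = lambda x: quad_features(x) @ coef

# Step 3: pick-and-freeze (Saltelli-type) Monte Carlo estimator of the
# first-order Sobol indices, evaluated on the cheap surrogate.
N = 200_000
A = rng.uniform(size=(N, 2))
B = rng.uniform(size=(N, 2))
yA, yB = surrogate(A), surrogate(B)
var_y = yA.var()

S = []
for i in range(2):
    AB = A.copy()
    AB[:, i] = B[:, i]            # take x_i from B, all other inputs from A
    S.append(float(np.mean(yB * (surrogate(AB) - yA)) / var_y))

print(S)
```

For this particular test function the indices can be checked analytically (S1 ≈ 0.235, S2 ≈ 0.762, with the remainder due to the small x1*x2 interaction), so the surrogate-based estimates should land close to those values.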

Keywords

Active learning · Global sensitivity analysis · Sobol indices · Adaptive design of experiments · D-optimality



Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  1. Moscow Institute of Physics and Technology, Moscow, Russia
  2. Datadvance llc., Moscow, Russia
  3. Kharkevich Institute for Information Transmission Problems, Moscow, Russia
