Bayesian Active Learning for Sensitivity Analysis

  • Tobias Pfingsten
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4212)


Designs of micro-electro-mechanical devices need to be robust against fluctuations in mass production. Computer experiments with tens of parameters are used to explore the behavior of the system and to compute sensitivity measures as expectations over the input distribution. Monte Carlo methods are a simple approach to estimating these integrals, but they are infeasible when the models are computationally expensive. Using a Gaussian process prior, expensive simulation runs can be saved. This Bayesian quadrature allows for an active selection of inputs where a simulation run promises to be most valuable, and the number of simulation runs can be reduced further.

We present an active learning scheme for sensitivity analysis which is rigorously derived from the corresponding Bayesian expected loss. On three fully featured, high-dimensional physical models of electro-mechanical sensors, we show that the learning rate of the active scheme is significantly better than that of passive learning.
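The idea above can be illustrated in a short sketch. Note that everything here is an assumption for illustration: the toy one-dimensional `simulator` stands in for the expensive MEMS models, and the acquisition rule (querying where the Gaussian process predictive variance is largest) is a common, simpler stand-in for the criterion the paper derives from the Bayesian expected loss. The estimate at the end averages the GP posterior mean over samples from the input distribution instead of averaging raw simulator outputs, in the spirit of Bayesian quadrature.

```python
import numpy as np

# Hedged sketch: a cheap toy "simulator" replaces the expensive model,
# and maximum predictive variance replaces the paper's loss-derived
# acquisition criterion.

def simulator(x):
    """Toy 1-D model; the paper's models have tens of parameters."""
    return np.sin(3.0 * x) + 0.5 * x ** 2

def sq_exp_kernel(a, b, ell=0.5, sf=1.0):
    """Squared-exponential covariance between two 1-D input sets."""
    d = a[:, None] - b[None, :]
    return sf ** 2 * np.exp(-0.5 * (d / ell) ** 2)

def gp_posterior(X, y, Xs, noise=1e-5):
    """Posterior mean and variance of a zero-mean GP at test inputs Xs."""
    K = sq_exp_kernel(X, X) + noise * np.eye(len(X))
    L = np.linalg.cholesky(K)
    Ks = sq_exp_kernel(X, Xs)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    v = np.linalg.solve(L, Ks)
    mean = Ks.T @ alpha
    var = np.diag(sq_exp_kernel(Xs, Xs)) - np.sum(v ** 2, axis=0)
    return mean, var

rng = np.random.default_rng(0)
candidates = rng.normal(size=1000)   # samples from the input distribution
X = np.array([-1.0, 1.0])            # small initial design
y = simulator(X)

for _ in range(15):                  # active-learning loop
    _, var = gp_posterior(X, y, candidates)
    x_next = candidates[np.argmax(var)]   # run the simulator where the
    X = np.append(X, x_next)              # GP is most uncertain
    y = np.append(y, simulator(x_next))

# Bayesian-quadrature-style estimate of E[f(x)] under the input
# distribution: average the GP mean over the candidate samples.
mean, _ = gp_posterior(X, y, candidates)
estimate = mean.mean()
print(estimate)
```

Because the active rule sends simulation runs to the regions the surrogate knows least about, far fewer runs are needed than for plain Monte Carlo averaging of simulator outputs; the paper's contribution is to derive the selection criterion rigorously rather than using the heuristic variance rule shown here.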


Keywords: Monte Carlo · Gaussian Process · Generalization Error · Input Distribution · Gaussian Process Regression



Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Tobias Pfingsten 1, 2
  1. Robert Bosch GmbH, Stuttgart, Germany
  2. Max Planck Institute for Biological Cybernetics
