Bayesian Active Learning for Sensitivity Analysis

  • Tobias Pfingsten
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4212)

Abstract

Designs of micro-electro-mechanical devices need to be robust against fluctuations in mass production. Computer experiments with tens of parameters are used to explore the behavior of the system and to compute sensitivity measures as expectations over the input distribution. Monte Carlo methods are a simple approach to estimating these integrals, but they are infeasible when the models are computationally expensive. Using a Gaussian process prior, expensive simulation runs can be saved. This Bayesian quadrature allows for active selection of inputs where a simulation run promises to be most valuable, so that the number of simulation runs can be reduced further.
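The plain Monte Carlo baseline described above can be sketched in a few lines. This is an illustration only: the `simulator` function and the Gaussian input distribution are placeholder assumptions, standing in for the expensive device simulations and the manufacturing-fluctuation distribution used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulator(x):
    # Hypothetical stand-in for an expensive computer experiment;
    # the real models are high-dimensional device simulations.
    return np.sin(x[:, 0]) + 0.5 * x[:, 1] ** 2

# Input fluctuations modelled here as independent Gaussians
# (an assumption for illustration).
n_samples = 10_000
X = rng.normal(loc=0.0, scale=0.1, size=(n_samples, 2))
y = simulator(X)

# Monte Carlo estimates of expectations over the input distribution:
# the output mean and variance, the basic sensitivity quantities.
mean_est = y.mean()
var_est = y.var(ddof=1)
```

Each estimate requires one simulator call per sample, which is exactly what becomes infeasible when a single run is expensive.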

We present an active learning scheme for sensitivity analysis which is rigorously derived from the corresponding Bayesian expected loss. On three fully featured, high-dimensional physical models of electro-mechanical sensors, we show that the learning rate of the active scheme is significantly better than that of passive learning.
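A minimal sketch of active input selection with a Gaussian process surrogate is shown below. It greedily picks the candidate input with the largest posterior variance, a common surrogate for an expected-loss criterion; the paper's actual selection rule is derived from the Bayesian expected loss for the sensitivity measures, so the kernel, criterion, and candidate pool here are simplifying assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def rbf_kernel(A, B, lengthscale=0.3):
    # Squared-exponential covariance between the rows of A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale ** 2)

def predictive_variance(X_train, X_cand, noise=1e-6):
    # GP posterior variance at candidate inputs; for a fixed kernel it
    # depends only on input locations, not on observed outputs.
    K = rbf_kernel(X_train, X_train) + noise * np.eye(len(X_train))
    Ks = rbf_kernel(X_cand, X_train)
    prior_var = np.ones(len(X_cand))  # k(x, x) = 1 for this kernel
    return prior_var - np.einsum("ij,ji->i", Ks, np.linalg.solve(K, Ks.T))

# Greedy active design: repeatedly simulate where the surrogate is
# most uncertain, instead of sampling inputs passively at random.
X_train = rng.uniform(size=(5, 2))
candidates = rng.uniform(size=(200, 2))
for _ in range(10):
    var = predictive_variance(X_train, candidates)
    best = candidates[np.argmax(var)]
    X_train = np.vstack([X_train, best])
```

Because the posterior variance is independent of the outputs, the next design point can be chosen before the expensive simulation is run, which is what makes such schemes practical for costly models.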

Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Tobias Pfingsten
  1. Robert Bosch GmbH, Stuttgart, Germany
  2. Max Planck Institute for Biological Cybernetics
