
Statistics and Computing, Volume 27, Issue 1, pp 219–236

Point process-based Monte Carlo estimation

  • Clément Walter
Article

Abstract

This paper addresses the problem of estimating the expectation of a real-valued random variable of the form \(X = g(\mathbf {U})\), where g is a deterministic function and \(\mathbf {U}\) can be a random finite- or infinite-dimensional vector. Building on recent results in rare event simulation, we propose a unified framework for both probability and mean estimation of such random variables, linking algorithms such as the Tootsie Pop Algorithm and the Last Particle Algorithm with nested sampling. In particular, it extends nested sampling as follows: first, the random variable X no longer needs to be bounded. This yields the principle of an ideal estimator with an infinite number of terms that is unbiased and always better than a classical Monte Carlo estimator; in particular, it has a finite variance as soon as there exists \(k > 1\) such that \({\text {E}}\left[ X^k \right] < \infty \). Moreover, we address the issue of nested sampling termination and show that a random truncation of the sum can preserve unbiasedness while increasing the variance by a factor of at most 2 compared to the ideal case. We also build an unbiased estimator with a fixed computational budget that supports a Central Limit Theorem, and we discuss a parallel implementation of nested sampling, which can dramatically reduce its running time. Finally, we extensively study the case where X is heavy-tailed.
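The Last Particle Algorithm mentioned in the abstract can be illustrated as a point-process estimator of a tail probability \(p = P(X > q)\): N particles are maintained, and at each step the lowest one is killed and resampled conditionally on exceeding its own level. The sketch below is illustrative only and not the paper's implementation; it assumes an exact conditional sampler (available here through the memorylessness of an exponential toy case), whereas in practice this step is performed by an MCMC kernel, and the function names are hypothetical.

```python
import random

def last_particle_estimate(sample, resample_above, q, n=100):
    """Sketch of the Last Particle Algorithm: estimate p = P(X > q).

    `sample()` draws one realisation of X; `resample_above(m)` draws X
    conditionally on X > m (assumed exact here; in practice this step
    is an MCMC kernel and the estimator becomes approximate).
    """
    particles = [sample() for _ in range(n)]
    m_events = 0                      # number of kill-and-resample moves
    while min(particles) <= q:
        i = min(range(n), key=particles.__getitem__)  # index of the minimum
        particles[i] = resample_above(particles[i])   # resample above its level
        m_events += 1
    # With an exact conditional sampler, m_events ~ Poisson(-n log p),
    # so (1 - 1/n)^m_events is an unbiased estimator of p.
    return (1.0 - 1.0 / n) ** m_events

# Toy check with X ~ Exp(1): memorylessness gives an exact conditional sampler.
rng = random.Random(0)
q = 5.0                               # true value: p = exp(-5), about 6.7e-3
est = last_particle_estimate(
    sample=lambda: rng.expovariate(1.0),
    resample_above=lambda m: m + rng.expovariate(1.0),
    q=q,
    n=200,
)
```

The same point-process view underlies the mean estimation developed in the paper, where the levels visited by the process are accumulated into a sum approximating \({\text {E}}\left[ X \right]\) rather than a single tail probability.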

Keywords

Nested sampling · Central limit theorem · Heavy tails · Rare event simulation · Last particle algorithm

Notes

Acknowledgments

The author would like to thank his advisors Josselin Garnier (University Paris Diderot) and Gilles Defaux (Commissariat à l’Energie Atomique et aux Energies Alternatives) for their advice and suggestions, as well as the reviewers for their very relevant comments, which helped improve the manuscript. This work was partially supported by ANR project Chorus.


Copyright information

© Springer Science+Business Media New York 2015

Authors and Affiliations

  1. CEA, DAM, DIF, Arpajon, France
  2. Laboratoire de Probabilités et Modèles Aléatoires, Université Paris Diderot, Paris, France
