Statistics and Computing, Volume 28, Issue 3, pp 633–652

Without-replacement sampling for particle methods on finite state spaces

  • Rohan Shah
  • Dirk P. Kroese


Abstract

Combinatorial estimation is a new area of application for sequential Monte Carlo methods. We use ideas from sampling theory to introduce new without-replacement sampling methods in such discrete settings. These without-replacement sampling methods allow the addition of merging steps, which can significantly improve the resulting estimators. We give examples showing the use of the proposed methods in combinatorial rare-event probability estimation and in discrete state-space models.
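To make the two ideas in the abstract concrete, the following is a minimal Python sketch (function names and details are our own, not the authors' exact algorithm): a merging step that combines particles occupying the same discrete state by summing their weights, and a without-replacement selection step based on Pareto order sampling, one classical design with inclusion probabilities approximately proportional to the weights. The capped probabilities `lam` are only an approximation to the exact inclusion probabilities, since mass from capped units is not redistributed here.

```python
import numpy as np

def merge(states, weights):
    # Merge particles that occupy the same discrete state by summing weights.
    merged = {}
    for s, w in zip(states, weights):
        merged[s] = merged.get(s, 0.0) + w
    return list(merged.keys()), np.array(list(merged.values()))

def pareto_sample(weights, n, rng):
    # Pareto order sampling: select n distinct particles with inclusion
    # probabilities approximately proportional to their weights (capped at 1).
    p = np.asarray(weights, dtype=float)
    lam = np.minimum(n * p / p.sum(), 1.0)   # approximate inclusion probabilities
    u = rng.uniform(size=len(p))
    with np.errstate(divide="ignore"):
        # Units with lam == 1 get ranking variable 0 and are always selected.
        q = (u / (1 - u)) / (lam / (1 - lam))
    idx = np.argsort(q)[:n]                  # n distinct indices: no replacement
    return idx, lam[idx]                     # reweight survivors by w / lam
```

Because every selected index is distinct, duplicates can only arise from the dynamics of the model itself, and the merging step removes them before the next propagation; this is what distinguishes the scheme from multinomial resampling with replacement.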


Keywords: Sequential Monte Carlo · Sampling theory · Rare-event simulation · Network reliability



Acknowledgements

This work was supported by the Australian Research Council Centre of Excellence for Mathematical & Statistical Frontiers, under grant number CE140100049. The authors would like to thank the reviewers for their valuable comments, which improved the quality of this paper.



Copyright information

© Springer Science+Business Media New York 2017

Authors and Affiliations

  1. School of Mathematics and Physics, The University of Queensland, Brisbane, Australia
