Statistics and Computing, Volume 23, Issue 2, pp 163–184

Sequential Monte Carlo on large binary sampling spaces

  • Christian Schäfer
  • Nicolas Chopin


Abstract

A Monte Carlo algorithm is said to be adaptive if it automatically calibrates its current proposal distribution using past simulations. The choice of the parametric family that defines the set of proposal distributions is critical for good performance. In this paper, we present such a parametric family for adaptive sampling on high dimensional binary spaces.
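As an illustration of the calibration step, the sketch below fits the simplest possible proposal family, a product of independent Bernoulli distributions, from weighted past simulations. The helper `fit_independent_proposal` and the toy data are hypothetical, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_independent_proposal(samples, weights):
    """Hypothetical calibration step: estimate the mean vector of an
    independent-Bernoulli proposal from weighted past simulations."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    p = w @ samples                 # weighted marginal inclusion frequencies
    return np.clip(p, 0.01, 0.99)  # keep probabilities away from 0 and 1

# toy past simulations: 100 binary vectors of dimension 5, with weights
X = rng.integers(0, 2, size=(100, 5))
w = rng.random(100)
p = fit_independent_proposal(X, w)
```

Clipping the estimated probabilities away from 0 and 1 is a common safeguard: a proposal component with probability exactly 0 or 1 can never be corrected by later adaptation.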

A practical motivation for this problem is variable selection in a linear regression context. We want to sample from a Bayesian posterior distribution on the model space using an appropriate version of Sequential Monte Carlo.
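To make the target concrete: in variable selection, each model is a binary vector indicating which covariates enter the regression, and the posterior lives on all 2^d such vectors. The sketch below scores every model of a tiny toy problem by brute-force enumeration, using a BIC approximation to the log marginal likelihood; the scoring function and data are illustrative assumptions, not the paper's exact posterior:

```python
import itertools
import numpy as np

rng = np.random.default_rng(1)

def log_bic_score(X, y, gamma):
    """Hypothetical scoring rule: BIC approximation to the log marginal
    likelihood of the linear model using the columns where gamma == 1."""
    n = len(y)
    idx = np.flatnonzero(gamma)
    Z = np.column_stack([np.ones(n), X[:, idx]]) if idx.size else np.ones((n, 1))
    beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
    rss = np.sum((y - Z @ beta) ** 2)
    k = Z.shape[1]
    return -0.5 * n * np.log(rss / n) - 0.5 * k * np.log(n)

# toy regression: covariates 0 and 2 carry the signal
n, d = 50, 4
X = rng.standard_normal((n, d))
y = X[:, 0] - 2 * X[:, 2] + 0.5 * rng.standard_normal(n)

# exact posterior over all 2^d models by enumeration
models = list(itertools.product([0, 1], repeat=d))
scores = np.array([log_bic_score(X, y, np.array(g)) for g in models])
post = np.exp(scores - scores.max())
post /= post.sum()
best = models[int(post.argmax())]
```

With d = 4 there are only 16 models, so enumeration is trivial; with the hundred or so covariates considered in the paper, the model space has around 2^100 elements, which is exactly why sampling methods such as Sequential Monte Carlo are needed.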

Raw versions of Sequential Monte Carlo are easily implemented using binary vectors with independent components. For high dimensional problems, however, these simple proposals do not yield satisfactory results. The key to an efficient adaptive algorithm is a binary parametric family that takes correlations into account, analogous to the multivariate normal distribution on continuous spaces.
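One way to build such a correlated binary family, along the lines of the logistic-conditionals construction reviewed in the paper, is via the chain rule: each component is Bernoulli with a logit that is linear in the preceding components. The parameterisation below (a lower-triangular matrix whose diagonal entries act as intercepts) is a simplified sketch, not the paper's exact specification:

```python
import numpy as np

rng = np.random.default_rng(2)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sample_logistic_conditionals(A, n_samples, rng):
    """Sample binary vectors component by component: the logit of
    component i is A[i, i] plus a linear function A[i, :i] of the
    components already drawn, so off-diagonal entries of A induce
    dependence between components."""
    d = A.shape[0]
    out = np.empty((n_samples, d), dtype=int)
    for s in range(n_samples):
        g = np.zeros(d)
        for i in range(d):
            logit = A[i, i] + A[i, :i] @ g[:i]
            g[i] = rng.random() < sigmoid(logit)
        out[s] = g
    return out

# strong positive coupling: component 1 tends to copy component 0
A = np.zeros((3, 3))
A[1, 0] = 4.0
X = sample_logistic_conditionals(A, 2000, rng)
corr = np.corrcoef(X[:, 0], X[:, 1])[0, 1]
```

Setting A[1, 0] = 4.0 makes component 1 almost certain to be 1 whenever component 0 is 1, producing a clearly positive sample correlation; an independent-Bernoulli proposal (A diagonal) cannot represent this dependence at all.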

We provide a review of models for binary data and make one of them work in the context of Sequential Monte Carlo sampling. Computational studies on real-life data with about a hundred covariates suggest that, on difficult instances, our Sequential Monte Carlo approach clearly outperforms standard techniques based on Markov chain exploration.


Keywords: Adaptive Monte Carlo · Multivariate binary data · Sequential Monte Carlo · Linear regression · Variable selection





Copyright information

© Springer Science+Business Media, LLC 2011

Authors and Affiliations

  1. Centre de Recherche en Économie et Statistique, Malakoff, France
  2. CEntre de REcherches en MAthématiques de la DEcision, Université Paris-Dauphine, Paris, France
  3. École Nationale de la Statistique et de l'Administration, Malakoff, France
