Statistics and Computing, Volume 25, Issue 1, pp 23–33

Pre-processing for approximate Bayesian computation in image analysis

  • Matthew T. Moores
  • Christopher C. Drovandi
  • Kerrie Mengersen
  • Christian P. Robert


Abstract

Most existing algorithms for approximate Bayesian computation (ABC) assume that it is feasible to simulate pseudo-data from the model at each iteration. However, the computational cost of these simulations can be prohibitive for high-dimensional data. An important example is the Potts model, which is commonly used in image analysis. Images encountered in real-world applications can have millions of pixels, so scalability is a major concern. We apply ABC with a synthetic likelihood to the hidden Potts model with additive Gaussian noise. In a pre-processing step, we fit a binding function that models the relationship between the model parameters and the parameters of the synthetic likelihood. Our numerical experiments demonstrate that the precomputed binding function dramatically improves the scalability of ABC, reducing the average runtime required for model fitting from 71 h to only 7 min. We also illustrate the method by estimating the smoothing parameter for remotely sensed satellite imagery. Without precomputation, Bayesian inference is impractical for datasets of that scale.
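The pre-computation idea described above can be sketched in a few lines. This is a minimal illustration under simplifying assumptions, not the paper's implementation: it uses a q=2 Potts (Ising) model on a tiny lattice, a plain single-site Gibbs sampler, a cubic polynomial as the binding function, and a grid-based point estimate in place of the sequential Monte Carlo sampler; the function names (`gibbs_ising`, `synthetic_loglik`) and all tuning constants are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

def gibbs_ising(beta, n=12, sweeps=30):
    """Simulate a q=2 Potts (Ising) field on an n x n lattice by single-site
    Gibbs sampling, and return the summary statistic S(z): the number of
    like-coloured nearest-neighbour pairs (free boundary conditions)."""
    z = rng.integers(0, 2, size=(n, n))
    for _ in range(sweeps):
        for i in range(n):
            for j in range(n):
                c = [0, 0]  # count of neighbours taking each colour
                for a, b in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)):
                    if 0 <= a < n and 0 <= b < n:
                        c[z[a, b]] += 1
                p0, p1 = np.exp(beta * c[0]), np.exp(beta * c[1])
                z[i, j] = int(rng.random() < p1 / (p0 + p1))
    return int(np.sum(z[:, :-1] == z[:, 1:]) + np.sum(z[:-1, :] == z[1:, :]))

# Pre-processing step: simulate pseudo-data over a grid of beta values once,
# before any inference is run.
betas = np.linspace(0.0, 1.0, 6)
mu, sd = [], []
for beta in betas:
    stats = [gibbs_ising(beta) for _ in range(4)]
    mu.append(np.mean(stats))
    sd.append(max(np.std(stats, ddof=1), 1e-3))  # guard against zero spread

# Binding function: a smooth map from beta to the synthetic-likelihood
# parameters (mean and log standard deviation of the summary statistic).
mu_fit = np.polynomial.Polynomial.fit(betas, mu, 3)
sd_fit = np.polynomial.Polynomial.fit(betas, np.log(sd), 3)

def synthetic_loglik(beta, s_obs):
    """Gaussian synthetic log-likelihood evaluated from the precomputed
    binding function alone -- no fresh pseudo-data at inference time."""
    m, s = mu_fit(beta), np.exp(sd_fit(beta))
    return -0.5 * ((s_obs - m) / s) ** 2 - np.log(s)

# Toy inference: a grid-based point estimate for data simulated at beta = 0.5.
s_obs = gibbs_ising(0.5)
grid = np.linspace(0.0, 1.0, 101)
beta_hat = grid[np.argmax([synthetic_loglik(b, s_obs) for b in grid])]
print(f"S_obs = {s_obs}, point estimate of beta = {beta_hat:.2f}")
```

The point of the design is visible in `synthetic_loglik`: once the binding function is fitted, evaluating the likelihood at a proposed parameter value costs two polynomial evaluations rather than a full simulation of the random field, which is what makes the reported speed-up possible.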


Keywords: Approximate Bayesian computation · Hidden Markov random field · Indirect inference · Potts/Ising model · Quasi-likelihood · Sequential Monte Carlo



Copyright information

© Springer Science+Business Media New York 2014

Authors and Affiliations

  • Matthew T. Moores (1, 2)
  • Christopher C. Drovandi (1)
  • Kerrie Mengersen (1)
  • Christian P. Robert (2, 3)

  1. Mathematical Sciences School, Queensland University of Technology, Brisbane, Australia
  2. Department of Statistics, University of Warwick, Coventry, UK
  3. CEREMADE, Université Paris Dauphine and CREST, INSEE, Paris, France
