Statistics and Computing, Volume 27, Issue 2, pp 403–422

Bayesian model comparison with un-normalised likelihoods

  • Richard G. Everitt
  • Adam M. Johansen
  • Ellen Rowing
  • Melina Evdemon-Hogan


Abstract

Models for which the likelihood function can be evaluated only up to a parameter-dependent unknown normalizing constant, such as Markov random field models, are used widely in computer science, statistical physics, spatial statistics, and network analysis. However, Bayesian analysis of these models using standard Monte Carlo methods is not possible due to the intractability of their likelihood functions. Several methods that permit exact, or close to exact, simulation from the posterior distribution have recently been developed, but estimating the evidence and Bayes' factors for these models remains challenging in general. This paper describes new random-weight importance sampling and sequential Monte Carlo methods for estimating Bayes' factors that use simulation to circumvent the evaluation of the intractable likelihood, and compares them to existing methods. An initial investigation into the theoretical and empirical properties of this class of methods is presented. In some cases we observe an advantage in the use of biased weight estimates, and some support for such estimates is presented, but we advocate caution in their use.
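The random-weight importance sampling idea mentioned in the abstract can be illustrated on a toy conjugate model. This sketch is not the paper's method: the model, the mean-one log-normal noise standing in for a simulation-based likelihood estimate, and all variable names are illustrative assumptions. The key point it demonstrates is that replacing the exact likelihood in an importance sampling weight by any unbiased estimate of it leaves the evidence estimator unbiased.

```python
import math
import random

random.seed(1)


def normal_pdf(x, mu, var):
    # density of N(mu, var) evaluated at x
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)


y = 0.5       # single observation, y | theta ~ N(theta, 1), prior theta ~ N(0, 1)
N = 200000    # number of importance samples


def noisy_likelihood(theta, s=0.3):
    # "random weight": the exact likelihood times a mean-one log-normal
    # perturbation, standing in for an unbiased simulation-based estimate
    # of an intractable likelihood (s is an illustrative noise scale)
    eps = random.gauss(0.0, s)
    return normal_pdf(y, theta, 1.0) * math.exp(eps - 0.5 * s * s)


# Importance sampling with the prior as proposal: the weights are the
# (estimated) likelihoods, so their average estimates the evidence.
z_hat = sum(noisy_likelihood(random.gauss(0.0, 1.0)) for _ in range(N)) / N

# For this conjugate toy the evidence is available analytically: N(y; 0, 2).
z_true = normal_pdf(y, 0.0, 2.0)
```

Despite the noise in each weight, `z_hat` converges to the true evidence because the perturbation has expectation one; the noise only inflates the variance of the estimator, which is the trade-off the paper studies.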


Keywords

Approximate Bayesian computation · Bayes' factors · Importance sampling · Marginal likelihood · Markov random field · Partition function · Sequential Monte Carlo



Acknowledgements

The authors would like to thank Nial Friel for useful discussions, and for giving us access to the data and results from Friel (2013).

Supplementary material

References

  1. Alquier, P., Friel, N., Everitt, R.G., Boland, A.: Noisy Monte Carlo: convergence of Markov chains with approximate transition kernels. Stat Comput (2015, in press)
  2. Andrieu, C., Roberts, G.O.: The pseudo-marginal approach for efficient Monte Carlo computations. Ann Stat 37(2), 697–725 (2009)
  3. Andrieu, C., Vihola, M.: Convergence properties of pseudo-marginal Markov chain Monte Carlo algorithms (2012). arXiv:1210.1484
  4. Beaumont, M.A.: Estimation of population growth or decline in genetically monitored populations. Genetics 164(3), 1139–1160 (2003)
  5. Beskos, A., Crisan, D., Jasra, A., Whiteley, N.: Error bounds and normalizing constants for sequential Monte Carlo in high dimensions (2011). arXiv:1112.1544
  6. Caimo, A., Friel, N.: Bayesian inference for exponential random graph models. Soc Netw 33, 41–55 (2011)
  7. Chopin, N.: A sequential particle filter method for static models. Biometrika 89(3), 539–552 (2002)
  8. Chopin, N., Jacob, P.E., Papaspiliopoulos, O.: \(\text{SMC}^2\): an efficient algorithm for sequential analysis of state space models. J R Stat Soc 75(3), 397–426 (2013)
  9. Del Moral, P.: Feynman-Kac Formulae: Genealogical and Interacting Particle Systems with Applications. Probability and Its Applications. Springer, New York (2004)
  10. Del Moral, P., Doucet, A., Jasra, A.: Sequential Monte Carlo samplers. J R Stat Soc 68(3), 411–436 (2006)
  11. Del Moral, P., Doucet, A., Jasra, A.: Sequential Monte Carlo for Bayesian computation. Bayesian Stat 8, 115–148 (2007)
  12. Didelot, X., Everitt, R.G., Johansen, A.M., Lawson, D.J.: Likelihood-free estimation of model evidence. Bayesian Anal 6(1), 49–76 (2011)
  13. Drovandi, C.C., Pettitt, A.N., Lee, A.: Bayesian indirect inference using a parametric auxiliary model. Stat Sci 30(1), 72–95 (2015)
  14. Everitt, R.G.: Bayesian parameter estimation for latent Markov random fields and social networks. J Comput Graph Stat 21(4), 940–960 (2012)
  15. Fearnhead, P., Papaspiliopoulos, O., Roberts, G.O., Stuart, A.M.: Random-weight particle filtering of continuous time processes. J R Stat Soc 72(4), 497–512 (2010)
  16. Friel, N.: Evidence and Bayes factor estimation for Gibbs random fields. J Comput Graph Stat 22(3), 518–532 (2013)
  17. Friel, N., Rue, H.: Recursive computing and simulation-free inference for general factorizable models. Biometrika 94(3), 661–672 (2007)
  18. Girolami, M.A., Lyne, A.M., Strathmann, H., Simpson, D., Atchade, Y.: Playing Russian roulette with intractable likelihoods (2013). arXiv:1306.4032
  19. Grelaud, A., Robert, C.P., Marin, J.M.: ABC likelihood-free methods for model choice in Gibbs random fields. Bayesian Anal 4(2), 317–336 (2009)
  20. Johndrow, J.E., Mattingly, J.C., Mukherjee, S., Dunson, D.: Approximations of Markov chains and high-dimensional Bayesian inference (2015). arXiv:1508.03387
  21. Klaas, M., de Freitas, N., Doucet, A.: Toward practical \(N^2\) Monte Carlo: the marginal particle filter. In: Proceedings of the 20th International Conference on Uncertainty in Artificial Intelligence (2005)
  22. Kong, A., Liu, J.S., Wong, W.H.: Sequential imputations and Bayesian missing data problems. J Am Stat Assoc 89(425), 278–288 (1994)
  23. Lee, A., Whiteley, N.: Variance estimation and allocation in the particle filter (2015). arXiv:2015.0394
  24. Marin, J.M., Pillai, N.S., Robert, C.P., Rousseau, J.: Relevant statistics for Bayesian model choice. J R Stat Soc 76(5), 833–859 (2014)
  25. Marjoram, P., Molitor, J., Plagnol, V., Tavaré, S.: Markov chain Monte Carlo without likelihoods. Proc Natl Acad Sci USA 100(26), 15324–15328 (2003)
  26. Meng, X.L., Wong, W.H.: Simulating ratios of normalizing constants via a simple identity: a theoretical exploration. Stat Sin 6, 831–860 (1996)
  27. Møller, J., Pettitt, A.N., Reeves, R.W., Berthelsen, K.K.: An efficient Markov chain Monte Carlo method for distributions with intractable normalising constants. Biometrika 93(2), 451–458 (2006)
  28. Murray, I., Ghahramani, Z., MacKay, D.J.C.: MCMC for doubly-intractable distributions. In: Proceedings of the 22nd Annual Conference on Uncertainty in Artificial Intelligence (UAI), pp. 359–366 (2006)
  29. Neal, R.M.: Annealed importance sampling. Stat Comput 11(2), 125–139 (2001)
  30. Neal, R.M.: Estimating ratios of normalizing constants using linked importance sampling (2005). arXiv:0511.1216
  31. Nicholls, G.K., Fox, C., Watt, A.M.: Coupled MCMC with a randomized acceptance probability (2012). arXiv:1205.6857
  32. Peters, G.W.: Topics in sequential Monte Carlo samplers. M.Sc. thesis, University of Cambridge (2005)
  33. Picchini, U., Forman, J.L.: Accelerating inference for diffusions observed with measurement error and large sample sizes using approximate Bayesian computation: a case study (2013). arXiv:1310.0973
  34. Prangle, D., Fearnhead, P., Cox, M.P., Biggs, P.J., French, N.P.: Semi-automatic selection of summary statistics for ABC model choice. Stat Appl Genet Mol Biol 13(1), 67–82 (2014)
  35. Rao, V., Lin, L., Dunson, D.B.: Bayesian inference on the Stiefel manifold (2013). arXiv:1311.0907
  36. Robert, C.P., Cornuet, J.M., Marin, J.M., Pillai, N.S.: Lack of confidence in approximate Bayesian computation model choice. Proc Natl Acad Sci USA 108(37), 15112–15117 (2011)
  37. Schweinberger, M., Handcock, M.S.: Local dependence in random-graph models: characterization, properties and statistical inference. J R Stat Soc 77, 647–676 (2015)
  38. Sisson, S.A., Fan, Y., Tanaka, M.M.: Sequential Monte Carlo without likelihoods. Proc Natl Acad Sci USA 104(6), 1760–1765 (2007)
  39. Skilling, J.: Nested sampling for general Bayesian computation. Bayesian Anal 1(4), 833–859 (2006)
  40. Tavaré, S., Balding, D.J., Griffiths, R.C., Donnelly, P.J.: Inferring coalescence times from DNA sequence data. Genetics 145(2), 505–518 (1997)
  41. Tran, M.N., Scharth, M., Pitt, M.K., Kohn, R.: \(\text{IS}^2\) for Bayesian inference in latent variable models (2013). arXiv:1309.3339
  42. Whiteley, N.: Stability properties of some particle filters. Ann Appl Probab 23(6), 2500–2537 (2013)
  43. Wilkinson, R.D.: Approximate Bayesian computation (ABC) gives exact results under the assumption of model error. Stat Appl Genet Mol Biol 12(2), 129–141 (2013)
  44. Wood, S.N.: Statistical inference for noisy nonlinear ecological dynamic systems. Nature 466, 1102–1104 (2010)
  45. Zhou, Y., Johansen, A.M., Aston, J.A.D.: Towards automatic model comparison: an adaptive sequential Monte Carlo approach. J Comput Graph Stat (2015, in press)

Copyright information

© Springer Science+Business Media New York 2016

Authors and Affiliations

  • Richard G. Everitt (1)
  • Adam M. Johansen (2)
  • Ellen Rowing (1)
  • Melina Evdemon-Hogan (1)

  1. Department of Mathematics and Statistics, University of Reading, Reading, UK
  2. Department of Statistics, University of Warwick, Coventry, UK
