Statistics and Computing, 19:329

Variational Bayes for estimating the parameters of a hidden Potts model

  • C. A. McGrory (corresponding author)
  • D. M. Titterington
  • R. Reeves
  • A. N. Pettitt

Abstract

Hidden Markov random field models provide an appealing representation of images and other spatial problems. The drawback is that inference is not straightforward for these models as the normalisation constant for the likelihood is generally intractable except for very small observation sets. Variational methods are an emerging tool for Bayesian inference and they have already been successfully applied in other contexts. Focusing on the particular case of a hidden Potts model with Gaussian noise, we show how variational Bayesian methods can be applied to hidden Markov random field inference. To tackle the obstacle of the intractable normalising constant for the likelihood, we explore alternative estimation approaches for incorporation into the variational Bayes algorithm. We consider a pseudo-likelihood approach as well as the more recent reduced dependence approximation of the normalisation constant. To illustrate the effectiveness of these approaches we present empirical results from the analysis of simulated datasets. We also analyse a real dataset and compare results with those of previous analyses as well as those obtained from the recently developed auxiliary variable MCMC method and the recursive MCMC method. Our results show that the variational Bayesian analyses can be carried out much faster than the MCMC analyses and produce good estimates of model parameters. We also found that the reduced dependence approximation of the normalisation constant outperformed the pseudo-likelihood approximation in our analysis of real and synthetic datasets.
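
For orientation, the following is a minimal sketch of the standard forms behind the terms used in the abstract; the notation is generic and may not match the exact parameterisation adopted in the paper. A K-state Potts model on a lattice with hidden labels z = (z_1, ..., z_n) and interaction parameter β has

\[
p(z \mid \beta) = \frac{1}{C(\beta)} \exp\Big(\beta \sum_{i \sim j} \delta(z_i, z_j)\Big),
\qquad
C(\beta) = \sum_{z} \exp\Big(\beta \sum_{i \sim j} \delta(z_i, z_j)\Big),
\]

where i ∼ j ranges over neighbouring sites and δ is the Kronecker delta. The normalising constant C(β) is a sum over K^n label configurations, which is what makes exact likelihood-based inference infeasible beyond very small lattices. The pseudo-likelihood approximation replaces the joint distribution by a product of full conditionals,

\[
p(z \mid \beta) \;\approx\; \prod_{i=1}^{n} p(z_i \mid z_{\partial i}, \beta)
= \prod_{i=1}^{n} \frac{\exp\big(\beta \sum_{j \in \partial i} \delta(z_i, z_j)\big)}
{\sum_{k=1}^{K} \exp\big(\beta \sum_{j \in \partial i} \delta(k, z_j)\big)},
\]

where ∂i denotes the neighbours of site i, so each factor requires only a K-term sum. In the variational Bayesian setting, an approximating distribution q(z, θ) over the hidden labels and model parameters is chosen to maximise the lower bound

\[
\log p(y) \;\geq\; \mathbb{E}_{q}\big[\log p(y, z, \theta)\big] - \mathbb{E}_{q}\big[\log q(z, \theta)\big],
\]

with q typically factorised between z and θ.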

Keywords

Potts/Ising model · Hidden Markov random field · Variational approximation · Bayesian inference · Pseudo-likelihood · Reduced dependence approximation


Copyright information

© Springer Science+Business Media, LLC 2008

Authors and Affiliations

  • C. A. McGrory (1) (corresponding author)
  • D. M. Titterington (2)
  • R. Reeves (1)
  • A. N. Pettitt (1)
  1. School of Mathematical Sciences, Queensland University of Technology, Brisbane, Australia
  2. University of Glasgow, Glasgow, UK