Emerging themes on information theory and Bayesian approach

Editorial

References

  1. Shannon C E. A mathematical theory of communication. Bell System Technical Journal, 1948, 27: 379–423, 623–656
  2. Rao C R. Information and accuracy attainable in the estimation of statistical parameters. Bulletin of the Calcutta Mathematical Society, 1945, 37: 81–91
  3. Kullback S, Leibler R A. On information and sufficiency. Annals of Mathematical Statistics, 1951, 22(1): 79–86
  4. Jaynes E T. Information theory and statistical mechanics. Physical Review, 1957, 106(4): 620–630
  5. Shore J, Johnson R. Properties of cross-entropy minimization. IEEE Transactions on Information Theory, 1981, 27(4): 472–482
  6. Chentsov N N. Statistical Decision Rules and Optimal Inference. Translations of Mathematical Monographs, Vol. 53. American Mathematical Society, 1982
  7. Amari S. Differential-Geometrical Methods in Statistics. Lecture Notes in Statistics. Berlin: Springer-Verlag, 1985
  8. Dempster A P, Laird N M, Rubin D B. Maximum likelihood from incomplete data via the EM algorithm. Journal of the Royal Statistical Society, Series B, 1977, 39(1): 1–38
  9. Yuille A L, Kersten D. Vision as Bayesian inference: analysis by synthesis? Trends in Cognitive Sciences, 2006, 10(7): 301–308
  10. Xu L. YING-YANG machines: a Bayesian-Kullback scheme for unified learning and new results on vector quantization. In: Proceedings of the International Conference on Neural Information Processing (ICONIP95), 1995, 977–988
  11. Hinton G E, Dayan P, Frey B J, Neal R M. The wake-sleep algorithm for unsupervised neural networks. Science, 1995, 268(5214): 1158–1160
  12. Xu L. Temporal BYY learning for state space approach, hidden Markov model and blind source separation. IEEE Transactions on Signal Processing, 2000, 48(7): 2132–2144
  13. Akaike H. A new look at the statistical model identification. IEEE Transactions on Automatic Control, 1974, 19(6): 716–723
  14. Solomonoff R J. A formal theory of inductive inference. Part I. Information and Control, 1964, 7(1): 1–22
  15. Kolmogorov A N. Three approaches to the quantitative definition of information. Problems of Information Transmission, 1965, 1(1): 1–11
  16. Wallace C S, Boulton D M. An information measure for classification. Computer Journal, 1968, 11(2): 185–194
  17. Schwarz G. Estimating the dimension of a model. Annals of Statistics, 1978, 6(2): 461–464
  18. Rissanen J. Modeling by shortest data description. Automatica, 1978, 14: 465–471
  19. MacKay D J C. Bayesian interpolation. Neural Computation, 1992, 4(3): 415–447
  20. McGrory C A, Titterington D M. Variational approximations in Bayesian model selection for finite mixture distributions. Computational Statistics & Data Analysis, 2007, 51(11): 5352–5367

Copyright information

© Higher Education Press and Springer-Verlag Berlin Heidelberg 2010

Authors and Affiliations

  1. Department of Computer Science and Engineering, The Chinese University of Hong Kong, Hong Kong, China
  2. Department of Automation, Tsinghua University, Beijing, China
