
Asymptotic Behavior of Stochastic Complexity of Complete Bipartite Graph-Type Boltzmann Machines

  • Yu Nishiyama
  • Sumio Watanabe
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4232)

Abstract

It has been shown that Bayes learning is effective for singular statistical models. However, Bayes learning requires computations involving the Bayes posterior distribution, which incur enormous computational costs. To overcome this problem, the mean field approximation (equivalently, the variational Bayes method) was proposed. Recently, the generalization error and the stochastic complexity under the mean field approximation have been studied theoretically. In this paper, we study complete bipartite graph-type Boltzmann machines and derive an upper bound on the asymptotic stochastic complexity under the mean field approximation.
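As background (not taken from the paper itself), the sketch below recalls the standard definitions of the stochastic complexity and of its mean field (variational Bayes) upper bound. The notation F(n), \overline{F}(n), p(x|w), and \varphi(w) is the generic one used in this literature and is assumed here, not quoted from the paper; a complete bipartite graph-type Boltzmann machine is, roughly, one whose units split into visible and hidden units with couplings only between the two groups.

% Assumed generic notation: X^n = (x_1, ..., x_n) are i.i.d. samples,
% p(x|w) is the learning model, \varphi(w) is the prior.
% The stochastic complexity (Bayes free energy) is
\[
  F(n) = -\log \int \varphi(w) \prod_{i=1}^{n} p(x_i \mid w)\, dw .
\]
% The mean field approximation restricts the posterior to a factorized
% family Q and minimizes the variational free energy, which, because the
% Kullback-Leibler divergence from q to the true posterior is nonnegative,
% upper-bounds F(n):
\[
  \overline{F}(n) = \min_{q \in Q} \int q(w)
    \log \frac{q(w)}{\varphi(w) \prod_{i=1}^{n} p(x_i \mid w)}\, dw
  \;\ge\; F(n).
\]
% Asymptotically \overline{F}(n) grows like \overline{\lambda} \log n
% (up to lower-order and empirical-entropy terms); a result of the kind
% stated in the abstract bounds such a coefficient from above for
% complete bipartite graph-type Boltzmann machines.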

Keywords

Learning Model · Fisher Information · Hidden Unit · Generalization Error · True Distribution



Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Yu Nishiyama (1)
  • Sumio Watanabe (2)
  1. Department of Computational Intelligence and Systems Science, Tokyo Institute of Technology, Yokohama, Japan
  2. Precision and Intelligence Laboratory, Tokyo Institute of Technology, Yokohama, Japan
