
A Neural Network Structure Evolution Algorithm Based on e, m Projections and Model Selection Criterion

  • Yunhui Liu
  • Siwei Luo
  • Ziang Lv
  • Hua Huang
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3971)

Abstract

According to biological and neurophysiological research, newborn infants undergo a burst of synapse formation during the brain's physiological growth. Many of these nerve connections are subsequently pruned, and the dendrites of neurons change their conformation as infants' cognition develops. Simulating this pruning process, a new neural network structure evolution algorithm is proposed based on e- and m-projections in information geometry and a model selection criterion. The structure evolution process is formulated as iterative e- and m-projections and is terminated by a model selection criterion. Experimental results demonstrate the validity of the algorithm.

Keywords

Bayesian Information Criterion, Minimum Description Length, Model Selection Criterion, Information Geometry, Flat Manifold



Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Yunhui Liu (1)
  • Siwei Luo (1)
  • Ziang Lv (1)
  • Hua Huang (1)
  1. School of Computer and Information Technology, Beijing Jiaotong University, Beijing, China
