
Computational Properties of Probabilistic Neural Networks

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 6354)

Abstract

We discuss the problem of overfitting of probabilistic neural networks in the framework of statistical pattern recognition. The probabilistic approach to neural networks provides a statistically justified subspace method of classification. The underlying structural mixture model includes binary structural parameters and can be optimized by the EM algorithm in full generality. Formally, the structural model reduces the number of estimated parameters, so the structural mixtures become less complex and less prone to overfitting. We illustrate how recognition accuracy and the effect of overfitting are influenced by mixture complexity and by the size of the training data set.
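The structural mixture idea in the abstract can be illustrated with a toy sketch: fit a Bernoulli mixture to binary data with EM, then set binary structural parameters that keep only the component-specific features differing from a common background model. This is an assumption-laden illustration, not the authors' exact model; in particular, the KL-divergence threshold below stands in for the paper's information-theoretic selection criterion.

```python
import numpy as np

rng = np.random.default_rng(0)

def em_bernoulli_mixture(X, K, n_iter=50):
    """Plain EM for a K-component Bernoulli mixture on binary data X (N x D)."""
    N, D = X.shape
    w = np.full(K, 1.0 / K)                       # component weights
    theta = rng.uniform(0.25, 0.75, size=(K, D))  # Bernoulli parameters per component
    for _ in range(n_iter):
        # E-step: responsibilities from log-likelihoods (numerically stable)
        log_p = (X @ np.log(theta).T
                 + (1 - X) @ np.log(1 - theta).T
                 + np.log(w))
        log_p -= log_p.max(axis=1, keepdims=True)
        r = np.exp(log_p)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights and parameters
        Nk = r.sum(axis=0)
        w = Nk / N
        theta = np.clip((r.T @ X) / Nk[:, None], 1e-3, 1 - 1e-3)
    return w, theta

# Toy binary data: two clusters with disjoint sets of active features
X = np.vstack([
    (rng.random((100, 8)) < [.9, .9, .9, .9, .1, .1, .1, .1]).astype(float),
    (rng.random((100, 8)) < [.1, .1, .1, .1, .9, .9, .9, .9]).astype(float),
])
w, theta = em_bernoulli_mixture(X, K=2)

# Binary "structural" parameters: retain a feature in a component only when its
# component-specific distribution diverges enough from the shared background
# model (the 0.1 KL threshold is an illustrative stand-in, not the paper's rule).
bg = X.mean(axis=0)
kl = (theta * np.log(theta / bg)
      + (1 - theta) * np.log((1 - theta) / (1 - bg)))
phi = (kl > 0.1).astype(int)  # one binary structural parameter per (component, feature)
print(phi.sum(), "of", phi.size, "parameters retained")
```

Zeroed entries of `phi` would fall back to the background model, which is how a structural mixture reduces the effective parameter count and, as the abstract argues, its susceptibility to overfitting.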




Copyright information

© 2010 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Grim, J., Hora, J. (2010). Computational Properties of Probabilistic Neural Networks. In: Diamantaras, K., Duch, W., Iliadis, L.S. (eds) Artificial Neural Networks – ICANN 2010. ICANN 2010. Lecture Notes in Computer Science, vol 6354. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-15825-4_4


  • DOI: https://doi.org/10.1007/978-3-642-15825-4_4

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-15824-7

  • Online ISBN: 978-3-642-15825-4

  • eBook Packages: Computer Science (R0)
