Unsupervised Neural Learning Algorithms

  • Chapter

Abstract

This chapter provides a thorough review of classical algorithms for unsupervised neural learning. It begins with a brief introduction to recurrent neural topologies and then presents both binary and continuous Hopfield nets in detail, including their stability analysis and applications. The chapter also gives a detailed overview of adaptive resonance theory and its role in resolving the 'stability-plasticity conflict' of classical pattern recognition. Finally, it introduces fuzzy associative memory neural nets and outlines algorithms for pattern classification with these nets. Concluding remarks are given at the end of the chapter.
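As a concrete illustration of the binary Hopfield net reviewed in this chapter, the minimal sketch below (Python with NumPy; the helper names and the toy bipolar patterns are illustrative assumptions, not taken from the text) stores patterns with the Hebbian outer-product rule and recalls a corrupted pattern by asynchronous threshold updates. The energy function it defines is the quantity whose monotone decrease underlies the stability analysis.

    import numpy as np

    def train_hopfield(patterns):
        """Hebbian (outer-product) storage of bipolar (+1/-1) patterns."""
        n = patterns.shape[1]
        W = np.zeros((n, n))
        for p in patterns:
            W += np.outer(p, p)
        np.fill_diagonal(W, 0)                 # no self-connections
        return W / patterns.shape[0]

    def energy(W, s):
        """Hopfield energy; it never increases under asynchronous updates."""
        return -0.5 * s @ W @ s

    def recall(W, state, max_sweeps=20, seed=0):
        """Asynchronous recall: update one randomly chosen unit at a time."""
        rng = np.random.default_rng(seed)
        s = state.copy()
        for _ in range(max_sweeps):
            changed = False
            for i in rng.permutation(len(s)):
                h = W[i] @ s
                new = s[i] if h == 0 else (1 if h > 0 else -1)   # keep state on a tie
                if new != s[i]:
                    s[i] = new
                    changed = True
            if not changed:                    # stable state (fixed point) reached
                break
        return s

    # Illustrative usage with two toy bipolar patterns (hypothetical data).
    P = np.array([[ 1, -1,  1, -1,  1, -1],
                  [ 1,  1, -1, -1,  1,  1]])
    W = train_hopfield(P)
    noisy = np.array([1, -1, 1, -1, -1, -1])   # P[0] with one unit flipped
    recovered = recall(W, noisy)
    print(recovered)                                 # converges back to P[0]
    print(energy(W, noisy), energy(W, recovered))    # energy drops to a local minimum

Because the weight matrix is symmetric with a zero diagonal, each asynchronous update can only lower (or leave unchanged) the energy, which is why recall settles into a stable state; this is the essence of the stability argument for binary Hopfield nets.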


Copyright information

© 2005 Springer-Verlag Berlin Heidelberg

About this chapter

Cite this chapter

(2005). Unsupervised Neural Learning Algorithms. In: Computational Intelligence. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-27335-2_9

  • DOI: https://doi.org/10.1007/3-540-27335-2_9

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-20898-3

  • Online ISBN: 978-3-540-27335-6

  • eBook Packages: Engineering, Engineering (R0)
