
Capacity of Memory and Error Correction Capability in Chaotic Neural Networks with Incremental Learning

Chapter in: Computer and Information Science 2009

Part of the book series: Studies in Computational Intelligence (SCI, volume 208)

Abstract

Neural networks can learn more patterns with incremental learning than with correlative learning. Incremental learning is a method of composing an associative memory using a chaotic neural network. In previous work, it was found that the capacity of the network increases with its size up to some threshold value and decreases beyond that size, and that both the threshold and the capacity varied with two learning parameters. In this paper, the capacity of the networks is investigated while changing one learning parameter. Computer simulations show that the capacity again increases in proportion to the network size, and that the capacity of a network with incremental learning is more than 11 times larger than that of one with correlative learning. The error correction capability is also estimated for a network of 100 neurons.
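For concreteness, the following is a minimal sketch of the kind of network the abstract describes: Aihara-type chaotic neurons [4] combined with an incremental learning rule in the spirit of Watanabe et al. [3], in which a presented pattern strengthens the weights of any neuron whose mutual-interaction term disagrees in sign with its input. All names and parameter values here (kf, kr, alpha, dw, the number of presentation steps) are illustrative assumptions, not the settings used in the paper.

```python
import numpy as np

def output(u, eps=0.015):
    """Steep sigmoidal output function; eps sets the steepness (assumed form)."""
    return np.tanh(u / eps)

class ChaoticNet:
    """Chaotic neural network with Aihara-type neurons (illustrative sketch)."""

    def __init__(self, n, kf=0.5, kr=0.8, alpha=2.0, dw=0.1):
        self.n = n
        self.w = np.zeros((n, n))        # mutual connection weights
        self.eta = np.zeros(n)           # internal state: feedback term
        self.zeta = np.zeros(n)          # internal state: refractoriness
        self.kf, self.kr = kf, kr        # decay parameters of the two terms
        self.alpha, self.dw = alpha, dw  # refractory scaling, learning step

    def step(self, x, ext):
        # Aihara-type neuron: decayed feedback from the other neurons,
        # decayed refractory self-inhibition, and the external input.
        self.eta = self.kf * self.eta + self.w @ x
        self.zeta = self.kr * self.zeta - self.alpha * x + ext
        return output(self.eta + self.zeta)

    def learn_incrementally(self, pattern, steps=50):
        # Present one +/-1 pattern as external input; wherever the
        # mutual-interaction term (eta) disagrees in sign with the input,
        # pull that neuron's weights toward the pattern.
        x = pattern.astype(float)
        for _ in range(steps):
            x = self.step(x, pattern)
            mismatch = self.eta * pattern < 0
            self.w[mismatch] += self.dw * np.outer(pattern, pattern)[mismatch]
        np.fill_diagonal(self.w, 0.0)    # no self-connections

net = ChaoticNet(n=100)
net.learn_incrementally(np.random.default_rng(0).choice([-1, 1], size=100))
```

Under a setup like this, capacity would be estimated by presenting random ±1 patterns one at a time and counting how many remain retrievable during recall; the sketch omits that recall phase and the sweeps over network size and learning parameters that the paper reports.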


References

  1. Asakawa, S., Deguchi, T., Ishii, N.: On-Demand Learning in Neural Network. In: Proc. of the 2nd ACIS Intl. Conf. on Software Engineering, Artificial Intelligence, Networking and Parallel/Distributed Computing, pp. 84–89 (2001)

  2. Deguchi, T., Ishii, N.: On Refractory Parameter of Chaotic Neurons in Incremental Learning. In: Negoita, M.G., Howlett, R.J., Jain, L.C. (eds.) KES 2004. LNCS, vol. 3214, pp. 103–109. Springer, Heidelberg (2004)

  3. Watanabe, M., Aihara, K., Kondo, S.: Automatic Learning in Chaotic Neural Networks. In: Proc. of the 1994 IEEE Symposium on Emerging Technologies and Factory Automation, pp. 245–248 (1994)

  4. Aihara, K., Takabe, T., Toyoda, M.: Chaotic Neural Networks. Phys. Lett. A 144(6–7), 333–340 (1990)

  5. Deguchi, T., Sakai, T., Ishii, N.: On Storage Capacity of Chaotic Neural Networks with Incremental Learning. Memoirs of Gifu National College of Technology (40), pp. 59–62 (in Japanese) (2005)



Copyright information

© 2009 Springer-Verlag Berlin Heidelberg

About this chapter

Cite this chapter

Deguchi, T., Matsuno, K., Kimura, T., Ishii, N. (2009). Capacity of Memory and Error Correction Capability in Chaotic Neural Networks with Incremental Learning. In: Lee, R., Hu, G., Miao, H. (eds) Computer and Information Science 2009. Studies in Computational Intelligence, vol 208. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-01209-9_27


  • DOI: https://doi.org/10.1007/978-3-642-01209-9_27

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-01208-2

  • Online ISBN: 978-3-642-01209-9

  • eBook Packages: Engineering (R0)
