Evolving Recurrent Neural Networks

Conference paper, in Artificial Neural Nets and Genetic Algorithms

Abstract

An evolutionary algorithm which allows entities to increase and decrease in complexity during the evolutionary process is applied to recurrent neural networks. Recognition of various regular languages provides a suitable set of test problems.
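Only the abstract is available on this page, so the sketch below is an illustration of the general idea it describes, not the authors' algorithm: a population of recurrent nets whose hidden-layer size can grow and shrink under mutation, selected for recognizing a regular language. Every concrete choice here (the even-parity test language, the mutation rates, truncation selection, helper names such as make_net) is an assumption made for the example.

```python
# Hypothetical sketch: evolving variable-size recurrent nets to recognize
# a regular language (even parity of 1s). The grow/shrink mutations stand
# in for the paper's complexity-changing operators; details are assumed.
import numpy as np

rng = np.random.default_rng(0)

def make_net(n):
    """Random recurrent net with n hidden units."""
    return {"W": 0.5 * rng.standard_normal((n, n)),   # hidden -> hidden
            "U": 0.5 * rng.standard_normal(n),        # input  -> hidden
            "b": 0.5 * rng.standard_normal(n),        # hidden bias
            "v": 0.5 * rng.standard_normal(n)}        # hidden -> output

def accepts(net, string):
    """Run the net over a 0/1 string; sign of the output = accept/reject."""
    h = np.zeros(len(net["v"]))
    for sym in string:
        h = np.tanh(net["W"] @ h + net["U"] * sym + net["b"])
    return net["v"] @ h > 0

def fitness(net, samples):
    return np.mean([accepts(net, s) == label for s, label in samples])

def mutate(net):
    """Grow, shrink, or perturb -- complexity can both rise and fall."""
    n, r = len(net["v"]), rng.random()
    if r < 0.1:                        # add one hidden unit
        m = make_net(n + 1)
        m["W"][:n, :n] = net["W"]      # keep old weights, new unit random
        for k in ("U", "b", "v"):
            m[k][:n] = net[k]
        return m
    if r < 0.2 and n > 1:              # remove one random hidden unit
        keep = np.arange(n) != rng.integers(n)
        return {"W": net["W"][np.ix_(keep, keep)],
                "U": net["U"][keep], "b": net["b"][keep],
                "v": net["v"][keep]}
    return {k: w + 0.1 * rng.standard_normal(w.shape)  # weight jitter
            for k, w in net.items()}

# Test language: strings over {0,1} with an even number of 1s (regular).
samples = [(s, s.sum() % 2 == 0)
           for s in (rng.integers(0, 2, size=rng.integers(1, 9))
                     for _ in range(60))]

pop = [make_net(2) for _ in range(30)]
for gen in range(200):                 # truncation selection + mutation
    pop.sort(key=lambda net: -fitness(net, samples))
    pop = pop[:10] + [mutate(pop[i % 10]) for i in range(20)]

best = max(pop, key=lambda net: fitness(net, samples))
print("accuracy:", fitness(best, samples), "hidden units:", len(best["v"]))
```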





Copyright information

© 1993 Springer-Verlag/Wien

About this paper

Cite this paper

Lindgren, K., Nilsson, A., Nordahl, M.G., Råde, I. (1993). Evolving Recurrent Neural Networks. In: Albrecht, R.F., Reeves, C.R., Steele, N.C. (eds) Artificial Neural Nets and Genetic Algorithms. Springer, Vienna. https://doi.org/10.1007/978-3-7091-7533-0_9

  • DOI: https://doi.org/10.1007/978-3-7091-7533-0_9

  • Publisher Name: Springer, Vienna

  • Print ISBN: 978-3-211-82459-7

  • Online ISBN: 978-3-7091-7533-0
