Abstract
An evolutionary algorithm which allows entities to increase and decrease in complexity during the evolutionary process is applied to recurrent neural networks. Recognition of various regular languages provides a suitable set of test problems.
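The idea in the abstract can be illustrated with a minimal, hypothetical sketch (not the authors' actual algorithm): a keep-best evolutionary search over fully recurrent networks whose hidden-unit count can grow or shrink through structural mutation, scored on recognition of a simple regular language. Here the language "bit strings with an even number of 1s", the tanh dynamics, and the mutation rates are all illustrative assumptions.

```python
# Hypothetical sketch: evolving recurrent nets of variable size on a regular language.
import math
import random

def target(s):
    # Membership test for the regular language "even number of 1s".
    return s.count('1') % 2 == 0

def random_net(n):
    # Fully recurrent net: W is n x n recurrent weights, u input weights,
    # v readout weights.
    g = random.gauss
    return {'W': [[g(0, 1) for _ in range(n)] for _ in range(n)],
            'u': [g(0, 1) for _ in range(n)],
            'v': [g(0, 1) for _ in range(n)]}

def accepts(net, s):
    # Run the recurrent dynamics over the string and threshold the readout.
    n = len(net['u'])
    h = [0.0] * n
    for c in s:
        x = 1.0 if c == '1' else 0.0
        h = [math.tanh(sum(net['W'][i][j] * h[j] for j in range(n))
                       + net['u'][i] * x) for i in range(n)]
    return sum(net['v'][i] * h[i] for i in range(n)) > 0.0

def fitness(net, samples):
    # Fraction of sample strings classified correctly.
    return sum(accepts(net, s) == target(s) for s in samples) / len(samples)

def grow(net):
    # Structural mutation: add one hidden unit with small random weights.
    g = random.gauss
    for row in net['W']:
        row.append(g(0, 0.1))
    net['W'].append([g(0, 0.1) for _ in range(len(net['u']) + 1)])
    net['u'].append(g(0, 0.1))
    net['v'].append(g(0, 0.1))

def shrink(net):
    # Structural mutation: delete one hidden unit (keep at least one).
    if len(net['u']) > 1:
        k = random.randrange(len(net['u']))
        net['W'].pop(k)
        for row in net['W']:
            row.pop(k)
        net['u'].pop(k)
        net['v'].pop(k)

def mutate(net):
    # Copy the parent, then apply a structural or parametric mutation.
    child = {'W': [row[:] for row in net['W']], 'u': net['u'][:], 'v': net['v'][:]}
    r = random.random()
    if r < 0.1:
        grow(child)
    elif r < 0.2:
        shrink(child)
    else:
        # Parametric mutation: perturb one recurrent weight.
        n = len(child['u'])
        child['W'][random.randrange(n)][random.randrange(n)] += random.gauss(0, 0.5)
    return child

def evolve(samples, generations=200, seed=0):
    # (1+1)-style search: keep the child whenever it is at least as fit,
    # so the best fitness never decreases.
    random.seed(seed)
    best = random_net(2)
    best_f = fitness(best, samples)
    for _ in range(generations):
        child = mutate(best)
        f = fitness(child, samples)
        if f >= best_f:
            best, best_f = child, f
    return best, best_f
```

A usage example: `evolve([bin(i)[2:] for i in range(1, 33)], generations=100)` searches over both network size and weights at once, which is the distinguishing feature the abstract describes, as opposed to evolving weights for a fixed architecture.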
© 1993 Springer-Verlag/Wien
Lindgren, K., Nilsson, A., Nordahl, M.G., Råde, I. (1993). Evolving Recurrent Neural Networks. In: Albrecht, R.F., Reeves, C.R., Steele, N.C. (eds) Artificial Neural Nets and Genetic Algorithms. Springer, Vienna. https://doi.org/10.1007/978-3-7091-7533-0_9
Print ISBN: 978-3-211-82459-7
Online ISBN: 978-3-7091-7533-0