Encyclopedia of Machine Learning

2010 Edition
| Editors: Claude Sammut, Geoffrey I. Webb

Simple Recurrent Network

  • Risto Miikkulainen
Reference work entry
DOI: https://doi.org/10.1007/978-0-387-30164-8_762



The simple recurrent network is a specific version of the Backpropagation neural network that makes it possible to process sequential input and output (Elman, 1990). It is typically a three-layer network in which a copy of the hidden-layer activations is saved and used (in addition to the actual input) as input to the hidden layer in the next time step. The previous hidden layer is fully connected to the hidden layer. Because the network has no recurrent connections per se (only a copy of the activation values), the entire network (including the weights from the previous hidden layer to the hidden layer) can be trained with the backpropagation algorithm as usual. It can be trained to read a sequence of inputs into a target output pattern, to generate a sequence of outputs from a given input pattern, or to map an input sequence to an output sequence (as in predicting the next input). Simple recurrent networks have been...
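The mechanism described above can be sketched in a few lines of NumPy. The sketch below is illustrative, not from the entry itself: it trains a small Elman-style network on a toy next-symbol prediction task (a repeating cycle 0, 1, 2, 3), with the context layer implemented as a saved copy of the previous hidden activations and all weights, including the context-to-hidden weights, updated by ordinary backpropagation with no unrolling through time. All sizes, learning rates, and variable names are assumptions chosen for the example.

```python
import numpy as np

# Minimal sketch of a simple recurrent (Elman) network.
# Task (illustrative): predict the next symbol in the cycle 0, 1, 2, 3.
rng = np.random.default_rng(0)

n_in, n_hid, n_out = 4, 8, 4
W_in = rng.normal(0, 0.5, (n_hid, n_in))    # input -> hidden
W_ctx = rng.normal(0, 0.5, (n_hid, n_hid))  # context (copied hidden) -> hidden
W_out = rng.normal(0, 0.5, (n_out, n_hid))  # hidden -> output
lr = 0.1

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

seq = [0, 1, 2, 3] * 50          # training sequence
eye = np.eye(n_in)               # one-hot encodings

for epoch in range(200):
    context = np.zeros(n_hid)    # saved copy of previous hidden activations
    for t in range(len(seq) - 1):
        x, target = eye[seq[t]], eye[seq[t + 1]]
        # Forward pass: the hidden layer sees the input AND the copied context.
        h = sigmoid(W_in @ x + W_ctx @ context)
        y = sigmoid(W_out @ h)
        # Backward pass: plain backprop. The context is treated as a fixed
        # extra input, so W_ctx is trained exactly like W_in -- there is no
        # true recurrent connection to differentiate through.
        err_y = (y - target) * y * (1 - y)
        err_h = (W_out.T @ err_y) * h * (1 - h)
        W_out -= lr * np.outer(err_y, h)
        W_in -= lr * np.outer(err_h, x)
        W_ctx -= lr * np.outer(err_h, context)
        context = h.copy()       # copy the activations for the next step

# After training, feed in one cycle and read off the predicted next symbols.
context = np.zeros(n_hid)
preds = []
for t in range(4):
    h = sigmoid(W_in @ eye[seq[t]] + W_ctx @ context)
    preds.append(int(np.argmax(sigmoid(W_out @ h))))
    context = h
print(preds)  # predicted successors of the inputs 0, 1, 2, 3
```

Note the key design point from the entry: because the context is an explicit copy rather than a recurrent connection, the gradient never flows backward through time, which keeps training identical to standard backpropagation.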


Recommended Reading

  1. Elman, J. L. (1990). Finding structure in time. Cognitive Science, 14, 179–211.

Copyright information

© Springer Science+Business Media, LLC 2011
