
Delayed Learning on Internal Memory Network and Organizing Internal States

  • Toshinori Deguchi
  • Naohiro Ishii
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3971)

Abstract

Elman presented a network with a context layer for time-series processing. The context layer keeps the output of the hidden layer and feeds it back into the hidden layer at the next step of the time-series calculation. In this paper, the context layer is reformed into an internal memory layer, which is connected from the hidden layer through trainable connection weights that form the internal memory. This internal memory then plays an important role in learning the time series. We developed a new learning algorithm, called time-delayed back-propagation learning, for the internal memory. The ability of the network with the internal memory layer is demonstrated by applying it to a simple sinusoidal time series.
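
To make the architectural difference concrete, the sketch below contrasts Elman's fixed one-to-one context copy with the trainable internal memory layer described above. This is a minimal Python illustration under stated assumptions: the class name, the weight names (W_ih, W_mh, W_hm, W_ho), the layer sizes, and the sigmoid activations are ours, and the memory update is assumed to be a learnable mapping from the hidden layer. The paper's time-delayed back-propagation rule for training W_hm is not reproduced here.

    import numpy as np

    rng = np.random.default_rng(0)

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    class InternalMemoryNetwork:
        # Elman-style network whose one-to-one context copy is replaced by
        # an internal memory layer fed from the hidden layer through the
        # trainable weights W_hm (names and shapes are illustrative only).
        def __init__(self, n_in, n_hidden, n_memory, n_out):
            self.W_ih = rng.normal(0.0, 0.5, (n_hidden, n_in))      # input  -> hidden
            self.W_mh = rng.normal(0.0, 0.5, (n_hidden, n_memory))  # memory -> hidden
            self.W_hm = rng.normal(0.0, 0.5, (n_memory, n_hidden))  # hidden -> memory
            self.W_ho = rng.normal(0.0, 0.5, (n_out, n_hidden))     # hidden -> output
            self.memory = np.zeros(n_memory)                        # internal state

        def step(self, x):
            # The hidden layer sees the current input plus the stored memory.
            h = sigmoid(self.W_ih @ x + self.W_mh @ self.memory)
            y = sigmoid(self.W_ho @ h)
            # In Elman's network this update would be `self.memory = h`
            # (a fixed copy); here what is remembered is shaped by the
            # learnable weights W_hm.
            self.memory = sigmoid(self.W_hm @ h)
            return y

    # One-step prediction of a simple sinusoidal series, as in the paper's
    # demonstration; the series is scaled into (0, 1) to match the sigmoid output.
    net = InternalMemoryNetwork(n_in=1, n_hidden=8, n_memory=8, n_out=1)
    series = 0.5 * (np.sin(np.linspace(0.0, 4.0 * np.pi, 200)) + 1.0)
    predictions = [net.step(np.array([x])) for x in series[:-1]]  # targets: series[1:]

The design point is that, unlike Elman's architecture, the memory contents are themselves subject to learning; the paper's contribution is the time-delayed back-propagation procedure for adjusting those memory weights, which this forward-only sketch deliberately omits.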

Keywords

Hidden Layer, Output Layer, Connection Weight, Internal Memory, Time Series Prediction


References

  1. Elman, J.L.: Finding Structure in Time. Cognitive Science 14, 179–211 (1990)
  2. Elman, J.L.: Learning and Development in Neural Networks: The Importance of Starting Small. Cognition 48, 71–99 (1993)
  3. Koskela, T., Lehtokangas, M., Saarinen, J., Kaski, K.: Time Series Prediction with Multilayer Perceptron, FIR and Elman Neural Networks. In: Proc. of the World Congress on Neural Networks, pp. 491–496. INNS Press, San Diego (1996)
  4. Cholewo, T.J., Zurada, J.M.: Sequential Network Construction for Time Series Prediction. In: Proc. of the IEEE Intl. Joint Conf. on Neural Networks, Houston, Texas, USA, pp. 2034–2039 (1997)
  5. Giles, C.L., Lawrence, S., Tsoi, A.: Noisy Time Series Prediction Using a Recurrent Neural Network and Grammatical Inference. Machine Learning 44(1/2), 161–183 (2001)
  6. Iwasa, K., Deguchi, T., Ishii, N.: Acquisition of the Time-Series Information in the Network with Internal Memory (in Japanese). IEICE Technical Report NC2001–71, 7–12 (2001)
  7. Rumelhart, D.E., Hinton, G.E., Williams, R.J.: Learning Internal Representations by Error Propagation. In: Rumelhart, D.E., McClelland, J.L. (eds.) Parallel Distributed Processing: Explorations in the Microstructure of Cognition, vol. 1, pp. 318–362. MIT Press, Cambridge (1986)

Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Toshinori Deguchi (1)
  • Naohiro Ishii (2)
  1. Gifu National College of Technology, Gifu, Japan
  2. Aichi Institute of Technology, Aichi, Japan
