Online Symbolic-Sequence Prediction with Discrete-Time Recurrent Neural Networks
This paper studies discrete-time recurrent neural networks for predicting the next symbol in a sequence. The focus is on online prediction, a task considerably harder than classical offline grammatical inference with neural networks. The results show that recurrent networks working online perform acceptably when the sequences come from finite-state machines, or even from some chaotic sources. When predicting texts in human language, however, the dynamics appear too complex to be learned correctly in real time by the network. Two algorithms are considered for network training: real-time recurrent learning and the decoupled extended Kalman filter.
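The online setting described above — the network predicts the next symbol, observes the symbol that actually arrives, and is updated immediately before moving on — can be sketched as follows. This is an illustrative sketch, not the paper's implementation: the gradient is truncated to a single time step, a simplification of the full real-time recurrent learning (RTRL) algorithm the paper studies, and all names (`OnlineRNN`, `step`, the hyperparameters) are invented for this example.

```python
import math
import random

random.seed(0)

def rand_mat(rows, cols, scale=0.3):
    return [[random.uniform(-scale, scale) for _ in range(cols)] for _ in range(rows)]

def softmax(z):
    m = max(z)
    e = [math.exp(v - m) for v in z]
    s = sum(e)
    return [v / s for v in e]

class OnlineRNN:
    """Discrete-time recurrent net trained online, one symbol at a time.
    Gradients are truncated to one step (a simplification of RTRL)."""

    def __init__(self, alphabet, hidden=8, lr=0.3):
        self.sym = {a: i for i, a in enumerate(alphabet)}
        n, h = len(alphabet), hidden
        self.h_size = h
        self.Wxh, self.Whh = rand_mat(h, n), rand_mat(h, h)
        self.Why = rand_mat(n, h)
        self.bh, self.by = [0.0] * h, [0.0] * n
        self.h = [0.0] * h   # recurrent state carried across symbols
        self.lr = lr

    def step(self, x_idx, y_idx):
        """Consume symbol x, predict the next symbol, then update on the
        true next symbol y. Returns True if the prediction was correct."""
        n, h = len(self.sym), self.h_size
        x = [0.0] * n
        x[x_idx] = 1.0
        pre = [sum(self.Wxh[i][j] * x[j] for j in range(n))
               + sum(self.Whh[i][j] * self.h[j] for j in range(h))
               + self.bh[i] for i in range(h)]
        h_new = [math.tanh(p) for p in pre]
        logits = [sum(self.Why[k][i] * h_new[i] for i in range(h)) + self.by[k]
                  for k in range(n)]
        p = softmax(logits)
        correct = max(range(n), key=lambda k: p[k]) == y_idx
        # Cross-entropy gradient at the output layer.
        dlog = [p[k] - (1.0 if k == y_idx else 0.0) for k in range(n)]
        # Backpropagate one step through the tanh nonlinearity only.
        dh = [sum(dlog[k] * self.Why[k][i] for k in range(n)) * (1.0 - h_new[i] ** 2)
              for i in range(h)]
        lr = self.lr
        for k in range(n):
            for i in range(h):
                self.Why[k][i] -= lr * dlog[k] * h_new[i]
            self.by[k] -= lr * dlog[k]
        for i in range(h):
            for j in range(n):
                self.Wxh[i][j] -= lr * dh[i] * x[j]
            for j in range(h):
                # Previous hidden state treated as a constant (truncation).
                self.Whh[i][j] -= lr * dh[i] * self.h[j]
            self.bh[i] -= lr * dh[i]
        self.h = h_new
        return correct

# Demo: a cyclic (finite-state) source, predicted symbol by symbol online.
seq = "abc" * 500
net = OnlineRNN(sorted(set(seq)))
hits = [net.step(net.sym[a], net.sym[b]) for a, b in zip(seq, seq[1:])]
acc = sum(hits[-100:]) / 100.0
print("online accuracy over last 100 symbols:", acc)
```

On a sequence from a simple cyclic finite-state source such as `abcabc…`, online accuracy should climb well above the chance level of 1/3 as symbols stream in, mirroring the paper's observation that finite-state sources are tractable for online prediction while human-language text is not.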