Generalization in Learning Multiple Temporal Patterns Using RNNPB
This paper examines the generalization capability of the recurrent neural network with parametric bias (RNNPB) in learning multiple temporal patterns. Our simulation experiments indicated that the RNNPB can learn multiple patterns in a generalized manner by extracting the relational structures shared among the training patterns. However, such generalization was not achieved when the relational structures were complex. Our analysis clarified that qualitative differences appear between the self-organized internal structures of the network in the generalized and non-generalized cases.
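To make the mechanism concrete, the following is a minimal sketch (not the authors' implementation) of an RNNPB in plain NumPy: a simple recurrent network whose hidden layer receives, in addition to the input and recurrent state, a learnable parametric-bias (PB) vector assigned to each training sequence; weights and PB vectors are updated jointly by backpropagation through time. The network sizes, learning rate, and the two sine-wave training patterns are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two simple temporal patterns: sine waves of different frequency.
T = 30
steps = np.arange(T + 1)
patterns = [np.sin(2 * np.pi * f * steps / T) for f in (1.0, 2.0)]

H, P = 8, 2                      # hidden units, PB dimension (assumed sizes)
Wx = rng.normal(0, 0.5, (H, 1))  # input-to-hidden weights
Wh = rng.normal(0, 0.5, (H, H))  # recurrent weights
Wp = rng.normal(0, 0.5, (H, P))  # PB-to-hidden weights
b = np.zeros(H)
Wy = rng.normal(0, 0.5, (1, H))  # hidden-to-output weights
pb = np.zeros((len(patterns), P))  # one PB vector per training pattern

lr = 0.05
for epoch in range(2000):
    for k, seq in enumerate(patterns):
        x, y = seq[:-1], seq[1:]   # one-step prediction task
        hs, h = [np.zeros(H)], np.zeros(H)
        outs = []
        for xt in x:
            h = np.tanh(Wx[:, 0] * xt + Wh @ h + Wp @ pb[k] + b)
            hs.append(h)
            outs.append(Wy[0] @ h)
        outs = np.array(outs)
        # Backprop through time; the PB vector receives gradients
        # exactly like a weight, but is specific to pattern k.
        dWx = np.zeros_like(Wx); dWh = np.zeros_like(Wh)
        dWp = np.zeros_like(Wp); db = np.zeros_like(b)
        dWy = np.zeros_like(Wy); dpb = np.zeros(P)
        dh_next = np.zeros(H)
        for i in reversed(range(len(x))):
            dy = outs[i] - y[i]
            dh = Wy[0] * dy + dh_next
            dz = dh * (1 - hs[i + 1] ** 2)   # tanh derivative
            dWy[0] += dy * hs[i + 1]
            dWx[:, 0] += dz * x[i]
            dWh += np.outer(dz, hs[i])
            dWp += np.outer(dz, pb[k])
            dpb += Wp.T @ dz
            db += dz
            dh_next = Wh.T @ dz
        n = len(x)
        for param, grad in ((Wx, dWx), (Wh, dWh), (Wp, dWp),
                            (b, db), (Wy, dWy)):
            param -= lr * grad / n
        pb[k] -= lr * dpb / n

# After training, each pattern is regenerated under its own PB vector.
mse = []
for k, seq in enumerate(patterns):
    x, y = seq[:-1], seq[1:]
    h, preds = np.zeros(H), []
    for xt in x:
        h = np.tanh(Wx[:, 0] * xt + Wh @ h + Wp @ pb[k] + b)
        preds.append(Wy[0] @ h)
    mse.append(np.mean((np.array(preds) - y) ** 2))
```

In this setup the shared weights capture structure common to all patterns, while the self-organized PB vectors encode what distinguishes each pattern, which is the internal structure the paper's analysis inspects.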