Neural Network Generating Hidden Markov Chain
In this paper we introduce a technique by which a neural network can generate a Hidden Markov Chain. We use a neural network called the Temporal Information Categorizing and Learning Map, an enhanced version of the standard Categorizing and Learning Module (CALM). Our modifications include a Euclidean metric instead of the weighted sum formerly used for categorization of the input space. Construction of the Hidden Markov Chain is achieved by turning fixed-weight internal synapses into associative learning synapses. Results obtained from testing on simple artificial data promise applicability in a real problem domain. We present a visualization technique for the obtained Hidden Markov Chain and a method by which the results can be validated. Further experiments are in progress.
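The core idea described above can be sketched in code. This is a hypothetical illustration, not the paper's actual implementation: inputs are categorized by Euclidean distance to module prototypes (rather than a weighted sum), and Hidden Markov Chain transition probabilities are then estimated associatively from the sequence of winning categories. All function names, the prototype representation, and the counting scheme are assumptions made for this sketch.

```python
import numpy as np

# Hypothetical sketch of the abstract's idea: Euclidean-metric
# categorization plus associative (co-occurrence) estimation of a
# Hidden Markov Chain transition matrix.  Names and parameters are
# illustrative, not taken from the paper.

def categorize(x, prototypes):
    """Return the index of the prototype closest to x (Euclidean metric)."""
    dists = np.linalg.norm(prototypes - x, axis=1)
    return int(np.argmin(dists))

def transition_matrix(inputs, prototypes):
    """Accumulate winner-to-winner transitions into a row-stochastic matrix."""
    k = len(prototypes)
    counts = np.zeros((k, k))
    winners = [categorize(x, prototypes) for x in inputs]
    for prev, cur in zip(winners, winners[1:]):
        counts[prev, cur] += 1.0  # associative strengthening of the link
    rows = counts.sum(axis=1, keepdims=True)
    # Rows with no observed transitions fall back to a uniform distribution.
    return np.where(rows > 0, counts / np.maximum(rows, 1e-12), 1.0 / k)

# Toy example: two prototypes, a short alternating input sequence.
protos = np.array([[0.0, 0.0], [1.0, 1.0]])
seq = np.array([[0.1, 0.0], [0.9, 1.0], [0.0, 0.1], [1.0, 0.9]])
A = transition_matrix(seq, protos)  # deterministic alternation: A = [[0, 1], [1, 0]]
```

For the alternating toy sequence the winners are 0, 1, 0, 1, so the estimated chain places all probability mass on the cross transitions; real data would yield a denser matrix.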
Keywords: Neural Network · Hidden Markov Model · Input Vector · Input Space · Arousal Process