Recognizing Connected Digit Strings Using Neural Networks

  • Łukasz Brocki
  • Danijel Koržinek
  • Krzysztof Marasek
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4188)


This paper discusses the use of feed-forward and recurrent Artificial Neural Networks (ANNs) in whole-word speech recognition. A Long Short-Term Memory (LSTM) network has been trained to perform speaker-independent recognition of arbitrary sequences of connected digits in the Polish language, using only acoustic features extracted from speech. It is also shown how to effectively convert the analog network output into binary information about the recognized words. The parameters of this conversion are fine-tuned using artificial evolution.
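The analog-to-binary conversion described above can be sketched as a simple post-processing step: report a word whenever its per-frame output activation stays above a threshold for a minimum number of consecutive frames, with both parameters tuned by an evolutionary search. The parameter names, values, and the (1+λ)-style search below are illustrative assumptions, not the paper's actual procedure.

```python
import random

def binarize_outputs(activations, threshold, min_frames):
    """Convert per-frame word activations into binary word detections.

    A word onset is reported whenever the activation stays at or above
    `threshold` for at least `min_frames` consecutive frames. Both
    parameters are hypothetical stand-ins for the conversion parameters
    the paper tunes with artificial evolution.
    """
    detections = []
    run = 0
    for t, a in enumerate(activations):
        if a >= threshold:
            run += 1
            if run == min_frames:
                detections.append(t - min_frames + 1)  # onset frame index
        else:
            run = 0
    return detections

def evolve_params(score_fn, generations=30, offspring=10, seed=0):
    """Toy evolutionary tuner for the two conversion parameters.

    Mutates the current best (threshold, min_frames) pair and keeps any
    offspring that scores higher on `score_fn` (e.g. word accuracy on a
    labeled development set). A sketch of the tuning idea only.
    """
    rng = random.Random(seed)
    best = (0.5, 3)
    best_score = score_fn(*best)
    for _ in range(generations):
        for _ in range(offspring):
            thr = min(0.99, max(0.01, best[0] + rng.gauss(0, 0.1)))
            mf = max(1, best[1] + rng.choice([-1, 0, 1]))
            s = score_fn(thr, mf)
            if s > best_score:
                best, best_score = (thr, mf), s
    return best
```

For example, with `threshold=0.5` and `min_frames=3`, the activation trace `[0.1, 0.2, 0.9, 0.95, 0.92, 0.1]` yields a single word onset at frame 2.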


Keywords: Hidden Markov Model · Word Recognition · Speech Recognition · Memory Cell · Recurrent Neural Network





Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Łukasz Brocki¹
  • Danijel Koržinek¹
  • Krzysztof Marasek¹
  1. Polish-Japanese Institute of Information Technology, Warsaw, Poland
