
Theory and Practice of Neural Networks

  • Conference paper

Part of the book series: Informatik-Fachberichte (volume 291)

Abstract

When attempting to apply neural networks to real-world problems, one is confronted with a major difficulty: there is no general theory about which network model to choose and how to set all of its parameters optimally. The large number of publications on neural networks stands in sharp contrast to the lack of means found in the literature for comparing and appraising different systems. Furthermore, most existing applications focus on simple associative multi-layer architectures that are not suited to many aspects of real-world problems, such as time-dependencies between inputs.
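The remark about time-dependencies can be made concrete with a small illustration. The following is a minimal sketch, not taken from the paper, of an Elman-style simple recurrent layer in Python/NumPy: a copy of the previous hidden state is fed back as context, so the response to an input depends on where in the sequence it occurs, which a purely associative feed-forward mapping cannot express. All dimensions and weight values are illustrative assumptions.

# Minimal sketch (not from the paper): an Elman-style recurrent step,
# illustrating how feedback of a context state lets a network react to
# time-dependencies between inputs.  All sizes and weights are assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hidden = 3, 5
W_in  = rng.normal(scale=0.5, size=(n_hidden, n_in))      # input   -> hidden
W_ctx = rng.normal(scale=0.5, size=(n_hidden, n_hidden))  # context -> hidden

def step(x, context):
    # Hidden activation depends on the current input AND the previous state;
    # the returned vector is copied back as the next context.
    return np.tanh(W_in @ x + W_ctx @ context)

context = np.zeros(n_hidden)
for t in range(3):
    x = np.array([1.0, 0.0, 0.0])   # the same input at every time step
    context = step(x, context)
    print(t, np.round(context, 3))  # hidden state still differs from step to step

Running the loop shows the hidden state changing from step to step even though the input is constant; this temporal sensitivity is exactly what the abstract says simple associative architectures lack.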





Copyright information

© 1991 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Dorffner, G., Prem, E., Ulbricht, C., Wiklicky, H. (1991). Theory and Practice of Neural Networks. In: Brauer, W., Hernández, D. (eds) Verteilte Künstliche Intelligenz und kooperatives Arbeiten. Informatik-Fachberichte, vol 291. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-76980-1_45

  • DOI: https://doi.org/10.1007/978-3-642-76980-1_45

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-54617-7

  • Online ISBN: 978-3-642-76980-1

  • eBook Packages: Springer Book Archive
