Learning with recurrent neural networks

  • Authors
  • Barbara Hammer

Part of the Lecture Notes in Control and Information Sciences book series (LNCIS, volume 254)

Table of contents

  1. Front Matter
    Pages I-X
  2. Barbara Hammer
    Pages 1-4
  3. Barbara Hammer
    Pages 5-18
  4. Barbara Hammer
    Pages 19-49
  5. Barbara Hammer
    Pages 51-101
  6. Barbara Hammer
    Pages 103-131
  7. Barbara Hammer
    Pages 133-135
  8. Back Matter
    Pages 137-150

About this book

Introduction

Folding networks, a generalisation of recurrent neural networks to tree-structured inputs, are investigated as a mechanism for learning regularities on classical symbolic data. The architecture, the training mechanism, and several applications in different areas are explained. Afterwards, a theoretical foundation is presented which proves that the approach is in principle appropriate as a learning mechanism: The universal approximation ability is investigated, including several new results for standard recurrent neural networks such as explicit bounds on the required number of neurons and the super-Turing capability of sigmoidal recurrent networks. The information-theoretical learnability is examined, including several contributions to distribution-dependent learnability, an answer to an open question posed by Vidyasagar, and a generalisation of the recent luckiness framework to function classes. Finally, the complexity of training is considered, including new results on the loading problem for standard feedforward networks with an arbitrary multilayered architecture, a correlated number of neurons and training-set size, a varying number of hidden neurons but fixed input dimension, or the sigmoidal activation function, respectively.
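
To make the architecture concrete, the sketch below shows the encoding part of a folding network in Python. It is a minimal illustration under assumed names and dimensions (FoldingEncoder, label_dim, state_dim, and the weight initialisation are illustrative choices, not the book's notation): one shared sigmoid layer is applied recursively along the tree structure, with a fixed initial context standing in for empty subtrees.

    import numpy as np

    class FoldingEncoder:
        """Minimal folding-network sketch: trees of fan-out k are encoded
        into fixed-size vectors by recursively applying one shared sigmoid
        layer to a node's label and its children's codes. All names and
        dimensions here are illustrative assumptions."""

        def __init__(self, label_dim, state_dim, fanout, seed=0):
            rng = np.random.default_rng(seed)
            in_dim = label_dim + fanout * state_dim
            self.W = rng.normal(0.0, 0.5, (state_dim, in_dim))  # shared weights
            self.b = np.zeros(state_dim)
            self.y0 = np.zeros(state_dim)  # initial context for empty subtrees
            self.fanout = fanout

        def encode(self, tree):
            """tree = (label_vector, [child_trees]) or None; returns a state vector."""
            if tree is None:
                return self.y0
            label, children = tree
            # Pad the child list to the fixed fan-out k with empty subtrees.
            children = list(children) + [None] * (self.fanout - len(children))
            codes = [self.encode(child) for child in children]
            x = np.concatenate([label] + codes)
            return 1.0 / (1.0 + np.exp(-(self.W @ x + self.b)))  # sigmoid layer

    # Encode the term f(a, b); labels are one-hot vectors over the alphabet {f, a, b}.
    enc = FoldingEncoder(label_dim=3, state_dim=4, fanout=2)
    a = (np.array([0.0, 1.0, 0.0]), [])
    b = (np.array([0.0, 0.0, 1.0]), [])
    code = enc.encode((np.array([1.0, 0.0, 0.0]), [a, b]))  # vector in (0, 1)^4

Sequences are the degenerate case of fan-out 1, so a standard recurrent network is recovered as a special case; in practice such an encoder is combined with an output layer and trained end to end, for example by backpropagation through structure, the tree analogue of backpropagation through time.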

Keywords

Approximation capability · Folding networks · Learnability · Artificial intelligence · Artificial neural networks · Neural networks

Bibliographic information

  • DOI https://doi.org/10.1007/BFb0110016
  • Copyright Information Springer-Verlag London 2000
  • Publisher Name Springer, London
  • eBook Packages Springer Book Archive
  • Print ISBN 978-1-85233-343-0
  • Online ISBN 978-1-84628-567-7
  • Series Print ISSN 0170-8643
  • Series Online ISSN 1610-7411