Connectionist Approaches to Language Learning

  • David Touretzky

Table of contents

  1. Front Matter
    Pages i-iv
  2. David S. Touretzky
    Pages 1-3
  3. Sara Porat, Jerome A. Feldman
    Pages 5-34
  4. David Servan-Schreiber, Axel Cleeremans, James L. McClelland
    Pages 57-89
  5. Jordan B. Pollack
    Pages 123-148
  6. Back Matter
    Pages 149-149

About this book


arise automatically as a result of the recursive structure of the task and the continuous nature of the SRN's state space. Elman also introduces a new graphical technique for studying network behavior based on principal components analysis. He shows that sentences with multiple levels of embedding produce state space trajectories with an intriguing self-similar structure.

The development and shape of a recurrent network's state space is the subject of Pollack's paper, the most provocative in this collection. Pollack looks more closely at a connectionist network as a continuous dynamical system. He describes a new type of machine learning phenomenon: induction by phase transition. He then shows that under certain conditions, the state space created by these machines can have a fractal or chaotic structure, with a potentially infinite number of states. This is graphically illustrated using a higher-order recurrent network trained to recognize various regular languages over binary strings. Finally, Pollack suggests that it might be possible to exploit the fractal dynamics of these systems to achieve a generative capacity beyond that of finite-state machines.
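The graphical technique attributed to Elman above — projecting a recurrent network's hidden-state trajectories onto their principal components — can be illustrated with a minimal sketch. This is not code from the book: it assumes numpy, uses a small Elman-style SRN with random, untrained weights, and feeds it all binary strings of length 5 as a stand-in for the regular languages Pollack studies, so the resulting trajectories demonstrate only the technique, not learned structure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Elman-style simple recurrent network: h_t = tanh(W x_t + U h_{t-1}).
# Weights are random (untrained) -- this sketch only illustrates the analysis.
n_in, n_hid = 2, 8
W = rng.normal(scale=0.5, size=(n_hid, n_in))
U = rng.normal(scale=0.5, size=(n_hid, n_hid))

def run_srn(string):
    """Feed a binary string through the SRN; return the hidden-state trajectory."""
    h = np.zeros(n_hid)
    states = []
    for ch in string:
        x = np.eye(n_in)[int(ch)]      # one-hot encode '0' / '1'
        h = np.tanh(W @ x + U @ h)
        states.append(h.copy())
    return np.array(states)

# Collect hidden states over all binary strings of length 5.
strings = [format(i, "05b") for i in range(32)]
H = np.vstack([run_srn(s) for s in strings])   # (160, 8): 32 strings x 5 steps

# Principal components analysis of the hidden states: center, take the
# eigenvectors of the covariance matrix, project onto the top two components.
H_centered = H - H.mean(axis=0)
cov = np.cov(H_centered, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)         # eigh returns ascending order
pcs = eigvecs[:, ::-1][:, :2]                  # top-2 principal components
projected = H_centered @ pcs                   # 2-D trajectory coordinates

print(H.shape, projected.shape)
```

Plotting `projected` row by row, grouped by input string, gives the kind of state-space trajectory picture the blurb describes; with a trained higher-order network the same projection is what reveals the fractal structure Pollack discusses.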


Keywords

automata, behavior, dynamical systems, learning, machine learning, networks

Editors and affiliations

  • David Touretzky, Carnegie Mellon University, USA

Bibliographic information

  • Copyright Information Kluwer Academic Publishers 1991
  • Publisher Name Springer, Boston, MA
  • eBook Packages Springer Book Archive
  • Print ISBN 978-1-4613-6792-5
  • Online ISBN 978-1-4615-4008-3
  • Series Print ISSN 0893-3405