Theoretical Advances in Neural Computation and Learning

  • Vwani Roychowdhury
  • Kai-Yeung Siu
  • Alon Orlitsky

Table of contents

  1. Front Matter
    Pages i-xxiv
  2. Computational Complexity of Neural Networks

    1. Front Matter
      Pages 1-1
    2. Vwani Roychowdhury, Kai-Yeung Siu, Alon Orlitsky
      Pages 3-36
    3. Santosh S. Venkatesh
      Pages 173-240
  3. Learning and Neural Networks

    1. Front Matter
      Pages 241-241
    2. Bhaskar DasGupta, Hava T. Siegelmann, Eduardo Sontag
      Pages 357-389
    3. Babak Hassibi, Ali H. Sayed, Thomas Kailath
      Pages 425-447
  4. Back Matter
    Pages 463-468

About this book

Introduction

For any research field to have a lasting impact, there must be a firm theoretical foundation. Neural networks research is no exception. Some of the foundational concepts, established several decades ago, led to the early promise of developing machines exhibiting intelligence. The motivation for studying such machines comes from the fact that the brain is far more efficient at visual processing and speech recognition than existing computers. Undoubtedly, neurobiological systems employ very different computational principles. The study of artificial neural networks aims at understanding these computational principles and applying them in the solution of engineering problems. Due to recent advances in both device technology and computational science, we are currently witnessing explosive growth in the study of neural networks and their applications.

It may take many years before we have a complete understanding of the mechanisms of neural systems. Before this ultimate goal can be achieved, answers are needed to important fundamental questions such as: (a) what can neural networks do that traditional computing techniques cannot; (b) how does the complexity of the network required for an application relate to the complexity of that problem; and (c) how much training data is required for the resulting network to learn properly? Everyone working in the field has attempted to answer these questions, but general solutions remain elusive. However, encouraging progress in studying specific neural models has been made by researchers from various disciplines.
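Question (c) has a precise formulation in computational learning theory, one of the book's two main themes. As an illustrative sketch (the standard PAC-learning bound, stated here for context rather than quoted from the book), the number of training examples m sufficient to learn a concept class of VC dimension d to accuracy \epsilon with confidence 1 - \delta scales as

```latex
m = O\!\left(\frac{1}{\epsilon}\left(d \log\frac{1}{\epsilon} + \log\frac{1}{\delta}\right)\right)
```

where d measures the expressive capacity of the network architecture; richer architectures (larger d) demand more data, which is the quantitative link between questions (b) and (c).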

Keywords

algorithms · artificial neural network · backpropagation · communication complexity · electrical engineering · filter · learning · learning theory · neural network

Editors and affiliations

  • Vwani Roychowdhury — Purdue University, West Lafayette, USA
  • Kai-Yeung Siu — University of California, Irvine, USA
  • Alon Orlitsky — AT&T Bell Laboratories, New Jersey, USA

Bibliographic information

  • DOI https://doi.org/10.1007/978-1-4615-2696-4
  • Copyright Information Kluwer Academic Publishers 1994
  • Publisher Name Springer, Boston, MA
  • eBook Packages Springer Book Archive
  • Print ISBN 978-1-4613-6160-2
  • Online ISBN 978-1-4615-2696-4