Large deviations and mean-field theory for asymmetric random recurrent neural networks
  • Published: May 2002


  • Olivier Moynot1 &
  • Manuel Samuelides2 

Probability Theory and Related Fields, volume 123, pages 41–75 (2002)


Abstract

In this article, we study the asymptotic dynamics of a noisy discrete-time neural network with random asymmetric couplings and thresholds. More precisely, we focus on the limit behaviour of the network as its size grows to infinity over a bounded time horizon. In the case of Gaussian connection weights, we use the same techniques as Ben Arous and Guionnet (see [3]) to prove that the image law of the distribution of the neurons' activation states under the empirical measure satisfies a temperature-free large deviation principle. Moreover, we prove that if the connection weights satisfy a general condition of domination by Gaussian tails, then the distribution of the activation potential of each neuron converges weakly to an explicit Gaussian law, whose characteristics are given by the mean-field equations stated by Cessac-Doyon-Quoy-Samuelides (see [4–6]). Furthermore, under this hypothesis, we obtain a law of large numbers and a propagation-of-chaos result. Finally, we show that many classical distributions on the couplings satisfy our general condition. This paper thus provides rigorous mean-field results for a large class of neural networks currently investigated in the neural-network literature.
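The class of networks described in the abstract can be illustrated numerically. The sketch below is a hypothetical simulation, not taken from the article: the specific update rule (tanh transfer function, parameter values) is an illustrative assumption modelled on the discrete-time random recurrent networks of the Cessac-Doyon-Quoy-Samuelides type, with i.i.d. Gaussian couplings of variance J²/N, random thresholds, and additive dynamical noise. For large N, the mean-field result suggests each neuron's activation potential should look approximately Gaussian.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 2000      # network size (the paper studies the N -> infinity limit)
T = 10        # bounded time horizon, as in the abstract
J = 1.0       # coupling standard-deviation parameter (illustrative)
sigma = 0.1   # amplitude of the dynamical noise (illustrative)

# Asymmetric random couplings: J_ij ~ N(0, J^2 / N), independent, so W != W.T
W = rng.normal(0.0, J / np.sqrt(N), size=(N, N))
# Random thresholds (distribution chosen for illustration only)
theta = rng.normal(0.0, 0.5, size=N)

x = rng.uniform(-1.0, 1.0, size=N)   # initial activation states
for t in range(T):
    # Activation potential of each neuron: coupled input minus threshold plus noise
    u = W @ x - theta + sigma * rng.standard_normal(N)
    # Activation state via a sigmoidal transfer function (assumed here to be tanh)
    x = np.tanh(u)

# Mean-field heuristic: for large N the empirical distribution of the potentials
# (u_1, ..., u_N) should be close to a centred Gaussian law.
print("empirical mean of potentials:", float(u.mean()))
print("empirical std of potentials: ", float(u.std()))
```

This is only a finite-N illustration of the limit theorem; the paper's contribution is the rigorous large deviation principle and propagation-of-chaos result, not a simulation.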


Author information

Authors and Affiliations

  1. Laboratoire de Statistiques et de Probabilités, Université Paul Sabatier, 118 route de Narbonne, 31062 Toulouse Cedex, France

    Olivier Moynot

  2. Office National d'Etudes et de Recherches Aérospatiales, 2 av. E. Belin, 31055 Toulouse Cedex, France. e-mail: Manuel.Samuelides@wanadoo.fr

    Manuel Samuelides


Additional information

Received: 10 January 2000 / Revised version: 15 June 2001 / Published online: 13 May 2002



Cite this article

Moynot, O., Samuelides, M. Large deviations and mean-field theory for asymmetric random recurrent neural networks. Probab Theory Relat Fields 123, 41–75 (2002). https://doi.org/10.1007/s004400100182



Keywords

  • Neural Network
  • Connection Weight
  • Recurrent Neural Network
  • Classical Distribution
  • Deviation Principle