Journal of Statistical Physics, Volume 153, Issue 2, pp 211–269

Large Deviations, Dynamics and Phase Transitions in Large Stochastic and Disordered Neural Networks

  • Tanguy Cabana
  • Jonathan Touboul


Neuronal networks are characterized by highly heterogeneous connectivity, and this disorder has recently been related experimentally to qualitative properties of the network. The motivation of this paper is to analyze mathematically the role of these disordered connectivities in the large-scale properties of neuronal networks. To this end, we analyze large-scale limit behaviors of neural networks that include, for biological relevance, multiple populations, random connectivities and interaction delays. Because of the randomness of the connectivity, usual mean-field methods (e.g. coupling) cannot be applied; instead, similarly to studies developed for spin glasses, we show that the sequences of empirical measures satisfy a large deviation principle and converge towards a self-consistent non-Markovian process. From a mathematical viewpoint, the proof differs from previous works in that we work in infinite-dimensional spaces (owing to the interaction delays) and consider multiple cell types. The limit obtained formally characterizes the macroscopic behavior of the network. We propose a dynamical-systems approach to address the qualitative nature of the solutions of these very complex equations, and apply this methodology to three instances in order to show how networks with non-centered coefficients, interaction delays and multiple populations are affected by the level of disorder. We identify a number of phase transitions in such systems upon changes in delays, connectivity patterns and dispersion, and particularly focus on the emergence of non-equilibrium states involving synchronized oscillations.


Keywords: Heterogeneous neuronal networks · Large deviations · Mean-field equations · Phase transitions
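The class of networks studied in the abstract (random Gaussian connectivity, interaction delays, additive noise) can be illustrated with a minimal Euler–Maruyama simulation sketch. This is not the authors' code: the single population, the tanh transfer function, the weight scaling J_ij ~ N(J_mean/N, J_std²/N) and all parameter values are illustrative assumptions chosen to match the standard random rate-network setting.

```python
import numpy as np

def simulate_network(N=200, T=50.0, dt=0.05, tau_delay=0.5,
                     J_mean=0.0, J_std=1.0, noise=0.1, seed=0):
    """Euler-Maruyama simulation of a delayed random rate network:
        dx_i = (-x_i + sum_j J_ij tanh(x_j(t - tau_delay))) dt + noise dW_i,
    with i.i.d. Gaussian weights J_ij ~ N(J_mean/N, J_std^2/N).
    Returns the time grid and the trajectory array of shape (steps+1, N)."""
    rng = np.random.default_rng(seed)
    # Disordered connectivity: mean and variance scaled with network size N
    J = rng.normal(J_mean / N, J_std / np.sqrt(N), size=(N, N))
    steps = round(T / dt)
    d = max(1, round(tau_delay / dt))  # interaction delay in time steps
    x = np.zeros((steps + 1, N))
    x[0] = rng.normal(0.0, 0.5, N)    # random initial condition
    for k in range(steps):
        delayed = x[max(0, k - d)]    # x(t - tau); frozen past for t < tau
        drift = -x[k] + J @ np.tanh(delayed)
        x[k + 1] = x[k] + drift * dt + noise * np.sqrt(dt) * rng.normal(size=N)
    t = np.arange(steps + 1) * dt
    return t, x

t, x = simulate_network()
# First moment of the empirical measure across neurons at each time
emp_mean = x.mean(axis=1)
```

In this mean-field picture, quantities such as `emp_mean` are the finite-N analogues of the empirical measures whose large-N limit the paper characterizes; increasing `J_std` or `tau_delay` is the kind of parameter change under which the phase transitions discussed in the abstract occur.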



Copyright information

© Springer Science+Business Media New York 2013

Authors and Affiliations

The Mathematical Neuroscience Laboratory, Collège de France/CIRB and INRIA Bang Laboratory, Paris, France
