Statistical Field Theory for Neural Networks

  • Book
  • © 2020

Overview

  • Provides the first self-contained introduction to field theory for neuronal networks
  • Presents the main concepts from field theory that are relevant for network dynamics, including diagrammatic techniques and systematic perturbative and fluctuation expansions
  • Introduces advanced concepts, such as the effective action formalism, in a mathematically minimal setting
  • Includes in-depth derivations of classical seminal works and recent developments, such as the dynamical mean-field theory and chaos

Part of the book series: Lecture Notes in Physics (LNP, volume 970)

About this book

This book presents a self-contained introduction to techniques from field theory applied to stochastic and collective dynamics in neuronal networks. These powerful analytical techniques, well established in other fields of physics, form the basis of current developments and offer solutions to pressing open problems in theoretical neuroscience as well as in machine learning. They enable a systematic and quantitative understanding of the dynamics of recurrent and stochastic neuronal networks.

This book is intended for physicists, mathematicians, and computer scientists. It is designed for self-study by researchers who want to enter the field, or as the main text for a one-semester course at the advanced undergraduate or graduate level. The theoretical concepts are developed systematically from the very beginning, requiring only basic knowledge of analysis and linear algebra.

Authors and Affiliations

  • Institute of Neuroscience and Medicine (INM-6), Forschungszentrum Jülich, Jülich, Germany; Faculty of Physics, RWTH Aachen University, Aachen, Germany

    Moritz Helias

  • Institute of Neuroscience and Medicine (INM-6), Forschungszentrum Jülich, Jülich, Germany

    David Dahmen

About the authors

Moritz Helias is a group leader at the Jülich Research Centre and assistant professor in the Department of Physics at RWTH Aachen University, Germany. He obtained his diploma in theoretical solid-state physics at the University of Hamburg and his PhD in computational neuroscience at the University of Freiburg, Germany. Post-doctoral positions at RIKEN, Wako-shi, Japan, and at the Jülich Research Centre followed. His main research interests are neuronal network dynamics and function, and their quantitative analysis with tools from statistical physics and field theory.


David Dahmen is a post-doctoral researcher at the Institute of Neuroscience and Medicine at the Jülich Research Centre, Germany. He obtained his Master's degree in physics from RWTH Aachen University, Germany, working on effective field theory approaches to particle physics. He then moved to the field of computational neuroscience, where he received his PhD in 2017. His research comprises the modeling, analysis, and simulation of recurrent neuronal networks, with a special focus on the development and transfer of mathematical tools and simulation concepts. His main interests are field-theoretic methods for random neural networks, correlations in recurrent networks, and modeling of the local field potential.
