Preprint FZJ-2020-03976

Large Deviation Approach to Random Recurrent Neuronal Networks: Rate Function, Parameter Inference, and Activity Prediction


2020

Abstract: Statistical field theory captures the collective non-equilibrium dynamics of neuronal networks, but it does not address the inverse problem of finding the connectivity that implements a desired dynamics. We here show, for an analytically solvable network model, that the effective action in statistical field theory is identical to the rate function in large deviation theory; using field-theoretical methods, we derive this rate function. It takes the form of a Kullback-Leibler divergence and enables data-driven inference of model parameters and Bayesian prediction of time series.
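The abstract's link between a rate function, a Kullback-Leibler divergence, and parameter inference can be illustrated in a much simpler setting than the paper's network model. The sketch below (all names, the Gaussian toy model, and the grid search are illustrative assumptions, not the paper's method) uses the fact that for i.i.d. Gaussian samples the large-deviation rate of the empirical mean is a KL divergence between Gaussians, and that minimizing this rate function over the model parameter recovers the maximum-likelihood estimate:

```python
# Illustrative sketch, NOT the paper's recurrent-network derivation:
# Sanov/Cramer-type large deviations for i.i.d. Gaussian data, where the
# rate function is a KL divergence and its minimizer over the model
# parameter is the inferred (maximum-likelihood) value.
import numpy as np

def kl_gauss(m1, m2, sigma):
    """KL divergence D( N(m1, sigma^2) || N(m2, sigma^2) ), closed form."""
    return (m1 - m2) ** 2 / (2.0 * sigma ** 2)

rng = np.random.default_rng(0)
sigma = 1.0
data = rng.normal(loc=0.5, scale=sigma, size=10_000)
m_hat = data.mean()  # empirical statistic observed in the "experiment"

# Inference: scan candidate model means and pick the one minimizing the
# rate function; for this toy model that is exactly the sample mean.
mu_grid = np.linspace(-2.0, 2.0, 4001)
rates = kl_gauss(m_hat, mu_grid, sigma)
mu_star = mu_grid[np.argmin(rates)]
print(mu_star)  # close to the true mean 0.5
```

In the paper's setting the same logic applies with the empirical distribution of network activity in place of the sample mean, and the field-theoretic effective action in place of the Gaussian KL divergence.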


Contributing Institute(s):
  1. Computational and Systems Neuroscience (INM-6)
  2. Theoretical Neuroscience (IAS-6)
  3. JARA-Institut Brain structure-function relationships (INM-10)
Research Program(s):
  1. 574 - Theory, modelling and simulation (POF3-574)
  2. 571 - Connectivity and Activity (POF3-571)
  3. HBP SGA2 - Human Brain Project Specific Grant Agreement 2 (785907)
  4. MSNN - Theory of multi-scale neuronal networks (HGF-SMHB-2014-2018)
  5. PhD no Grant - Doctoral researcher without specific funding (PHD-NO-GRANT-20170405)

Appears in the scientific report 2020
Database coverage:
OpenAccess

The record appears in these collections:
Institute Collections > INM > INM-10
Institute Collections > IAS > IAS-6
Institute Collections > INM > INM-6
Document types > Reports > Preprints
Workflow collections > Public records
Publications database
Open Access

 Record created 2020-10-14, last modified 2024-03-13