001     890928
005     20240313094954.0
037 _ _ |a FZJ-2021-01253
100 1 _ |a Helias, Moritz
|0 P:(DE-Juel1)144806
|b 0
|e Corresponding author
|u fzj
111 2 _ |a MILA Seminar
|c online
|g MILA
|w Canada
245 _ _ |a Fluctuations, correlations, chaos: dynamics and computation in recurrent networks
|f 2021-02-16 -
260 _ _ |c 2021
336 7 _ |a Conference Paper
|0 33
|2 EndNote
336 7 _ |a Other
|2 DataCite
336 7 _ |a INPROCEEDINGS
|2 BibTeX
336 7 _ |a LECTURE_SPEECH
|2 ORCID
336 7 _ |a Talk (non-conference)
|b talk
|m talk
|0 PUB:(DE-HGF)31
|s 1628513627_7928
|2 PUB:(DE-HGF)
|x Invited
336 7 _ |a Other
|2 DINI
520 _ _ |a The remarkable properties of information processing by biological and artificial neuronal networks arise from the interaction of large numbers of neurons. A central quest is thus to characterize their collective states. Moreover, the directed coupling between pairs of neurons and their continuous dissipation of energy drive the dynamics of neuronal networks out of thermodynamic equilibrium. Tools from non-equilibrium statistical mechanics and field theory are thus useful to obtain a quantitative understanding. We here present recent progress using such approaches [1]. We show how activity in large, random networks can be described by a unified approach of path integrals and large deviation theory that allows the inference of parameters from data and the prediction of future activity [2]. This approach also allows one to quantify fluctuations around the mean-field theory. These are important to understand why correlations observed between pairs of neurons indicate that the dynamics of cortical networks are poised near a critical point [3]. Close to this transition, we find chaotic dynamics and prolonged sequential memory for past signals [4]. In the chaotic regime, networks offer representations of information whose dimensionality expands with time. We show how this mechanism aids classification performance [5]. Performance in such settings of reservoir computing, moreover, sensitively depends on the way information is fed into the network. Formally unrolling recurrence with the help of Green's functions yields a controlled practical method to optimize reservoir computing [6]. Together these works illustrate the fruitful interplay between theoretical physics, neuronal networks, and neural information processing. References: 1. Helias, Dahmen (2020) Statistical field theory for neural networks. Springer Lecture Notes in Physics. 2. van Meegen, Kuehn, Helias (2020) Large deviation approach to random recurrent neuronal networks: rate function, parameter inference, and activity prediction. arXiv:2009.08889. 3. Dahmen, Grün, Diesmann, Helias (2019) Second type of criticality in the brain uncovers rich multiple-neuron dynamics. PNAS 116 (26), 13051-13060. 4. Schuecker, Goedeke, Helias (2018) Optimal sequence memory in driven random networks. Phys Rev X 8, 041029. 5. Keup, Kuehn, Dahmen, Helias (2020) Transient chaotic dimensionality expansion by recurrent networks. arXiv:2002.11006. 6. Nestler, Keup, Dahmen, Gilson, Rauhut, Helias (2020) Unfolding recurrence by Green's functions for optimized reservoir computing. In Advances in Neural Information Processing Systems 33 (NeurIPS 2020).
536 _ _ |a 5231 - Neuroscientific Foundations (POF4-523)
|0 G:(DE-HGF)POF4-5231
|c POF4-523
|f POF IV
|x 0
536 _ _ |a 5232 - Computational Principles (POF4-523)
|0 G:(DE-HGF)POF4-5232
|c POF4-523
|f POF IV
|x 1
536 _ _ |a 5234 - Emerging NC Architectures (POF4-523)
|0 G:(DE-HGF)POF4-5234
|c POF4-523
|f POF IV
|x 2
536 _ _ |a MSNN - Theory of multi-scale neuronal networks (HGF-SMHB-2014-2018)
|0 G:(DE-Juel1)HGF-SMHB-2014-2018
|c HGF-SMHB-2014-2018
|f MSNN
|x 3
536 _ _ |a HBP SGA2 - Human Brain Project Specific Grant Agreement 2 (785907)
|0 G:(EU-Grant)785907
|c 785907
|f H2020-SGA-FETFLAG-HBP-2017
|x 4
536 _ _ |a HBP SGA3 - Human Brain Project Specific Grant Agreement 3 (945539)
|0 G:(EU-Grant)945539
|c 945539
|x 5
536 _ _ |a neuroIC002 - Recurrence and stochasticity for neuro-inspired computation (EXS-SF-neuroIC002)
|0 G:(DE-82)EXS-SF-neuroIC002
|c EXS-SF-neuroIC002
|x 6
536 _ _ |a RenormalizedFlows - Transparent Deep Learning with Renormalized Flows (BMBF-01IS19077A)
|0 G:(DE-Juel-1)BMBF-01IS19077A
|c BMBF-01IS19077A
|x 7
536 _ _ |a Advanced Computing Architectures (aca_20190115)
|0 G:(DE-Juel1)aca_20190115
|c aca_20190115
|f Advanced Computing Architectures
|x 8
536 _ _ |a SDS005 - Towards an integrated data science of complex natural systems (PF-JARA-SDS005)
|0 G:(DE-Juel-1)PF-JARA-SDS005
|c PF-JARA-SDS005
|x 9
700 1 _ |a van Meegen, Alexander
|0 P:(DE-Juel1)173607
|b 1
|u fzj
700 1 _ |a Dahmen, David
|0 P:(DE-Juel1)156459
|b 2
|u fzj
700 1 _ |a Keup, Christian
|0 P:(DE-Juel1)171384
|b 3
|u fzj
700 1 _ |a Nestler, Sandra
|0 P:(DE-Juel1)174585
|b 4
|u fzj
909 C O |o oai:juser.fz-juelich.de:890928
|p openaire
|p VDB
|p ec_fundedresources
910 1 _ |a Forschungszentrum Jülich
|0 I:(DE-588b)5008462-8
|k FZJ
|b 0
|6 P:(DE-Juel1)144806
910 1 _ |a Forschungszentrum Jülich
|0 I:(DE-588b)5008462-8
|k FZJ
|b 1
|6 P:(DE-Juel1)173607
910 1 _ |a Forschungszentrum Jülich
|0 I:(DE-588b)5008462-8
|k FZJ
|b 2
|6 P:(DE-Juel1)156459
910 1 _ |a Forschungszentrum Jülich
|0 I:(DE-588b)5008462-8
|k FZJ
|b 3
|6 P:(DE-Juel1)171384
910 1 _ |a Forschungszentrum Jülich
|0 I:(DE-588b)5008462-8
|k FZJ
|b 4
|6 P:(DE-Juel1)174585
913 0 _ |a DE-HGF
|b Key Technologies
|l Decoding the Human Brain
|1 G:(DE-HGF)POF3-570
|0 G:(DE-HGF)POF3-571
|3 G:(DE-HGF)POF3
|2 G:(DE-HGF)POF3-500
|4 G:(DE-HGF)POF
|v Connectivity and Activity
|x 0
913 0 _ |a DE-HGF
|b Key Technologies
|l Decoding the Human Brain
|1 G:(DE-HGF)POF3-570
|0 G:(DE-HGF)POF3-574
|3 G:(DE-HGF)POF3
|2 G:(DE-HGF)POF3-500
|4 G:(DE-HGF)POF
|v Theory, modelling and simulation
|x 1
913 1 _ |a DE-HGF
|b Key Technologies
|l Natural, Artificial and Cognitive Information Processing
|1 G:(DE-HGF)POF4-520
|0 G:(DE-HGF)POF4-523
|3 G:(DE-HGF)POF4
|2 G:(DE-HGF)POF4-500
|4 G:(DE-HGF)POF
|v Neuromorphic Computing and Network Dynamics
|9 G:(DE-HGF)POF4-5231
|x 0
913 1 _ |a DE-HGF
|b Key Technologies
|l Natural, Artificial and Cognitive Information Processing
|1 G:(DE-HGF)POF4-520
|0 G:(DE-HGF)POF4-523
|3 G:(DE-HGF)POF4
|2 G:(DE-HGF)POF4-500
|4 G:(DE-HGF)POF
|v Neuromorphic Computing and Network Dynamics
|9 G:(DE-HGF)POF4-5232
|x 1
913 1 _ |a DE-HGF
|b Key Technologies
|l Natural, Artificial and Cognitive Information Processing
|1 G:(DE-HGF)POF4-520
|0 G:(DE-HGF)POF4-523
|3 G:(DE-HGF)POF4
|2 G:(DE-HGF)POF4-500
|4 G:(DE-HGF)POF
|v Neuromorphic Computing and Network Dynamics
|9 G:(DE-HGF)POF4-5234
|x 2
914 1 _ |y 2021
920 1 _ |0 I:(DE-Juel1)INM-6-20090406
|k INM-6
|l Computational and Systems Neuroscience
|x 0
920 1 _ |0 I:(DE-Juel1)INM-10-20170113
|k INM-10
|l JARA-Institut Brain structure-function relationships
|x 1
920 1 _ |0 I:(DE-Juel1)IAS-6-20130828
|k IAS-6
|l Theoretical Neuroscience
|x 2
980 _ _ |a talk
980 _ _ |a VDB
980 _ _ |a I:(DE-Juel1)INM-6-20090406
980 _ _ |a I:(DE-Juel1)INM-10-20170113
980 _ _ |a I:(DE-Juel1)IAS-6-20130828
980 _ _ |a UNRESTRICTED
981 _ _ |a I:(DE-Juel1)IAS-6-20130828