000890928 001__ 890928
000890928 005__ 20240313094954.0
000890928 037__ $$aFZJ-2021-01253
000890928 1001_ $$0P:(DE-Juel1)144806$$aHelias, Moritz$$b0$$eCorresponding author$$ufzj
000890928 1112_ $$aMILA Seminar$$conline$$gMILA$$wCanada
000890928 245__ $$aFluctuations, correlations, chaos: dynamics and computation in recurrent networks$$f2021-02-16 -
000890928 260__ $$c2021
000890928 3367_ $$033$$2EndNote$$aConference Paper
000890928 3367_ $$2DataCite$$aOther
000890928 3367_ $$2BibTeX$$aINPROCEEDINGS
000890928 3367_ $$2ORCID$$aLECTURE_SPEECH
000890928 3367_ $$0PUB:(DE-HGF)31$$2PUB:(DE-HGF)$$aTalk (non-conference)$$btalk$$mtalk$$s1628513627_7928$$xInvited
000890928 3367_ $$2DINI$$aOther
000890928 520__ $$aThe remarkable properties of information processing by biological and artificial neuronal networks arise from the interaction of large numbers of neurons. A central quest is thus to characterize their collective states. Moreover, the directed coupling between pairs of neurons and their continuous dissipation of energy drive the dynamics of neuronal networks out of thermodynamic equilibrium. Tools from non-equilibrium statistical mechanics and field theory are therefore useful to obtain a quantitative understanding. Here we present recent progress using such approaches [1]. We show how activity in large random networks can be described by a unified approach combining path integrals and large deviation theory, which allows the inference of parameters from data and the prediction of future activity [2]. This approach also allows one to quantify fluctuations around the mean-field theory. These fluctuations are important for understanding why correlations observed between pairs of neurons indicate that the dynamics of cortical networks are poised near a critical point [3]. Close to this transition, we find chaotic dynamics and prolonged sequential memory for past signals [4]. In the chaotic regime, networks offer representations of information whose dimensionality expands with time; we show how this mechanism aids classification performance [5]. Performance in such reservoir-computing settings, moreover, depends sensitively on the way information is fed into the network. Formally unrolling the recurrence with the help of Green's functions yields a controlled, practical method to optimize reservoir computing [6]. Together these works illustrate the fruitful interplay between theoretical physics, neuronal networks, and neural information processing. References: [1] Helias, Dahmen (2020). Statistical Field Theory for Neural Networks. Springer Lecture Notes in Physics. [2] van Meegen, Kuehn, Helias (2020). Large Deviation Approach to Random Recurrent Neuronal Networks: Rate Function, Parameter Inference, and Activity Prediction. arXiv:2009.08889. [3] Dahmen, Grün, Diesmann, Helias (2019). Second type of criticality in the brain uncovers rich multiple-neuron dynamics. PNAS 116 (26), 13051-13060. [4] Schuecker, Goedeke, Helias (2018). Optimal sequence memory in driven random networks. Phys. Rev. X 8, 041029. [5] Keup, Kuehn, Dahmen, Helias (2020). Transient chaotic dimensionality expansion by recurrent networks. arXiv:2002.11006. [6] Nestler, Keup, Dahmen, Gilson, Rauhut, Helias (2020). Unfolding recurrence by Green's functions for optimized reservoir computing. Advances in Neural Information Processing Systems 33 (NeurIPS 2020).
000890928 536__ $$0G:(DE-HGF)POF4-5231$$a5231 - Neuroscientific Foundations (POF4-523)$$cPOF4-523$$fPOF IV$$x0
000890928 536__ $$0G:(DE-HGF)POF4-5232$$a5232 - Computational Principles (POF4-523)$$cPOF4-523$$fPOF IV$$x1
000890928 536__ $$0G:(DE-HGF)POF4-5234$$a5234 - Emerging NC Architectures (POF4-523)$$cPOF4-523$$fPOF IV$$x2
000890928 536__ $$0G:(DE-Juel1)HGF-SMHB-2014-2018$$aMSNN - Theory of multi-scale neuronal networks (HGF-SMHB-2014-2018)$$cHGF-SMHB-2014-2018$$fMSNN$$x3
000890928 536__ $$0G:(EU-Grant)785907$$aHBP SGA2 - Human Brain Project Specific Grant Agreement 2 (785907)$$c785907$$fH2020-SGA-FETFLAG-HBP-2017$$x4
000890928 536__ $$0G:(EU-Grant)945539$$aHBP SGA3 - Human Brain Project Specific Grant Agreement 3 (945539)$$c945539$$x5
000890928 536__ $$0G:(DE-82)EXS-SF-neuroIC002$$aneuroIC002 - Recurrence and stochasticity for neuro-inspired computation (EXS-SF-neuroIC002)$$cEXS-SF-neuroIC002$$x6
000890928 536__ $$0G:(DE-Juel-1)BMBF-01IS19077A$$aRenormalizedFlows - Transparent Deep Learning with Renormalized Flows (BMBF-01IS19077A)$$cBMBF-01IS19077A$$x7
000890928 536__ $$0G:(DE-Juel1)aca_20190115$$aAdvanced Computing Architectures (aca_20190115)$$caca_20190115$$fAdvanced Computing Architectures$$x8
000890928 536__ $$0G:(DE-Juel-1)PF-JARA-SDS005$$aSDS005 - Towards an integrated data science of complex natural systems (PF-JARA-SDS005)$$cPF-JARA-SDS005$$x9
000890928 7001_ $$0P:(DE-Juel1)173607$$avan Meegen, Alexander$$b1$$ufzj
000890928 7001_ $$0P:(DE-Juel1)156459$$aDahmen, David$$b2$$ufzj
000890928 7001_ $$0P:(DE-Juel1)171384$$aKeup, Christian$$b3$$ufzj
000890928 7001_ $$0P:(DE-Juel1)174585$$aNestler, Sandra$$b4$$ufzj
000890928 909CO $$ooai:juser.fz-juelich.de:890928$$pec_fundedresources$$pVDB$$popenaire
000890928 9101_ $$0I:(DE-588b)5008462-8$$6P:(DE-Juel1)144806$$aForschungszentrum Jülich$$b0$$kFZJ
000890928 9101_ $$0I:(DE-588b)5008462-8$$6P:(DE-Juel1)173607$$aForschungszentrum Jülich$$b1$$kFZJ
000890928 9101_ $$0I:(DE-588b)5008462-8$$6P:(DE-Juel1)156459$$aForschungszentrum Jülich$$b2$$kFZJ
000890928 9101_ $$0I:(DE-588b)5008462-8$$6P:(DE-Juel1)171384$$aForschungszentrum Jülich$$b3$$kFZJ
000890928 9101_ $$0I:(DE-588b)5008462-8$$6P:(DE-Juel1)174585$$aForschungszentrum Jülich$$b4$$kFZJ
000890928 9130_ $$0G:(DE-HGF)POF3-571$$1G:(DE-HGF)POF3-570$$2G:(DE-HGF)POF3-500$$3G:(DE-HGF)POF3$$4G:(DE-HGF)POF$$aDE-HGF$$bKey Technologies$$lDecoding the Human Brain$$vConnectivity and Activity$$x0
000890928 9130_ $$0G:(DE-HGF)POF3-574$$1G:(DE-HGF)POF3-570$$2G:(DE-HGF)POF3-500$$3G:(DE-HGF)POF3$$4G:(DE-HGF)POF$$aDE-HGF$$bKey Technologies$$lDecoding the Human Brain$$vTheory, modelling and simulation$$x1
000890928 9131_ $$0G:(DE-HGF)POF4-523$$1G:(DE-HGF)POF4-520$$2G:(DE-HGF)POF4-500$$3G:(DE-HGF)POF4$$4G:(DE-HGF)POF$$9G:(DE-HGF)POF4-5231$$aDE-HGF$$bKey Technologies$$lNatural, Artificial and Cognitive Information Processing$$vNeuromorphic Computing and Network Dynamics$$x0
000890928 9131_ $$0G:(DE-HGF)POF4-523$$1G:(DE-HGF)POF4-520$$2G:(DE-HGF)POF4-500$$3G:(DE-HGF)POF4$$4G:(DE-HGF)POF$$9G:(DE-HGF)POF4-5232$$aDE-HGF$$bKey Technologies$$lNatural, Artificial and Cognitive Information Processing$$vNeuromorphic Computing and Network Dynamics$$x1
000890928 9131_ $$0G:(DE-HGF)POF4-523$$1G:(DE-HGF)POF4-520$$2G:(DE-HGF)POF4-500$$3G:(DE-HGF)POF4$$4G:(DE-HGF)POF$$9G:(DE-HGF)POF4-5234$$aDE-HGF$$bKey Technologies$$lNatural, Artificial and Cognitive Information Processing$$vNeuromorphic Computing and Network Dynamics$$x2
000890928 9141_ $$y2021
000890928 9201_ $$0I:(DE-Juel1)INM-6-20090406$$kINM-6$$lComputational and Systems Neuroscience$$x0
000890928 9201_ $$0I:(DE-Juel1)INM-10-20170113$$kINM-10$$lJARA-Institut Brain structure-function relationships$$x1
000890928 9201_ $$0I:(DE-Juel1)IAS-6-20130828$$kIAS-6$$lTheoretical Neuroscience$$x2
000890928 980__ $$atalk
000890928 980__ $$aVDB
000890928 980__ $$aI:(DE-Juel1)INM-6-20090406
000890928 980__ $$aI:(DE-Juel1)INM-10-20170113
000890928 980__ $$aI:(DE-Juel1)IAS-6-20130828
000890928 980__ $$aUNRESTRICTED
000890928 981__ $$aI:(DE-Juel1)IAS-6-20130828