000910329 001__ 910329
000910329 005__ 20240313094953.0
000910329 037__ $$aFZJ-2022-03755
000910329 041__ $$aEnglish
000910329 1001_ $$0P:(DE-Juel1)180150$$aFischer, Kirsten$$b0$$eCorresponding author
000910329 1112_ $$aINM IBI Retreat 2022$$cJuelich$$d2022-10-18 - 2022-10-19$$wGermany
000910329 245__ $$aStatistical decomposition of feed-forward neural networks: Transfer of information between correlation functions
000910329 260__ $$c2022
000910329 3367_ $$033$$2EndNote$$aConference Paper
000910329 3367_ $$2BibTeX$$aINPROCEEDINGS
000910329 3367_ $$2DRIVER$$aconferenceObject
000910329 3367_ $$2ORCID$$aCONFERENCE_POSTER
000910329 3367_ $$2DataCite$$aOutput Types/Conference Poster
000910329 3367_ $$0PUB:(DE-HGF)24$$2PUB:(DE-HGF)$$aPoster$$bposter$$mposter$$s1669804429_7067$$xAfter Call
000910329 520__ $$aUncovering principles of information processing in neural systems continues to be an active field of research. For the visual system it is well known that it processes signals in a hierarchical manner [1,2]. Feed-forward networks are commonly used models in machine learning that perform hierarchical computations. Here we study deep feed-forward networks with the aim of deducing general functional aspects of such systems. These networks implement mappings between probability distributions, where the probability distributions are iteratively transformed from layer to layer. We develop a formalism for expressing signal transformations in each layer as information transfers between different orders of correlation functions. We show that the processing within internal network layers is captured by correlations up to second order. In addition, we demonstrate how the input layer also extracts higher-order correlations from the data. Thus, by presenting different correlation orders in the input, we identify key statistics in the data. As a next step, we consider recurrent time-continuous networks, reminiscent of biological neuronal networks (Neural ODEs, [3]). We derive a Fokker-Planck equation describing the evolution of the probability distribution. This formulation allows us to study time-dependent information flow between different interaction terms. In summary, this work provides insights into functional principles of information processing in neural networks. References: [1] Hubel, D. H., & Wiesel, T. N. (1962). Receptive fields, binocular interaction and functional architecture in the cat's visual cortex. The Journal of Physiology, 160(1), 106. [2] Zhuang, C., Yan, S., Nayebi, A., Schrimpf, M., Frank, M. C., DiCarlo, J. J., & Yamins, D. L. (2021). Unsupervised neural network models of the ventral visual stream. Proceedings of the National Academy of Sciences, 118(3), e2014196118. [3] Chen, R. T., Rubanova, Y., Bettencourt, J., & Duvenaud, D. K. (2018). Neural ordinary differential equations. Advances in Neural Information Processing Systems, 31.
000910329 536__ $$0G:(DE-HGF)POF4-5232$$a5232 - Computational Principles (POF4-523)$$cPOF4-523$$fPOF IV$$x0
000910329 536__ $$0G:(DE-HGF)POF4-5234$$a5234 - Emerging NC Architectures (POF4-523)$$cPOF4-523$$fPOF IV$$x1
000910329 536__ $$0G:(DE-Juel-1)BMBF-01IS19077A$$aRenormalizedFlows - Transparent Deep Learning with Renormalized Flows (BMBF-01IS19077A)$$cBMBF-01IS19077A$$x2
000910329 536__ $$0G:(DE-Juel1)HGF-SMHB-2014-2018$$aMSNN - Theory of multi-scale neuronal networks (HGF-SMHB-2014-2018)$$cHGF-SMHB-2014-2018$$fMSNN$$x3
000910329 536__ $$0G:(DE-HGF)SO-092$$aACA - Advanced Computing Architectures (SO-092)$$cSO-092$$x4
000910329 536__ $$0G:(DE-82)EXS-SF-neuroIC002$$aneuroIC002 - Recurrence and stochasticity for neuro-inspired computation (EXS-SF-neuroIC002)$$cEXS-SF-neuroIC002$$x5
000910329 7001_ $$0P:(DE-Juel1)178936$$aRené, Alexandre$$b1
000910329 7001_ $$0P:(DE-Juel1)171384$$aKeup, Christian$$b2
000910329 7001_ $$0P:(DE-Juel1)174497$$aLayer, Moritz$$b3
000910329 7001_ $$0P:(DE-Juel1)156459$$aDahmen, David$$b4
000910329 7001_ $$0P:(DE-Juel1)144806$$aHelias, Moritz$$b5
000910329 909CO $$ooai:juser.fz-juelich.de:910329$$pVDB
000910329 9101_ $$0I:(DE-588b)5008462-8$$6P:(DE-Juel1)180150$$aForschungszentrum Jülich$$b0$$kFZJ
000910329 9101_ $$0I:(DE-588b)5008462-8$$6P:(DE-Juel1)178936$$aForschungszentrum Jülich$$b1$$kFZJ
000910329 9101_ $$0I:(DE-588b)5008462-8$$6P:(DE-Juel1)171384$$aForschungszentrum Jülich$$b2$$kFZJ
000910329 9101_ $$0I:(DE-588b)5008462-8$$6P:(DE-Juel1)174497$$aForschungszentrum Jülich$$b3$$kFZJ
000910329 9101_ $$0I:(DE-588b)5008462-8$$6P:(DE-Juel1)156459$$aForschungszentrum Jülich$$b4$$kFZJ
000910329 9101_ $$0I:(DE-588b)5008462-8$$6P:(DE-Juel1)144806$$aForschungszentrum Jülich$$b5$$kFZJ
000910329 9131_ $$0G:(DE-HGF)POF4-523$$1G:(DE-HGF)POF4-520$$2G:(DE-HGF)POF4-500$$3G:(DE-HGF)POF4$$4G:(DE-HGF)POF$$9G:(DE-HGF)POF4-5232$$aDE-HGF$$bKey Technologies$$lNatural, Artificial and Cognitive Information Processing$$vNeuromorphic Computing and Network Dynamics$$x0
000910329 9131_ $$0G:(DE-HGF)POF4-523$$1G:(DE-HGF)POF4-520$$2G:(DE-HGF)POF4-500$$3G:(DE-HGF)POF4$$4G:(DE-HGF)POF$$9G:(DE-HGF)POF4-5234$$aDE-HGF$$bKey Technologies$$lNatural, Artificial and Cognitive Information Processing$$vNeuromorphic Computing and Network Dynamics$$x1
000910329 9141_ $$y2022
000910329 920__ $$lyes
000910329 9201_ $$0I:(DE-Juel1)INM-6-20090406$$kINM-6$$lComputational and Systems Neuroscience$$x0
000910329 9201_ $$0I:(DE-Juel1)IAS-6-20130828$$kIAS-6$$lTheoretical Neuroscience$$x1
000910329 9201_ $$0I:(DE-Juel1)INM-10-20170113$$kINM-10$$lJARA-Institut Brain structure-function relationships$$x2
000910329 980__ $$aposter
000910329 980__ $$aVDB
000910329 980__ $$aI:(DE-Juel1)INM-6-20090406
000910329 980__ $$aI:(DE-Juel1)IAS-6-20130828
000910329 980__ $$aI:(DE-Juel1)INM-10-20170113
000910329 980__ $$aUNRESTRICTED
000910329 981__ $$aI:(DE-Juel1)IAS-6-20130828