Poster (After Call) FZJ-2022-03558

Statistical decomposition of neural networks: Information transfer between correlation functions


2022

Bernstein Conference, Berlin, Germany, 14 Sep 2022 - 16 Sep 2022

Abstract: Uncovering principles of information processing in neural systems continues to be an active field of research. For the visual system it is well known that signals are processed in a hierarchical manner [1,2]. Commonly used models in machine learning that perform hierarchical computations are feed-forward networks. Here, we study deep feed-forward networks with the aim of deducing general functional aspects of such systems. These networks implement a mapping between probability distributions, where the probability distribution is iteratively transformed from layer to layer. We develop a formalism for expressing the signal transformation in each layer as a transfer of information between different orders of correlation functions (see Fig. (a)). We show that the processing within internal network layers is captured by correlations up to second order. In addition, we demonstrate how the input layer also extracts higher-order correlations from the data. Thus, by presenting different correlation orders in the input, we identify key statistics in the data (see Fig. (b)-(d)). As a next step, we consider recurrent time-continuous networks, reminiscent of biological neuronal networks (NeuralODEs, [3]). We derive a Fokker-Planck equation describing the evolution of the probability distribution. This formulation allows us to study time-dependent information flow between different interaction terms. In summary, this work provides insights into functional principles of information processing in neural networks.

References:
[1] Hubel, D. H., & Wiesel, T. N. (1962). Receptive fields, binocular interaction and functional architecture in the cat's visual cortex. The Journal of Physiology, 160(1), 106.
[2] Zhuang, C., Yan, S., Nayebi, A., Schrimpf, M., Frank, M. C., DiCarlo, J. J., & Yamins, D. L. (2021). Unsupervised neural network models of the ventral visual stream. Proceedings of the National Academy of Sciences, 118(3), e2014196118.
[3] Chen, R. T., Rubanova, Y., Bettencourt, J., & Duvenaud, D. K. (2018). Neural ordinary differential equations. Advances in Neural Information Processing Systems, 31.
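The abstract describes a feed-forward network as an iterative transformation of a probability distribution, whose layer-wise effect can be read off from correlation functions of the activations. A minimal sketch of that measurement, assuming a toy tanh network with randomly drawn weights (widths, nonlinearity, and initialization are illustrative choices, not taken from the poster):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy feed-forward network: track how first- and second-order
# correlations (mean and covariance) of the activations change
# from layer to layer as the input distribution is transformed.
widths = [4, 8, 8, 2]
weights = [rng.normal(0.0, 1.0 / np.sqrt(n_in), size=(n_in, n_out))
           for n_in, n_out in zip(widths[:-1], widths[1:])]

# Sample inputs with non-trivial second-order statistics
# (unit variance, pairwise correlation 0.5).
n_samples = 10_000
input_cov = 0.5 * np.eye(widths[0]) + 0.5
x = rng.multivariate_normal(np.zeros(widths[0]), input_cov, size=n_samples)

h = x
for layer, w in enumerate(weights, start=1):
    h = np.tanh(h @ w)
    mean = h.mean(axis=0)          # first-order correlation function
    cov = np.cov(h, rowvar=False)  # second-order correlation function
    print(f"layer {layer}: |mean| = {np.linalg.norm(mean):.3f}, "
          f"tr(cov) = {np.trace(cov):.3f}")
```

Higher-order statistics of the data (e.g. third- and fourth-order cumulants of `x`) could be estimated the same way to probe what the input layer extracts beyond second order.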
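For the recurrent time-continuous case, the abstract derives a Fokker-Planck equation for the evolving distribution. A hedged one-dimensional sketch, using a linear (Ornstein-Uhlenbeck) unit as a stand-in for a rate neuron rather than the poster's actual model: the Fokker-Planck equation for dx = -theta*x dt + sigma dW has the stationary variance sigma^2 / (2*theta), which a direct Euler-Maruyama simulation should reproduce.

```python
import numpy as np

rng = np.random.default_rng(1)

# Linear stochastic unit dx = -theta*x dt + sigma dW.
# Its Fokker-Planck equation predicts a stationary Gaussian
# with variance sigma**2 / (2 * theta).
theta, sigma = 1.0, 0.5
dt, n_steps, n_paths = 1e-3, 20_000, 2_000

x = np.zeros(n_paths)
for _ in range(n_steps):  # integrate well past the relaxation time 1/theta
    x += -theta * x * dt + sigma * np.sqrt(dt) * rng.normal(size=n_paths)

empirical = x.var()
predicted = sigma**2 / (2.0 * theta)
print(f"empirical variance {empirical:.4f} vs Fokker-Planck {predicted:.4f}")
```

In the nonlinear, multi-dimensional setting of the poster, the drift would be the network's vector field and the same construction yields a time-dependent distribution whose correlation orders can be tracked.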


Contributing Institute(s):
  1. Computational and Systems Neuroscience (INM-6)
  2. Theoretical Neuroscience (IAS-6)
  3. Jara-Institut Brain structure-function relationships (INM-10)
Research Program(s):
  1. 5232 - Computational Principles (POF4-523)
  2. 5234 - Emerging NC Architectures (POF4-523)
  3. RenormalizedFlows - Transparent Deep Learning with Renormalized Flows (BMBF-01IS19077A)
  4. MSNN - Theory of multi-scale neuronal networks (HGF-SMHB-2014-2018)
  5. ACA - Advanced Computing Architectures (SO-092)
  6. neuroIC002 - Recurrence and stochasticity for neuro-inspired computation (EXS-SF-neuroIC002)
  7. GRK 2416: MultiSenses-MultiScales: Novel approaches to decipher neural processing in multisensory integration (368482240)

Appears in the scientific report 2022

The record appears in these collections:
Institute Collections > INM > INM-10
Document types > Presentations > Poster
Institute Collections > IAS > IAS-6
Institute Collections > INM > INM-6
Workflow collections > Public records
Publications database

 Record created 2022-09-29, last modified 2024-03-13


