TY - CONF
AU - Fischer, Kirsten
AU - Rene, Alexandre
AU - Keup, Christian
AU - Layer, Moritz
AU - Dahmen, David
AU - Helias, Moritz
TI - Decomposing neural networks as mappings of correlation functions
M1 - FZJ-2023-00020
PY - 2022
AB - Understanding the functional principles of information processing in deep neural networks continues to be a challenge, in particular for networks with trained and thus non-random weights. To address this issue, we study the mapping between probability distributions implemented by a deep feed-forward network. We characterize this mapping as an iterated transformation of distributions, where the non-linearity in each layer transfers information between different orders of correlation functions. This allows us to identify essential statistics in the data, as well as different information representations that can be used by neural networks. Applied to an XOR task and to MNIST, we show that correlations up to second order predominantly capture the information processing in the internal layers, while the input layer also extracts higher-order correlations from the data. This analysis provides a quantitative and explainable perspective on classification.
T2 - Quantum Information Seminar
CY - 12 Dec 2022 - 12 Dec 2022, Aachen (Germany)
Y2 - 12 Dec 2022 - 12 Dec 2022
M2 - Aachen, Germany
LB - PUB:(DE-HGF)31
UR - https://juser.fz-juelich.de/record/916669
ER -