TY  - CONF
AU  - Fischer, Kirsten
AU  - Lindner, Javed
AU  - Dahmen, David
AU  - Ringel, Zohar
AU  - Krämer, Michael
AU  - Helias, Moritz
TI  - Critical feature learning in deep neural networks
M1  - FZJ-2024-05061
PY  - 2024
AB  - A key property of neural networks driving their success is their ability to learn features from data. Understanding feature learning from a theoretical viewpoint is an emerging field with many open questions. In this work we capture finite-width effects with a systematic theory of network kernels in deep non-linear neural networks. We show that the Bayesian prior of the network can be written in closed form as a superposition of Gaussian processes, whose kernels are distributed with a variance that depends inversely on the network width N. A large-deviation approach, which is exact in the proportional limit where the number of data points scales with the width, P = αN → ∞, yields a pair of forward-backward equations for the maximum a posteriori kernels in all layers at once. We study their solutions perturbatively to demonstrate how the backward propagation across layers aligns kernels with the target. An alternative field-theoretic formulation shows that kernel adaptation of the Bayesian posterior at finite width results from fluctuations in the prior: larger fluctuations correspond to a more flexible network prior and thus enable stronger adaptation to data. We thus find a bridge between the classical edge-of-chaos NNGP theory and feature learning, exposing an intricate interplay between criticality, response functions, and feature scale.
T2  - The Forty-first International Conference on Machine Learning
CY  - Vienna, Austria
Y2  - 21 Jul 2024 - 27 Jul 2024
M2  - Vienna, Austria
LB  - PUB:(DE-HGF)24
UR  - https://juser.fz-juelich.de/record/1029334
ER  -