Conference Presentation (After Call) FZJ-2026-00967

Feature learning in deep neural networks close to criticality


2025

DPG Spring Meeting of the Condensed Matter Section, Regensburg, Germany, 16 Mar 2025 - 21 Mar 2025

Abstract: Neural networks excel due to their ability to learn features, yet a theoretical understanding of this ability remains a field of ongoing research. We develop a finite-width theory for deep non-linear networks, showing that their Bayesian prior is a superposition of Gaussian processes with kernel variances inversely proportional to the network width. In the proportional limit where both network width and training samples scale as N,P→∞ with P/N fixed, we derive forward-backward equations for the maximum a posteriori kernels, demonstrating how layer representations align with targets across network layers. A field-theoretic approach links finite-width corrections of the network kernels to fluctuations of the prior, bridging classical edge-of-chaos theory with feature learning and revealing key interactions between criticality, response, and network scales.
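The 1/N scaling of the kernel variances stated in the abstract can be illustrated numerically. The sketch below is not the authors' derivation; it is a minimal assumption-laden example (single hidden layer, tanh nonlinearity, standard Gaussian weights) showing that the empirical kernel K(x,x) of a random finite-width network fluctuates around its infinite-width Gaussian-process value with a variance that shrinks inversely with the width N:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 10
x = rng.standard_normal(d)  # one fixed input vector

def kernel_samples(N, trials=1000):
    """Empirical kernel entry K(x, x) for `trials` random one-hidden-layer
    networks of width N, weights W ~ N(0, 1/d)."""
    W = rng.standard_normal((trials, N, d)) / np.sqrt(d)
    h = np.tanh(W @ x)             # hidden activations, shape (trials, N)
    return (h ** 2).mean(axis=1)   # width-averaged kernel, one value per network

# Variance of the kernel across network realizations, for several widths
variances = {N: kernel_samples(N).var() for N in (64, 256, 512)}
for N, v in variances.items():
    print(f"N = {N:4d}   Var[K] = {v:.2e}   N * Var[K] = {N * v:.3f}")
```

In the output, Var[K] drops with N while the product N·Var[K] stays roughly constant, consistent with kernel fluctuations of order 1/N around the Gaussian-process limit.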


Contributing Institute(s):
  1. Computational and Systems Neuroscience (IAS-6)
Research Program(s):
  1. 5232 - Computational Principles (POF4-523)
  2. 5234 - Emerging NC Architectures (POF4-523)
  3. MSNN - Theory of multi-scale neuronal networks (HGF-SMHB-2014-2018)
  4. ACA - Advanced Computing Architectures (SO-092)
  5. GRK 2416 - GRK 2416: MultiSenses-MultiScales: New approaches to elucidating neuronal multisensory integration (368482240)


The record appears in these collections:
Document types > Presentations > Conference Presentations
Institute Collections > IAS > IAS-6
Workflow collections > Public records
Publications database

 Record created 2026-01-23, last modified 2026-01-26


External link: Fulltext