Transient chaotic dimensionality expansion by recurrent networks
Talk (non-conference) (Invited) | FZJ-2021-00212
2020
Abstract: Transient chaotic dimensionality expansion by recurrent networks
Moritz Helias, INM-6, Juelich Research Centre; Faculty of Physics, RWTH Aachen University

Cortical neurons communicate with spikes, which are discrete events in time and value. They often show optimal computational performance close to a transition to rate chaos, i.e. chaos that is driven by local and smooth averages of the discrete activity. We here analyze microscopic and rate chaos in discretely coupled networks of binary neurons by a model-independent field theory. We find a strongly network-size-dependent transition to microscopic chaos and a chaotic submanifold that spans only a finite fraction of the entire activity space. Rate chaos is shown to be impossible in these networks.

Applying stimuli to a strongly microscopically chaotic binary network that acts as a reservoir, one observes a transient expansion of the dimensionality of the representing neuronal space. Crucially, the number of dimensions corrupted by noise lags behind the informative dimensions. This translates to a transient peak in the network's classification performance even deep in the chaotic regime, extending the view that computational performance is always optimal near the edge of chaos. Classification performance peaks rapidly, within one activation per neuron, demonstrating fast event-based computation. The generality of this mechanism is underlined by simulations of spiking networks of leaky integrate-and-fire neurons.

1. Keup, Kuehn, Dahmen, Helias (2020) Transient chaotic dimensionality expansion by recurrent networks. arXiv:2002.11006 [cond-mat.dis-nn]
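To make the abstract's central quantity concrete, the following is a minimal, illustrative sketch (not the authors' implementation) of a randomly coupled binary-neuron reservoir driven by two nearby stimulus patterns, with the dimensionality of the evoked response cloud tracked over time via the participation ratio, a common proxy for effective dimensionality. The network size, coupling gain, noise level, and update rule are assumptions chosen only for illustration.

```python
# Hypothetical sketch: binary reservoir + participation-ratio dimensionality.
import numpy as np

rng = np.random.default_rng(0)
N, T, trials = 200, 50, 100                           # neurons, time steps, noisy repetitions
J = rng.normal(0.0, 1.5 / np.sqrt(N), size=(N, N))    # random recurrent couplings (assumed gain)

def run(x0):
    """Parallel update of binary units; returns a (T, N) trajectory."""
    x, traj = x0.copy(), []
    for _ in range(T):
        x = (J @ (2 * x - 1) + 0.1 * rng.normal(size=N) > 0).astype(float)
        traj.append(x.copy())
    return np.array(traj)

# Two stimuli = two initial activity patterns differing in a few neurons.
base = (rng.random(N) > 0.5).astype(float)
stimA, stimB = base.copy(), base.copy()
stimB[:5] = 1 - stimB[:5]

# Collect noisy response trajectories for each stimulus.
respA = np.array([run(stimA) for _ in range(trials)])  # shape (trials, T, N)
respB = np.array([run(stimB) for _ in range(trials)])

def participation_ratio(X):
    """PR = (sum of covariance eigenvalues)^2 / sum of squared eigenvalues."""
    lam = np.linalg.eigvalsh(np.cov(X.T))
    return lam.sum() ** 2 / (lam ** 2).sum()

# Effective dimensionality of the joint response cloud at selected times.
for t in range(0, T, 10):
    cloud = np.concatenate([respA[:, t, :], respB[:, t, :]], axis=0)
    print(f"t={t:2d}  participation ratio ~ {participation_ratio(cloud):.1f}")
```

In such a toy setup one would look for the transient rise of the participation ratio after stimulus onset; the paper's actual analysis of informative versus noise-corrupted dimensions and of classification performance is not reproduced here.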