TY - CONF
AU - Nestler, Sandra
AU - Keup, Christian
AU - Dahmen, David
AU - Gilson, Matthieu
AU - Rauhut, Holger
AU - Helias, Moritz
TI - Unfolding recurrence by Green’s functions for optimized reservoir computing
M1 - FZJ-2021-00223
SP - 1
PY - 2020
AB - Cortical networks are strongly recurrent, and neurons have intrinsic temporal dynamics. This sets them apart from deep feed-forward networks. Despite the tremendous progress in the application of feed-forward networks and their theoretical understanding, it remains unclear how the interplay of recurrence and non-linearities in recurrent cortical networks contributes to their function. The purpose of this work is to present a solvable recurrent network model that links to feed-forward networks. By perturbative methods we transform the time-continuous, recurrent dynamics into an effective feed-forward structure of linear and non-linear temporal kernels. The resulting analytical expressions allow us to build optimal time-series classifiers from random reservoir networks. Firstly, this allows us to optimize not only the readout vectors, but also the input projection, demonstrating a strong potential performance gain. Secondly, the analysis exposes how the second-order stimulus statistics are a crucial element that interacts with the non-linearity of the dynamics and boosts performance.
T2 - 34th Conference on Neural Information Processing Systems
CY - 6 Dec 2020 - 12 Dec 2020, online
Y2 - 6 Dec 2020 - 12 Dec 2020
M2 - online
LB - PUB:(DE-HGF)8
UR - https://juser.fz-juelich.de/record/889332
ER -