Neural Networks as Sources of Uncorrelated Noise for Functional Neural Systems
Abstract | FZJ-2014-04643
2014
Abstract: Neural-network models of brain function often rely on the presence of noise [1,2]. Yet the biological origin of this noise remains unclear. In computer simulations and in neuromorphic hardware [3,4], the number of noise sources (random-number generators) is limited. In consequence, neurons in large functional network models have to share noise sources and are therefore correlated. It is largely unclear how shared-noise correlations affect the performance of functional network models. Further, so far there is no solution to the problem of how a limited number of noise sources can supply a large number of functional units with uncorrelated noise. Here, we first demonstrate that the performance of two functional network models, attractor networks [5] and neural Boltzmann machines [2], is substantially impaired by shared-noise correlations resulting from a limited number of noise sources. Secondly, we show that this problem can be overcome by replacing the finite pool of independent noise sources by a (finite) recurrent neural network. As shown recently, inhibitory feedback, abundant in biological neural networks, serves as a powerful decorrelation mechanism [6,7]: shared-noise correlations are actively suppressed by the network dynamics. By exploiting this effect, the network performance is significantly improved. Finally, we demonstrate the decorrelating effect of inhibitory feedback in a heterogeneous network implemented in an analog neuromorphic substrate [8]. In summary, we show that recurrent neural networks can serve as natural finite-size noise sources for functional neural networks, both in biological and in synthetic neuromorphic substrates.
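As a rough illustration of the shared-noise problem described in the abstract (not taken from the paper), the following Python sketch assigns many functional units to a small pool of noise sources and measures the resulting pairwise correlations; all names and parameters are illustrative assumptions.

```python
# Hypothetical sketch: a limited pool of noise sources forces units to share
# sources, which induces shared-noise correlations between them.
import numpy as np

rng = np.random.default_rng(0)

n_units = 200      # functional units that each need a private noise stream
n_sources = 20     # available random-number generators (noise sources)
n_steps = 10_000   # simulated time steps

# Each source produces an independent white-noise trace.
sources = rng.standard_normal((n_sources, n_steps))

# Units draw their noise from a randomly assigned source, so sources are shared.
assignment = rng.integers(0, n_sources, size=n_units)
unit_noise = sources[assignment]

# Average pairwise correlation across distinct units.
corr = np.corrcoef(unit_noise)
off_diag = corr[~np.eye(n_units, dtype=bool)]
print(f"mean shared-noise correlation: {off_diag.mean():.3f}")
# Pairs that share a source are perfectly correlated, other pairs are not,
# so the mean pairwise correlation is roughly 1 / n_sources here.
```

With n_sources much smaller than n_units, the mean pairwise correlation stays well above zero, which is the kind of shared-noise correlation the abstract reports as impairing attractor networks and neural Boltzmann machines.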