TY  - JOUR
AU  - Renner, Alpha
AU  - Sheldon, Forrest
AU  - Zlotnik, Anatoly
AU  - Tao, Louis
AU  - Sornborger, Andrew
TI  - The backpropagation algorithm implemented on spiking neuromorphic hardware
JO  - Nature Communications
VL  - 15
IS  - 1
SN  - 2041-1723
CY  - [London]
PB  - Nature Publishing Group UK
M1  - FZJ-2024-06157
SP  - 9691
PY  - 2024
AB  - The capabilities of natural neural systems have inspired both new generations of machine learning algorithms and neuromorphic, very large-scale integrated circuits capable of fast, low-power information processing. However, it has been argued that most modern machine learning algorithms are not neurophysiologically plausible. In particular, the workhorse of modern deep learning, the backpropagation algorithm, has proven difficult to translate to neuromorphic hardware. This study presents a neuromorphic, spiking backpropagation algorithm based on synfire-gated dynamical information coordination and processing, implemented on Intel’s Loihi neuromorphic research processor. We demonstrate a proof-of-principle three-layer circuit that learns to classify digits and clothing items from the MNIST and Fashion MNIST datasets. To our knowledge, this is the first work to show a spiking neural network (SNN) implementation of the exact backpropagation algorithm that is fully on-chip without a computer in the loop. It is competitive in accuracy with off-chip trained SNNs and achieves an energy-delay product suitable for edge computing. This implementation shows a path for using in-memory, massively parallel neuromorphic processors for low-power, low-latency implementation of modern deep learning applications.
LB  - PUB:(DE-HGF)16
C6  - 39516210
UR  - <Go to ISI>://WOS:001352395400002
DO  - 10.1038/s41467-024-53827-9
UR  - https://juser.fz-juelich.de/record/1032331
ER  -