001     902308
005     20240313103124.0
024 7 _ |a 2128/29260
|2 Handle
037 _ _ |a FZJ-2021-04170
041 _ _ |a English
100 1 _ |a Bouhadjar, Younes
|0 P:(DE-Juel1)176778
|b 0
|e Corresponding author
245 _ _ |a Sequence learning, prediction, and replay in networks of spiking neurons
260 _ _ |c 2021
|b arXiv
336 7 _ |a Preprint
|b preprint
|m preprint
|0 PUB:(DE-HGF)25
|s 1638366433_8351
|2 PUB:(DE-HGF)
336 7 _ |a WORKING_PAPER
|2 ORCID
336 7 _ |a Electronic Article
|0 28
|2 EndNote
336 7 _ |a preprint
|2 DRIVER
336 7 _ |a ARTICLE
|2 BibTeX
336 7 _ |a Output Types/Working Paper
|2 DataCite
520 _ _ |a Sequence learning, prediction and replay have been proposed to constitute the universal computations performed by the neocortex. The Hierarchical Temporal Memory (HTM) algorithm realizes these forms of computation. It learns sequences in an unsupervised and continuous manner using local learning rules, permits a context-specific prediction of future sequence elements, and generates mismatch signals in case the predictions are not met. While the HTM algorithm accounts for a number of biological features such as topographic receptive fields, nonlinear dendritic processing, and sparse connectivity, it is based on abstract discrete-time neuron and synapse dynamics, as well as on plasticity mechanisms that can only partly be related to known biological mechanisms. Here, we devise a continuous-time implementation of the temporal-memory (TM) component of the HTM algorithm, which is based on a recurrent network of spiking neurons with biophysically interpretable variables and parameters. The model learns high-order sequences by means of a structural Hebbian synaptic plasticity mechanism supplemented with a rate-based homeostatic control. In combination with nonlinear dendritic input integration and local inhibitory feedback, this type of plasticity leads to the dynamic self-organization of narrow sequence-specific feedforward subnetworks. These subnetworks provide the substrate for a faithful propagation of sparse, synchronous activity, and, thereby, for a robust, context-specific prediction of future sequence elements as well as for the autonomous replay of previously learned sequences. By strengthening the link to biology, our implementation facilitates the evaluation of the TM hypothesis based on experimentally accessible quantities. The continuous-time implementation of the TM algorithm permits, in particular, an investigation of the role of sequence timing for sequence learning, prediction and replay. We demonstrate this aspect by studying the effect of the sequence speed on the sequence learning performance and on the speed of autonomous sequence replay.
536 _ _ |a 574 - Theory, modelling and simulation (POF3-574)
|0 G:(DE-HGF)POF3-574
|c POF3-574
|f POF III
|x 0
536 _ _ |a 5232 - Computational Principles (POF4-523)
|0 G:(DE-HGF)POF4-5232
|c POF4-523
|f POF IV
|x 1
536 _ _ |a Advanced Computing Architectures (aca_20190115)
|0 G:(DE-Juel1)aca_20190115
|c aca_20190115
|f Advanced Computing Architectures
|x 2
536 _ _ |a HBP SGA3 - Human Brain Project Specific Grant Agreement 3 (945539)
|0 G:(EU-Grant)945539
|c 945539
|f H2020-SGA-FETFLAG-HBP-2019
|x 3
536 _ _ |a HBP SGA2 - Human Brain Project Specific Grant Agreement 2 (785907)
|0 G:(EU-Grant)785907
|c 785907
|f H2020-SGA-FETFLAG-HBP-2017
|x 4
700 1 _ |a Wouters, Dirk J.
|0 P:(DE-HGF)0
|b 1
700 1 _ |a Diesmann, Markus
|0 P:(DE-Juel1)144174
|b 2
700 1 _ |a Tetzlaff, Tom
|0 P:(DE-Juel1)145211
|b 3
856 4 _ |u https://arxiv.org/pdf/2111.03456.pdf
856 4 _ |u https://juser.fz-juelich.de/record/902308/files/2111.03456.pdf
|y OpenAccess
909 C O |o oai:juser.fz-juelich.de:902308
|p openaire
|p open_access
|p driver
|p VDB
|p ec_fundedresources
|p dnbdelivery
910 1 _ |a Forschungszentrum Jülich
|0 I:(DE-588b)5008462-8
|k FZJ
|b 0
|6 P:(DE-Juel1)176778
910 1 _ |a Forschungszentrum Jülich
|0 I:(DE-588b)5008462-8
|k FZJ
|b 2
|6 P:(DE-Juel1)144174
910 1 _ |a Forschungszentrum Jülich
|0 I:(DE-588b)5008462-8
|k FZJ
|b 3
|6 P:(DE-Juel1)145211
913 0 _ |a DE-HGF
|b Key Technologies
|l Decoding the Human Brain
|1 G:(DE-HGF)POF3-570
|0 G:(DE-HGF)POF3-574
|3 G:(DE-HGF)POF3
|2 G:(DE-HGF)POF3-500
|4 G:(DE-HGF)POF
|v Theory, modelling and simulation
|x 0
913 1 _ |a DE-HGF
|b Key Technologies
|l Natural, Artificial and Cognitive Information Processing
|1 G:(DE-HGF)POF4-520
|0 G:(DE-HGF)POF4-523
|3 G:(DE-HGF)POF4
|2 G:(DE-HGF)POF4-500
|4 G:(DE-HGF)POF
|v Neuromorphic Computing and Network Dynamics
|9 G:(DE-HGF)POF4-5232
|x 0
914 1 _ |y 2021
915 _ _ |a OpenAccess
|0 StatID:(DE-HGF)0510
|2 StatID
920 _ _ |l yes
920 1 _ |0 I:(DE-Juel1)INM-6-20090406
|k INM-6
|l Computational and Systems Neuroscience
|x 0
920 1 _ |0 I:(DE-Juel1)IAS-6-20130828
|k IAS-6
|l Theoretical Neuroscience
|x 1
920 1 _ |0 I:(DE-Juel1)INM-10-20170113
|k INM-10
|l JARA-Institut Brain structure-function relationships
|x 2
920 1 _ |0 I:(DE-Juel1)PGI-7-20110106
|k PGI-7
|l Electronic Materials
|x 3
920 1 _ |0 I:(DE-Juel1)PGI-10-20170113
|k PGI-10
|l JARA-Institut Green IT
|x 4
980 1 _ |a FullTexts
980 _ _ |a preprint
980 _ _ |a VDB
980 _ _ |a UNRESTRICTED
980 _ _ |a I:(DE-Juel1)INM-6-20090406
980 _ _ |a I:(DE-Juel1)IAS-6-20130828
980 _ _ |a I:(DE-Juel1)INM-10-20170113
980 _ _ |a I:(DE-Juel1)PGI-7-20110106
980 _ _ |a I:(DE-Juel1)PGI-10-20170113
981 _ _ |a I:(DE-Juel1)IAS-6-20130828

