000902308 001__ 902308
000902308 005__ 20240313103124.0
000902308 0247_ $$2Handle$$a2128/29260
000902308 037__ $$aFZJ-2021-04170
000902308 041__ $$aEnglish
000902308 1001_ $$0P:(DE-Juel1)176778$$aBouhadjar, Younes$$b0$$eCorresponding author
000902308 245__ $$aSequence learning, prediction, and replay in networks of spiking neurons
000902308 260__ $$barXiv$$c2021
000902308 3367_ $$0PUB:(DE-HGF)25$$2PUB:(DE-HGF)$$aPreprint$$bpreprint$$mpreprint$$s1638366433_8351
000902308 3367_ $$2ORCID$$aWORKING_PAPER
000902308 3367_ $$028$$2EndNote$$aElectronic Article
000902308 3367_ $$2DRIVER$$apreprint
000902308 3367_ $$2BibTeX$$aARTICLE
000902308 3367_ $$2DataCite$$aOutput Types/Working Paper
000902308 520__ $$aSequence learning, prediction, and replay have been proposed to constitute the universal computations performed by the neocortex. The Hierarchical Temporal Memory (HTM) algorithm realizes these forms of computation. It learns sequences in an unsupervised and continuous manner using local learning rules, permits a context-specific prediction of future sequence elements, and generates mismatch signals in case the predictions are not met. While the HTM algorithm accounts for a number of biological features such as topographic receptive fields, nonlinear dendritic processing, and sparse connectivity, it is based on abstract discrete-time neuron and synapse dynamics, as well as on plasticity mechanisms that can only partly be related to known biological mechanisms. Here, we devise a continuous-time implementation of the temporal-memory (TM) component of the HTM algorithm, which is based on a recurrent network of spiking neurons with biophysically interpretable variables and parameters. The model learns high-order sequences by means of a structural Hebbian synaptic plasticity mechanism supplemented with a rate-based homeostatic control. In combination with nonlinear dendritic input integration and local inhibitory feedback, this type of plasticity leads to the dynamic self-organization of narrow sequence-specific feedforward subnetworks. These subnetworks provide the substrate for a faithful propagation of sparse, synchronous activity, and, thereby, for a robust, context-specific prediction of future sequence elements as well as for the autonomous replay of previously learned sequences. By strengthening the link to biology, our implementation facilitates the evaluation of the TM hypothesis based on experimentally accessible quantities. The continuous-time implementation of the TM algorithm permits, in particular, an investigation of the role of sequence timing for sequence learning, prediction, and replay. We demonstrate this aspect by studying the effect of the sequence speed on the sequence learning performance and on the speed of autonomous sequence replay.
000902308 536__ $$0G:(DE-HGF)POF3-574$$a574 - Theory, modelling and simulation (POF3-574)$$cPOF3-574$$fPOF III$$x0
000902308 536__ $$0G:(DE-HGF)POF4-5232$$a5232 - Computational Principles (POF4-523)$$cPOF4-523$$fPOF IV$$x1
000902308 536__ $$0G:(DE-Juel1)aca_20190115$$aAdvanced Computing Architectures (aca_20190115)$$caca_20190115$$fAdvanced Computing Architectures$$x2
000902308 536__ $$0G:(EU-Grant)945539$$aHBP SGA3 - Human Brain Project Specific Grant Agreement 3 (945539)$$c945539$$fH2020-SGA-FETFLAG-HBP-2019$$x3
000902308 536__ $$0G:(EU-Grant)785907$$aHBP SGA2 - Human Brain Project Specific Grant Agreement 2 (785907)$$c785907$$fH2020-SGA-FETFLAG-HBP-2017$$x4
000902308 7001_ $$0P:(DE-HGF)0$$aWouters, Dirk J.$$b1
000902308 7001_ $$0P:(DE-Juel1)144174$$aDiesmann, Markus$$b2
000902308 7001_ $$0P:(DE-Juel1)145211$$aTetzlaff, Tom$$b3
000902308 8564_ $$uhttps://arxiv.org/pdf/2111.03456.pdf
000902308 8564_ $$uhttps://juser.fz-juelich.de/record/902308/files/2111.03456.pdf$$yOpenAccess
000902308 909CO $$ooai:juser.fz-juelich.de:902308$$pdnbdelivery$$pec_fundedresources$$pVDB$$pdriver$$popen_access$$popenaire
000902308 9101_ $$0I:(DE-588b)5008462-8$$6P:(DE-Juel1)176778$$aForschungszentrum Jülich$$b0$$kFZJ
000902308 9101_ $$0I:(DE-588b)5008462-8$$6P:(DE-Juel1)144174$$aForschungszentrum Jülich$$b2$$kFZJ
000902308 9101_ $$0I:(DE-588b)5008462-8$$6P:(DE-Juel1)145211$$aForschungszentrum Jülich$$b3$$kFZJ
000902308 9130_ $$0G:(DE-HGF)POF3-574$$1G:(DE-HGF)POF3-570$$2G:(DE-HGF)POF3-500$$3G:(DE-HGF)POF3$$4G:(DE-HGF)POF$$aDE-HGF$$bKey Technologies$$lDecoding the Human Brain$$vTheory, modelling and simulation$$x0
000902308 9131_ $$0G:(DE-HGF)POF4-523$$1G:(DE-HGF)POF4-520$$2G:(DE-HGF)POF4-500$$3G:(DE-HGF)POF4$$4G:(DE-HGF)POF$$9G:(DE-HGF)POF4-5232$$aDE-HGF$$bKey Technologies$$lNatural, Artificial and Cognitive Information Processing$$vNeuromorphic Computing and Network Dynamics$$x0
000902308 9141_ $$y2021
000902308 915__ $$0StatID:(DE-HGF)0510$$2StatID$$aOpenAccess
000902308 920__ $$lyes
000902308 9201_ $$0I:(DE-Juel1)INM-6-20090406$$kINM-6$$lComputational and Systems Neuroscience$$x0
000902308 9201_ $$0I:(DE-Juel1)IAS-6-20130828$$kIAS-6$$lTheoretical Neuroscience$$x1
000902308 9201_ $$0I:(DE-Juel1)INM-10-20170113$$kINM-10$$lJARA-Institut Brain structure-function relationships$$x2
000902308 9201_ $$0I:(DE-Juel1)PGI-7-20110106$$kPGI-7$$lElectronic Materials$$x3
000902308 9201_ $$0I:(DE-Juel1)PGI-10-20170113$$kPGI-10$$lJARA Institut Green IT$$x4
000902308 9801_ $$aFullTexts
000902308 980__ $$apreprint
000902308 980__ $$aVDB
000902308 980__ $$aUNRESTRICTED
000902308 980__ $$aI:(DE-Juel1)INM-6-20090406
000902308 980__ $$aI:(DE-Juel1)IAS-6-20130828
000902308 980__ $$aI:(DE-Juel1)INM-10-20170113
000902308 980__ $$aI:(DE-Juel1)PGI-7-20110106
000902308 980__ $$aI:(DE-Juel1)PGI-10-20170113
000902308 981__ $$aI:(DE-Juel1)IAS-6-20130828