001047535 001__ 1047535
001047535 005__ 20251103202055.0
001047535 0247_ $$2datacite_doi$$a10.34734/FZJ-2025-04365
001047535 037__ $$aFZJ-2025-04365
001047535 041__ $$aEnglish
001047535 1001_ $$0P:(DE-Juel1)190224$$aLober, Melissa$$b0$$eCorresponding author
001047535 1112_ $$aInternational Conference on Neuromorphic Systems$$cSeattle$$d2025-07-29 - 2025-07-31$$gICONS$$wUSA
001047535 245__ $$aUnsupervised online learning of complex sequences in spiking neuronal networks
001047535 260__ $$c2025
001047535 3367_ $$033$$2EndNote$$aConference Paper
001047535 3367_ $$2BibTeX$$aINPROCEEDINGS
001047535 3367_ $$2DRIVER$$aconferenceObject
001047535 3367_ $$2ORCID$$aCONFERENCE_POSTER
001047535 3367_ $$2DataCite$$aOutput Types/Conference Poster
001047535 3367_ $$0PUB:(DE-HGF)24$$2PUB:(DE-HGF)$$aPoster$$bposter$$mposter$$s1761945664_24730$$xAfter Call
001047535 502__ $$cRWTH Aachen
001047535 520__ $$aLearning and processing sequential data constitutes a universal form of computation performed by the brain. Understanding the underlying principles not only sheds light on brain function, but also guides the development of energy-efficient neuromorphic computing architectures. In a previous study, we devised a spiking recurrent neural network, the spiking temporal memory (spiking TM) model, implementing this type of computation. It learns sequences in a continual, unsupervised manner by means of a local Hebbian synaptic plasticity mechanism. Context-specific predictions of upcoming sequence elements are represented by dendritic action potentials. Upon successful learning, the network activity is characterized by a highly sparse and hence energy-efficient code. To date, the sequence learning capabilities of the spiking TM model have only been demonstrated for relatively small sequence sets. Here, we systematically investigate the sequence learning capacity of the model by gradually increasing the sequence length and optimizing the plasticity (hyper-)parameters. We show that the spiking TM model at the scale of a few thousand neurons can successfully learn random sequences composed of several tens of elements, with the maximum sequence length exceeding the vocabulary size. After optimizing the plasticity parameters for a given sequence length, the model exhibits high prediction performance for a range of sequence lengths, without additional fine-tuning. The learning duration (time to solution) scales supralinearly with the sequence length. Learning longer sequences is hence computationally demanding and requires accelerated computing architectures.
001047535 536__ $$0G:(DE-HGF)POF4-5231$$a5231 - Neuroscientific Foundations (POF4-523)$$cPOF4-523$$fPOF IV$$x0
001047535 536__ $$0G:(DE-HGF)POF4-5232$$a5232 - Computational Principles (POF4-523)$$cPOF4-523$$fPOF IV$$x1
001047535 536__ $$0G:(DE-Juel1)JL SMHB-2021-2027$$aJL SMHB - Joint Lab Supercomputing and Modeling for the Human Brain (JL SMHB-2021-2027)$$cJL SMHB-2021-2027$$x2
001047535 536__ $$0G:(DE-Juel1)BMBF-03ZU2106CB$$aBMFTR 03ZU2106CB - NeuroSys: Algorithm-Hardware Co-Design (Projekt C) - B (BMBF-03ZU2106CB)$$cBMBF-03ZU2106CB$$x3
001047535 536__ $$0G:(DE-82)BMBF-16ME0398K$$aBMBF 16ME0398K - Verbundprojekt: Neuro-inspirierte Technologien der künstlichen Intelligenz für die Elektronik der Zukunft - NEUROTEC II - (BMBF-16ME0398K)$$cBMBF-16ME0398K$$x4
001047535 7001_ $$0P:(DE-Juel1)176778$$aBouhadjar, Younes$$b1
001047535 7001_ $$0P:(DE-Juel1)188273$$aNeftci, Emre$$b2
001047535 7001_ $$0P:(DE-Juel1)144174$$aDiesmann, Markus$$b3
001047535 7001_ $$0P:(DE-Juel1)145211$$aTetzlaff, Tom$$b4
001047535 8564_ $$uhttps://juser.fz-juelich.de/record/1047535/files/ICONS__spiking_TM_poster.pdf$$yOpenAccess
001047535 909CO $$ooai:juser.fz-juelich.de:1047535$$popenaire$$popen_access$$pVDB$$pdriver
001047535 9101_ $$0I:(DE-588b)5008462-8$$6P:(DE-Juel1)190224$$aForschungszentrum Jülich$$b0$$kFZJ
001047535 9101_ $$0I:(DE-588b)5008462-8$$6P:(DE-Juel1)176778$$aForschungszentrum Jülich$$b1$$kFZJ
001047535 9101_ $$0I:(DE-588b)5008462-8$$6P:(DE-Juel1)188273$$aForschungszentrum Jülich$$b2$$kFZJ
001047535 9101_ $$0I:(DE-588b)5008462-8$$6P:(DE-Juel1)144174$$aForschungszentrum Jülich$$b3$$kFZJ
001047535 9101_ $$0I:(DE-588b)5008462-8$$6P:(DE-Juel1)145211$$aForschungszentrum Jülich$$b4$$kFZJ
001047535 9131_ $$0G:(DE-HGF)POF4-523$$1G:(DE-HGF)POF4-520$$2G:(DE-HGF)POF4-500$$3G:(DE-HGF)POF4$$4G:(DE-HGF)POF$$9G:(DE-HGF)POF4-5231$$aDE-HGF$$bKey Technologies$$lNatural, Artificial and Cognitive Information Processing$$vNeuromorphic Computing and Network Dynamics$$x0
001047535 9131_ $$0G:(DE-HGF)POF4-523$$1G:(DE-HGF)POF4-520$$2G:(DE-HGF)POF4-500$$3G:(DE-HGF)POF4$$4G:(DE-HGF)POF$$9G:(DE-HGF)POF4-5232$$aDE-HGF$$bKey Technologies$$lNatural, Artificial and Cognitive Information Processing$$vNeuromorphic Computing and Network Dynamics$$x1
001047535 9141_ $$y2025
001047535 915__ $$0StatID:(DE-HGF)0510$$2StatID$$aOpenAccess
001047535 920__ $$lno
001047535 9201_ $$0I:(DE-Juel1)IAS-6-20130828$$kIAS-6$$lComputational and Systems Neuroscience$$x0
001047535 9201_ $$0I:(DE-Juel1)PGI-15-20210701$$kPGI-15$$lNeuromorphic Software Eco System$$x1
001047535 9201_ $$0I:(DE-Juel1)INM-10-20170113$$kINM-10$$lJARA-Institut Brain structure-function relationships$$x2
001047535 980__ $$aposter
001047535 980__ $$aVDB
001047535 980__ $$aI:(DE-Juel1)IAS-6-20130828
001047535 980__ $$aI:(DE-Juel1)PGI-15-20210701
001047535 980__ $$aI:(DE-Juel1)INM-10-20170113
001047535 980__ $$aUNRESTRICTED
001047535 9801_ $$aFullTexts