001     1047535
005     20251103202055.0
024 7 _ |a 10.34734/FZJ-2025-04365
|2 datacite_doi
037 _ _ |a FZJ-2025-04365
041 _ _ |a English
100 1 _ |a Lober, Melissa
|0 P:(DE-Juel1)190224
|b 0
|e Corresponding author
111 2 _ |a International Conference on Neuromorphic Systems
|g ICONS
|c Seattle
|d 2025-07-29 - 2025-07-31
|w USA
245 _ _ |a Unsupervised online learning of complex sequences in spiking neuronal networks
260 _ _ |c 2025
336 7 _ |a Conference Paper
|0 33
|2 EndNote
336 7 _ |a INPROCEEDINGS
|2 BibTeX
336 7 _ |a conferenceObject
|2 DRIVER
336 7 _ |a CONFERENCE_POSTER
|2 ORCID
336 7 _ |a Output Types/Conference Poster
|2 DataCite
336 7 _ |a Poster
|b poster
|m poster
|0 PUB:(DE-HGF)24
|s 1761945664_24730
|2 PUB:(DE-HGF)
|x After Call
502 _ _ |c RWTH Aachen
520 _ _ |a Learning and processing sequential data constitutes a universal form of computation performed by the brain. Understanding the underlying principles not only sheds light on brain function, but also guides the development of energy-efficient neuromorphic computing architectures. In a previous study, we devised a spiking recurrent neural network, the spiking temporal memory (spiking TM) model, implementing this type of computation. It learns sequences in a continual, unsupervised manner by means of a local Hebbian synaptic plasticity mechanism. Context-specific predictions of upcoming sequence elements are represented by dendritic action potentials. Upon successful learning, the network activity is characterized by a highly sparse and hence energy-efficient code. To date, the sequence learning capabilities of the spiking TM model have only been demonstrated for relatively small sequence sets. Here, we systematically investigate the sequence learning capacity of the model by gradually increasing the sequence length and optimizing the plasticity (hyper-)parameters. We show that the spiking TM model at the scale of a few thousand neurons can successfully learn random sequences composed of several tens of elements, with the maximum sequence length exceeding the vocabulary size. After optimizing the plasticity parameters for a given sequence length, the model exhibits high prediction performance for a range of sequence lengths, without additional fine-tuning. The learning duration (time to solution) scales supralinearly with the sequence length. Learning longer sequences is hence computationally demanding and requires accelerated computing architectures.
536 _ _ |a 5231 - Neuroscientific Foundations (POF4-523)
|0 G:(DE-HGF)POF4-5231
|c POF4-523
|f POF IV
|x 0
536 _ _ |a 5232 - Computational Principles (POF4-523)
|0 G:(DE-HGF)POF4-5232
|c POF4-523
|f POF IV
|x 1
536 _ _ |a JL SMHB - Joint Lab Supercomputing and Modeling for the Human Brain (JL SMHB-2021-2027)
|0 G:(DE-Juel1)JL SMHB-2021-2027
|c JL SMHB-2021-2027
|x 2
536 _ _ |a BMFTR 03ZU2106CB - NeuroSys: Algorithm-Hardware Co-Design (Project C) - B (BMBF-03ZU2106CB)
|0 G:(DE-Juel1)BMBF-03ZU2106CB
|c BMBF-03ZU2106CB
|x 3
536 _ _ |a BMBF 16ME0398K - Joint project: Neuro-inspired artificial intelligence technologies for the electronics of the future - NEUROTEC II - (BMBF-16ME0398K)
|0 G:(DE-82)BMBF-16ME0398K
|c BMBF-16ME0398K
|x 4
700 1 _ |a Bouhadjar, Younes
|0 P:(DE-Juel1)176778
|b 1
700 1 _ |a Neftci, Emre
|0 P:(DE-Juel1)188273
|b 2
700 1 _ |a Diesmann, Markus
|0 P:(DE-Juel1)144174
|b 3
700 1 _ |a Tetzlaff, Tom
|0 P:(DE-Juel1)145211
|b 4
856 4 _ |u https://juser.fz-juelich.de/record/1047535/files/ICONS__spiking_TM_poster.pdf
|y OpenAccess
909 C O |o oai:juser.fz-juelich.de:1047535
|p openaire
|p open_access
|p VDB
|p driver
910 1 _ |a Forschungszentrum Jülich
|0 I:(DE-588b)5008462-8
|k FZJ
|b 0
|6 P:(DE-Juel1)190224
910 1 _ |a Forschungszentrum Jülich
|0 I:(DE-588b)5008462-8
|k FZJ
|b 1
|6 P:(DE-Juel1)176778
910 1 _ |a Forschungszentrum Jülich
|0 I:(DE-588b)5008462-8
|k FZJ
|b 2
|6 P:(DE-Juel1)188273
910 1 _ |a Forschungszentrum Jülich
|0 I:(DE-588b)5008462-8
|k FZJ
|b 3
|6 P:(DE-Juel1)144174
910 1 _ |a Forschungszentrum Jülich
|0 I:(DE-588b)5008462-8
|k FZJ
|b 4
|6 P:(DE-Juel1)145211
913 1 _ |a DE-HGF
|b Key Technologies
|l Natural, Artificial and Cognitive Information Processing
|1 G:(DE-HGF)POF4-520
|0 G:(DE-HGF)POF4-523
|3 G:(DE-HGF)POF4
|2 G:(DE-HGF)POF4-500
|4 G:(DE-HGF)POF
|v Neuromorphic Computing and Network Dynamics
|9 G:(DE-HGF)POF4-5231
|x 0
913 1 _ |a DE-HGF
|b Key Technologies
|l Natural, Artificial and Cognitive Information Processing
|1 G:(DE-HGF)POF4-520
|0 G:(DE-HGF)POF4-523
|3 G:(DE-HGF)POF4
|2 G:(DE-HGF)POF4-500
|4 G:(DE-HGF)POF
|v Neuromorphic Computing and Network Dynamics
|9 G:(DE-HGF)POF4-5232
|x 1
914 1 _ |y 2025
915 _ _ |a OpenAccess
|0 StatID:(DE-HGF)0510
|2 StatID
920 _ _ |l no
920 1 _ |0 I:(DE-Juel1)IAS-6-20130828
|k IAS-6
|l Computational and Systems Neuroscience
|x 0
920 1 _ |0 I:(DE-Juel1)PGI-15-20210701
|k PGI-15
|l Neuromorphic Software Eco System
|x 1
920 1 _ |0 I:(DE-Juel1)INM-10-20170113
|k INM-10
|l JARA-Institut Brain structure-function relationships
|x 2
980 _ _ |a poster
980 _ _ |a VDB
980 _ _ |a I:(DE-Juel1)IAS-6-20130828
980 _ _ |a I:(DE-Juel1)PGI-15-20210701
980 _ _ |a I:(DE-Juel1)INM-10-20170113
980 _ _ |a UNRESTRICTED
980 1 _ |a FullTexts

