Poster (After Call) FZJ-2025-04365

Unsupervised online learning of complex sequences in spiking neuronal networks


2025

International Conference on Neuromorphic Systems (ICONS), Seattle, USA, 29 Jul 2025 - 31 Jul 2025 [10.34734/FZJ-2025-04365]

Please use a persistent id in citations: doi:10.34734/FZJ-2025-04365

Abstract: Learning and processing sequential data constitutes a universal form of computation performed by the brain. Understanding the underlying principles not only sheds light on brain function, but also guides the development of energy-efficient neuromorphic computing architectures. In a previous study, we devised a spiking recurrent neural network, the spiking temporal memory (spiking TM) model, implementing this type of computation. It learns sequences in a continual, unsupervised manner by means of a local Hebbian synaptic plasticity mechanism. Context-specific predictions of upcoming sequence elements are represented by dendritic action potentials. Upon successful learning, the network activity is characterized by a highly sparse and hence energy-efficient code. To date, the sequence learning capabilities of the spiking TM model have only been demonstrated for relatively small sequence sets. Here, we systematically investigate the sequence learning capacity of the model by gradually increasing the sequence length and optimizing the plasticity (hyper)parameters. We show that the spiking TM model at the scale of a few thousand neurons can successfully learn random sequences composed of several tens of elements, with the maximum sequence length exceeding the vocabulary size. After optimizing the plasticity parameters for a given sequence length, the model exhibits high prediction performance for a range of sequence lengths, without additional fine-tuning. The learning duration (time to solution) scales supralinearly with the sequence length. Learning longer sequences is hence computationally demanding, and requires accelerated computing architectures.
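The abstract describes the model's operating principle only in prose. As a rough illustration of the predict / burst / Hebbian-learn loop it mentions, below is a minimal NumPy sketch of a temporal-memory-style sequence learner. All names, parameter values, and the random winner-cell heuristic are illustrative assumptions; the sketch uses abstract rate-free units rather than spiking neurons and does not reproduce the spiking TM model or its dendritic action potentials.

```python
import numpy as np

rng = np.random.default_rng(seed=1)

# Illustrative toy parameters (NOT the published model's values).
VOCAB = 10        # number of distinct sequence elements
CELLS = 8         # cells per element "column"
W_STEP = 0.5      # Hebbian potentiation increment
THETA = 0.5       # dendritic prediction threshold

W = np.zeros((VOCAB * CELLS, VOCAB * CELLS))  # lateral weights: pre -> post


def column(element):
    """Indices of the cells coding for one sequence element."""
    return np.arange(element * CELLS, (element + 1) * CELLS)


def step(prev_active, element):
    """Predict, activate sparsely (or burst), and apply local Hebbian learning."""
    cols = column(element)
    # Dendritic "prediction": summed lateral input from previously active cells.
    drive = W[prev_active][:, cols].sum(axis=0)
    predicted = cols[drive >= THETA]
    if predicted.size:
        active, learners = predicted, predicted  # context recognized: sparse code
    else:
        active = cols                            # unanticipated element: column bursts
        learners = rng.choice(cols, size=1)      # one winner cell learns this context
    # Local Hebbian potentiation between consecutively active cells (capped at 1).
    W[np.ix_(prev_active, learners)] = np.minimum(
        W[np.ix_(prev_active, learners)] + W_STEP, 1.0)
    return active, predicted.size > 0


# Train on a repeated random sequence; count correctly predicted elements.
seq = rng.integers(0, VOCAB, size=12)
for epoch in range(5):
    prev, hits = np.empty(0, dtype=int), 0
    for el in seq:
        prev, was_predicted = step(prev, el)
        hits += was_predicted
    # The first element has no preceding context and can never be predicted.
    print(f"epoch {epoch}: predicted {hits}/{len(seq)} elements")
```

In this toy setting, all transitions after the first element are typically predicted after a single pass through the sequence, and repeated elements acquire context-specific winner cells; the learning dynamics and supralinear scaling reported on the poster are, of course, properties of the spiking implementation, not of this sketch.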


Contributing Institute(s):
  1. Computational and Systems Neuroscience (IAS-6)
  2. Neuromorphic Software Eco System (PGI-15)
  3. JARA-Institut Brain structure-function relationships (INM-10)
Research Program(s):
  1. 5231 - Neuroscientific Foundations (POF4-523) (POF4-523)
  2. 5232 - Computational Principles (POF4-523) (POF4-523)
  3. JL SMHB - Joint Lab Supercomputing and Modeling for the Human Brain (JL SMHB-2021-2027) (JL SMHB-2021-2027)
  4. BMFTR 03ZU2106CB - NeuroSys: Algorithm-Hardware Co-Design (Project C) - B (BMBF-03ZU2106CB) (BMBF-03ZU2106CB)
  5. BMBF 16ME0398K - Collaborative project: Neuro-inspired artificial intelligence technologies for the electronics of the future - NEUROTEC II - (BMBF-16ME0398K) (BMBF-16ME0398K)

Appears in the scientific report 2025
Database coverage:
OpenAccess

The record appears in these collections:
Institute Collections > INM > INM-10
Document types > Presentations > Poster
Institute Collections > IAS > IAS-6
Institute Collections > PGI > PGI-15
Workflow collections > Public records
Publications database
Open Access

 Record created 2025-10-31, last modified 2025-11-03

