Preprint FZJ-2025-04262

A flexible framework for structural plasticity in GPU-accelerated sparse spiking neural networks


2025
arXiv [doi:10.48550/arXiv.2510.19764]


Please use a persistent id in citations: doi:10.48550/arXiv.2510.19764

Abstract: The majority of research in both training Artificial Neural Networks (ANNs) and modeling learning in biological brains focuses on synaptic plasticity, where learning equates to changing the strength of existing connections. However, in biological brains, structural plasticity - where new connections are created and others removed - is also vital, not only for effective learning but also for recovery from damage and optimal resource usage. Inspired by structural plasticity, pruning is often used in machine learning to remove weak connections from trained models to reduce the computational requirements of inference. However, the machine learning frameworks typically used for backpropagation-based training of both ANNs and Spiking Neural Networks (SNNs) are optimized for dense connectivity, meaning that pruning does not help reduce the training costs of ever-larger models. The GeNN simulator already supports efficient GPU-accelerated simulation of sparse SNNs for computational neuroscience and machine learning. Here, we present a new flexible framework for implementing GPU-accelerated structural plasticity rules and demonstrate this first using the e-prop supervised learning rule and DEEP R to train efficient, sparse SNN classifiers and then, in an unsupervised learning context, to learn topographic maps. Compared to baseline dense models, our sparse classifiers reduce training time by up to 10x while the DEEP R rewiring enables them to perform as well as the original models. We demonstrate topographic map formation in faster-than-real-time simulations, provide insights into the connectivity evolution, and measure simulation speed versus network size. The proposed framework will enable further research into achieving and maintaining sparsity in network structure and neural communication, as well as exploring the computational benefits of sparsity in a range of neuromorphic applications.
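Note: the DEEP R rewiring mentioned in the abstract follows a prune-and-regrow pattern under a fixed connection budget. Below is a minimal, hypothetical NumPy sketch of that pattern for illustration only - it is not the paper's implementation and not GeNN's API. The function name deep_r_step, the temperature noise term, and the flat synapse index space are assumptions; the gradient is a stand-in for one produced by a learning rule such as e-prop.

    import numpy as np

    rng = np.random.default_rng(0)

    n_pre, n_post = 100, 50          # pre- and post-synaptic population sizes
    n_total = n_pre * n_post         # all potential synapses, as a flat index space
    n_active = 500                   # fixed connection budget (~10% sparsity)

    # Active synapses: flat indices, positive magnitude parameters, fixed signs
    active = rng.choice(n_total, size=n_active, replace=False)
    theta = np.abs(rng.normal(0.0, 0.1, size=n_active))
    sign = rng.choice([-1.0, 1.0], size=n_active)

    def deep_r_step(active, theta, sign, grad, lr=1e-3, temperature=1e-4):
        """One hypothetical DEEP R-style update: gradient + noise, prune, regrow."""
        # 1. Gradient step with exploratory noise on the active parameters
        theta = theta - lr * grad + np.sqrt(2.0 * lr * temperature) * rng.normal(size=theta.shape)

        # 2. Prune: synapses whose parameter crossed zero become dormant
        keep = theta > 0.0
        active, theta, sign = active[keep], theta[keep], sign[keep]

        # 3. Regrow: reactivate randomly chosen dormant synapses to restore the budget
        n_regrow = n_active - active.size
        dormant = np.setdiff1d(np.arange(n_total), active)
        reborn = rng.choice(dormant, size=n_regrow, replace=False)
        active = np.concatenate([active, reborn])
        theta = np.concatenate([theta, np.full(n_regrow, 1e-12)])   # restart just above zero
        sign = np.concatenate([sign, rng.choice([-1.0, 1.0], size=n_regrow)])
        return active, theta, sign

    # Effective weights seen by the network would be w = sign * theta on active synapses;
    # grad here is a placeholder gradient (e.g. one delivered by e-prop).
    grad = rng.normal(0.0, 0.01, size=theta.shape)
    active, theta, sign = deep_r_step(active, theta, sign, grad)

Because pruning and regrowth conserve the number of active synapses, the sparse memory footprint stays constant throughout training, which is what allows sparse training to remain cheaper than its dense counterpart.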

Keyword(s): Neural and Evolutionary Computing (cs.NE) ; Neurons and Cognition (q-bio.NC) ; FOS: Computer and information sciences ; FOS: Biological sciences


Contributing Institute(s):
  1. Computational and Systems Neuroscience (IAS-6)
Research Program(s):
  1. 5232 - Computational Principles (POF4-523)
  2. 5234 - Emerging NC Architectures (POF4-523)
  3. HBP SGA3 - Human Brain Project Specific Grant Agreement 3 (945539)
  4. EBRAINS 2.0 - A Research Infrastructure to Advance Neuroscience and Brain Health (101147319)

Appears in the scientific report 2025

The record appears in these collections:
Institute Collections > IAS > IAS-6
Document Types > Reports > Preprints
Workflow Collections > Public Entries
Publication Database

Record created on 2025-10-24, last modified on 2025-11-11


Restricted: Download full text (PDF)
External link: Full text