TY - THES
AU - Korcsak-Gorzo, Agnes
TI - Functions of spiking neural networks constrained by biology
VL - 119
PB - RWTH Aachen University
M3 - Dissertation
CY - Jülich
M1 - FZJ-2026-00306
SN - 978-3-95806-876-6
T2 - Schriften des Forschungszentrums Jülich Reihe Information / Information
SP - xvi, 145
PY - 2025
N1 - Dissertation, RWTH Aachen University, 2025
AB - Artificial intelligence (AI) solutions are increasingly taking on tasks traditionally performed by humans. However, their rising computational demands and energy consumption are unsustainable, highlighting the need for more efficient designs. The human brain, evolved to function effectively even when energy is scarce, offers inspiration. Since learning is central to both artificial intelligence and the brain, insights into its underlying principles can deepen our understanding of human learning while informing the development of algorithms that transcend purely engineering-based methods. This thesis investigates biological learning through two studies, examining it from mechanistic and functional perspectives at an abstraction level commonly employed in neurophysics and computational neuroscience. These fields distill complex neural systems and phenomena into tractable mathematical and computational models, enabling insights beyond the reach of traditional biological approaches. Recognizing that synapses — the connections between neurons — are fundamental to learning, the thesis begins with a review of state-of-the-art computational neuroscience methods for modeling synaptic organization. This review highlights critical aspects of synaptic signaling, including connectivity, transmission, plasticity, and heterogeneity. In the first study, a synaptic plasticity model is integrated into a spiking neural network simulator and extended with biologically plausible features, for example, continuous dynamics and increased locality. The effectiveness of this enhanced model is demonstrated by training it on a standard neuromorphic benchmark task, incorporating biologically realistic sparse connectivity and weight constraints. The second study demonstrates that the sampling efficiency of pre-trained spiking neural networks can be enhanced by exposing them to oscillating background spiking activity. Analogous to simulated tempering, these rhythmic oscillations modulate state space exploration, facilitating transitions between high-probability states within the learned representation. These findings establish a link between cortical oscillations and sampling-based computations, offering new insights into memory retrieval and consolidation from a computational perspective. The research involves developing mathematical and computational models simulated on high-performance computing systems, evaluating learning and sampling performance with standard machine learning metrics, and assessing computational efficiency through runtime analysis. This thesis shows how biologically inspired mechanisms enhance the functional capabilities of spiking neural networks and how they can guide the development of scalable and efficient AI systems.
LB - PUB:(DE-HGF)3 ; PUB:(DE-HGF)11
DO - 10.34734/FZJ-2026-00306
UR - https://juser.fz-juelich.de/record/1050546
ER -