001     1037898
005     20250203103256.0
037 _ _ |a FZJ-2025-01036
100 1 _ |a Bencheikh, Wadjih
|0 P:(DE-Juel1)203192
|b 0
111 2 _ |a Neuro-Inspired Computing Elements
|c La Jolla, California
|d 2024-04-23 - 2024-04-26
|w USA
245 _ _ |a Training Spiking Neural Networks to emulate brain-like activity for optimal efficiency
260 _ _ |c 2024
336 7 _ |a Conference Paper
|0 33
|2 EndNote
336 7 _ |a Other
|2 DataCite
336 7 _ |a INPROCEEDINGS
|2 BibTeX
336 7 _ |a conferenceObject
|2 DRIVER
336 7 _ |a LECTURE_SPEECH
|2 ORCID
336 7 _ |a Conference Presentation
|b conf
|m conf
|0 PUB:(DE-HGF)6
|s 1738230140_8094
|2 PUB:(DE-HGF)
|x After Call
520 _ _ |a Unlike neurons in Artificial Neural Networks (ANNs), biological neurons operate in continuous real time and thus possess the ability to represent timing information. They communicate with each other using tiny pulses of voltage, called spikes. Spikes are generated sparsely and at precise times, especially in early sensory regions, contributing to the energy efficiency of brain circuitry. Another feature of biological neurons is that they are equipped with an intrinsic memory of hundreds of milliseconds, which keeps track of recent activity. These operations have inspired the development of a novel kind of neural network (NN) known as spiking NNs (SNNs). In this study, we aim to identify the most effective combination of neuron types and learning methods in SNNs that yields high accuracy while minimizing the firing rate of the neurons. We train and analyze the resulting spiking activity of both a feedforward and a recurrent network of Adaptive LIF (adLIF) neurons. Training the feedforward connections between the layers includes training both the delays and the weights. We implemented the delays using 1D Dilated Convolution with Learnable Spacings (DCLS). The training process uses Back-Propagation Through Time (BPTT) with surrogate gradients. We employ a regularization function to control the population firing rate. This function penalizes deviations from a desired range of neuronal activity and comprises terms for hypoactivity (excessively low firing rates) and hyperactivity (excessively high firing rates). Optimal hyperparameters are chosen based on the highest average accuracy over 5 different network realizations. The hyperparameters include: dropout rate, connectivity type (feedforward or recurrent), regularization parameters, maximum delay, and hidden size of the layer. Note that we strictly constrain the range of values of the maximum allowed firing rate to reach a regime of high sparsity. 
We evaluate our network performance on classifying digits from the Spiking Heidelberg Dataset (SHD). The dataset was postprocessed using a temporal bin of 10 ms, and training consists of 50 epochs. Our analysis shows that the recurrent network with trainable delays in the feedforward connections achieves the highest accuracy and the lowest firing rate (accuracy = 91.05%, firing rate = 1.02 Hz), where the firing rate is computed as the average over the population of the mean spike count per second. The corresponding spiking activity exhibits spiking bursts. This is problematic, as bursting limits the use of spatiotemporal patterns, increases network latency in terms of information processing, and is difficult to represent in networks implemented on neuromorphic hardware. To reduce bursting, we incorporated a refractory period, which resulted in a firing rate of 1.56 Hz. However, this came at the cost of a drop in accuracy (to 88.82%). Future work will assess the reasons behind this loss and devise alternative implementation methods and novel learning methodologies. Finally, we employed Spike Pattern Detection and Evaluation (SPADE) in our analysis. Despite our efforts, we have not yet identified significant recurrent spatiotemporal patterns within the spiking activity of the network. It remains to be shown in a future study whether the representations and patterns developed by SNNs trained by means of the BPTT algorithm resemble those observed in biological neural networks.
536 _ _ |a 5234 - Emerging NC Architectures (POF4-523)
|0 G:(DE-HGF)POF4-5234
|c POF4-523
|f POF IV
|x 0
536 _ _ |a BMBF 16ME0398K - Verbundprojekt: Neuro-inspirierte Technologien der künstlichen Intelligenz für die Elektronik der Zukunft - NEUROTEC II - (BMBF-16ME0398K)
|0 G:(DE-82)BMBF-16ME0398K
|c BMBF-16ME0398K
|x 1
536 _ _ |a BMBF 16ME0399 - Verbundprojekt: Neuro-inspirierte Technologien der künstlichen Intelligenz für die Elektronik der Zukunft - NEUROTEC II - (BMBF-16ME0399)
|0 G:(DE-82)BMBF-16ME0399
|c BMBF-16ME0399
|x 2
700 1 _ |a Neftci, Emre
|0 P:(DE-Juel1)188273
|b 1
700 1 _ |a Bouhadjar, Younes
|0 P:(DE-Juel1)176778
|b 2
|e Corresponding author
909 C O |o oai:juser.fz-juelich.de:1037898
|p VDB
910 1 _ |a Forschungszentrum Jülich
|0 I:(DE-588b)5008462-8
|k FZJ
|b 1
|6 P:(DE-Juel1)188273
910 1 _ |a Forschungszentrum Jülich
|0 I:(DE-588b)5008462-8
|k FZJ
|b 2
|6 P:(DE-Juel1)176778
913 1 _ |a DE-HGF
|b Key Technologies
|l Natural, Artificial and Cognitive Information Processing
|1 G:(DE-HGF)POF4-520
|0 G:(DE-HGF)POF4-523
|3 G:(DE-HGF)POF4
|2 G:(DE-HGF)POF4-500
|4 G:(DE-HGF)POF
|v Neuromorphic Computing and Network Dynamics
|9 G:(DE-HGF)POF4-5234
|x 0
914 1 _ |y 2024
920 _ _ |l yes
920 1 _ |0 I:(DE-Juel1)PGI-15-20210701
|k PGI-15
|l Neuromorphic Software Eco System
|x 0
920 1 _ |0 I:(DE-Juel1)PGI-7-20110106
|k PGI-7
|l Elektronische Materialien
|x 1
980 _ _ |a conf
980 _ _ |a VDB
980 _ _ |a I:(DE-Juel1)PGI-15-20210701
980 _ _ |a I:(DE-Juel1)PGI-7-20110106
980 _ _ |a UNRESTRICTED

