001037898 001__ 1037898
001037898 005__ 20250203103256.0
001037898 037__ $$aFZJ-2025-01036
001037898 1001_ $$0P:(DE-Juel1)203192$$aBencheikh, Wadjih$$b0
001037898 1112_ $$aNeuro-Inspired Computing Elements$$cLa Jolla, California$$d2024-04-23 - 2024-04-26$$wUSA
001037898 245__ $$aTraining Spiking Neural Networks to emulate brain-like activity for optimal efficiency
001037898 260__ $$c2024
001037898 3367_ $$033$$2EndNote$$aConference Paper
001037898 3367_ $$2DataCite$$aOther
001037898 3367_ $$2BibTeX$$aINPROCEEDINGS
001037898 3367_ $$2DRIVER$$aconferenceObject
001037898 3367_ $$2ORCID$$aLECTURE_SPEECH
001037898 3367_ $$0PUB:(DE-HGF)6$$2PUB:(DE-HGF)$$aConference Presentation$$bconf$$mconf$$s1738230140_8094$$xAfter Call
001037898 520__ $$aUnlike neurons in Artificial Neural Networks (ANNs), biological neurons operate in continuous real time and thus possess the ability to represent timing information. They use tiny pulses of voltage, called spikes, to communicate with each other. The spikes are generated sparsely and at precise times, especially in early sensory regions, contributing to the energy efficiency of brain circuitry. Another feature of biological neurons is that they are equipped with an intrinsic memory of hundreds of milliseconds, which keeps track of recent activity. These operations have inspired the development of a novel kind of neural network (NN) known as the spiking NN (SNN). In this study, we aim to identify the most effective combination of neuron types and learning methods in SNNs that yields high accuracy while minimizing the firing rate of the neurons. We train and analyze the resulting spiking activity of both a feedforward and a recurrent network of Adaptive LIF (adLIF) neurons. Training of the feedforward connections between the layers includes training both the delays and the weights. We implement the delays using 1D Dilated Convolution with Learnable Spacings (DCLS). The training process uses Back-Propagation Through Time (BPTT) with surrogate gradients. We employ a regularization function to control the population firing rate. This function penalizes deviations from a desired range of neuronal activity, comprising terms for hypoactivity (excessively low firing rates) and hyperactivity (excessively high firing rates). Optimal hyperparameters are chosen based on the highest average accuracy over 5 different network realizations. The hyperparameters include: dropout rate, connectivity type (feedforward or recurrent), regularization parameters, maximum delay, and hidden size of the layer. Note that we strictly constrain the range of values of the maximum allowed firing rate to reach a regime of high sparsity. We evaluate our network performance on classifying digits from the Spiking Heidelberg Dataset (SHD). The dataset was postprocessed using a temporal bin of 10 ms, and training consists of 50 epochs. Our analysis shows that the recurrent network with trainable delays in the feedforward connections achieves the highest accuracy and the lowest firing rate (accuracy = 91.05%, firing rate = 1.02 Hz), where the firing rate is computed as the average of the population mean spike count per second. The corresponding spiking activity exhibits spiking bursts. This is problematic, as bursting limits the use of spatiotemporal patterns, increases network latency in terms of information processing, and is difficult to represent in networks implemented on neuromorphic hardware. To reduce bursting, we incorporate a refractory period, which results in a firing rate of 1.56 Hz. However, this comes at the cost of a drop in accuracy (88.82%). Future work will assess the reasons behind this loss and devise alternative implementation methods and novel learning methodologies. Finally, we employed Spike Pattern Detection and Evaluation (SPADE) in our analysis. Despite our efforts, we have not yet identified significant recurrent spatiotemporal patterns within the spiking activity of the network. It remains to be shown in a future study whether the representations and patterns developed by SNNs trained by means of the BPTT algorithm resemble those observed in biological neural networks.
001037898 536__ $$0G:(DE-HGF)POF4-5234$$a5234 - Emerging NC Architectures (POF4-523)$$cPOF4-523$$fPOF IV$$x0
001037898 536__ $$0G:(DE-82)BMBF-16ME0398K$$aBMBF 16ME0398K - Verbundprojekt: Neuro-inspirierte Technologien der künstlichen Intelligenz für die Elektronik der Zukunft - NEUROTEC II - (BMBF-16ME0398K)$$cBMBF-16ME0398K$$x1
001037898 536__ $$0G:(DE-82)BMBF-16ME0399$$aBMBF 16ME0399 - Verbundprojekt: Neuro-inspirierte Technologien der künstlichen Intelligenz für die Elektronik der Zukunft - NEUROTEC II - (BMBF-16ME0399)$$cBMBF-16ME0399$$x2
001037898 7001_ $$0P:(DE-Juel1)188273$$aNeftci, Emre$$b1
001037898 7001_ $$0P:(DE-Juel1)176778$$aBouhadjar, Younes$$b2$$eCorresponding author
001037898 909CO $$ooai:juser.fz-juelich.de:1037898$$pVDB
001037898 9101_ $$0I:(DE-588b)5008462-8$$6P:(DE-Juel1)188273$$aForschungszentrum Jülich$$b1$$kFZJ
001037898 9101_ $$0I:(DE-588b)5008462-8$$6P:(DE-Juel1)176778$$aForschungszentrum Jülich$$b2$$kFZJ
001037898 9131_ $$0G:(DE-HGF)POF4-523$$1G:(DE-HGF)POF4-520$$2G:(DE-HGF)POF4-500$$3G:(DE-HGF)POF4$$4G:(DE-HGF)POF$$9G:(DE-HGF)POF4-5234$$aDE-HGF$$bKey Technologies$$lNatural, Artificial and Cognitive Information Processing$$vNeuromorphic Computing and Network Dynamics$$x0
001037898 9141_ $$y2024
001037898 920__ $$lyes
001037898 9201_ $$0I:(DE-Juel1)PGI-15-20210701$$kPGI-15$$lNeuromorphic Software Eco System$$x0
001037898 9201_ $$0I:(DE-Juel1)PGI-7-20110106$$kPGI-7$$lElektronische Materialien$$x1
001037898 980__ $$aconf
001037898 980__ $$aVDB
001037898 980__ $$aI:(DE-Juel1)PGI-15-20210701
001037898 980__ $$aI:(DE-Juel1)PGI-7-20110106
001037898 980__ $$aUNRESTRICTED