000907337 001__ 907337
000907337 005__ 20240313094955.0
000907337 037__ $$aFZJ-2022-01972
000907337 041__ $$aEnglish
000907337 1001_ $$0P:(DE-Juel1)176778$$aBouhadjar, Younes$$b0$$eCorresponding author$$ufzj
000907337 1112_ $$aMaterials, devices and systems for neuromorphic computing conference$$cGroningen$$d2022-03-28 - 2022-03-29$$gMatNeC$$wNetherlands
000907337 245__ $$aSequence learning in a spiking neural network with memristive synapses
000907337 260__ $$c2022
000907337 3367_ $$033$$2EndNote$$aConference Paper
000907337 3367_ $$2BibTeX$$aINPROCEEDINGS
000907337 3367_ $$2DRIVER$$aconferenceObject
000907337 3367_ $$2ORCID$$aCONFERENCE_POSTER
000907337 3367_ $$2DataCite$$aOutput Types/Conference Poster
000907337 3367_ $$0PUB:(DE-HGF)24$$2PUB:(DE-HGF)$$aPoster$$bposter$$mposter$$s1654774546_15301$$xAfter Call
000907337 502__ $$cRWTH Aachen
000907337 520__ $$aBrain-inspired computing proposes a set of algorithmic principles that hold promise for advancing artificial intelligence. They endow systems with self-learning capabilities, efficient energy usage, and high storage capacity. A core concept at the heart of brain computation is sequence learning and prediction. This form of computation is essential for almost all of our daily tasks, such as movement generation, perception, and language. Understanding how the brain performs such a computation is important not only for advancing neuroscience but also for paving the way to new brain-inspired technological applications. A previously developed spiking neural network implementation of sequence prediction and recall learns complex, high-order sequences in an unsupervised manner using local, biologically inspired plasticity rules. An emerging type of hardware that holds promise for running this type of algorithm efficiently is analog neuromorphic hardware (ANH). It emulates the brain's architecture and maps neurons and synapses directly onto a physical substrate. Memristive devices have been identified as potential synaptic elements in ANH. In particular, redox-induced resistive random access memory (ReRAM) devices stand out in many respects: they are scalable, energy-efficient, and fast, and they can implement biological learning rules. In this work, we study the feasibility of using ReRAM devices as a replacement for the biological synapses in the sequence learning model. We implement and simulate the model, including the ReRAM plasticity, using the neural simulator NEST. We investigate two types of ReRAM devices: (i) an analog switching memristive device, where the conductance changes gradually between a low conductance state (LCS) and a high conductance state (HCS), and (ii) a binary switching memristive device, where the conductance changes abruptly between the LCS and the HCS. We study the performance characteristics of the sequence learning model as a function of different device properties, and demonstrate resilience with respect to different on/off ratios, conductance resolutions, device variability, and synaptic failure.
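The abstract contrasts gradual (analog) and abrupt (binary) conductance switching between the LCS and the HCS. The following is a minimal, purely illustrative Python sketch of two such toy update rules; the state values, the rate parameter, and the update form are assumptions made here and do not reproduce the authors' actual NEST implementation of the ReRAM plasticity.

    # Illustrative sketch only: toy conductance-update rules for the two ReRAM
    # device types named in the abstract. All names and parameters are
    # hypothetical and are NOT taken from the authors' NEST model.

    LCS, HCS = 1e-6, 1e-4  # assumed low/high conductance states (siemens)

    def update_analog(g, potentiate, rate=0.1):
        """Gradual switching: conductance moves a fraction of the remaining
        distance toward HCS (potentiation) or LCS (depression)."""
        target = HCS if potentiate else LCS
        return g + rate * (target - g)

    def update_binary(g, potentiate):
        """Abrupt switching: conductance jumps directly to HCS or LCS."""
        return HCS if potentiate else LCS

    # Example: three potentiation events followed by one depression event.
    g_analog, g_binary = LCS, LCS
    for potentiate in (True, True, True, False):
        g_analog = update_analog(g_analog, potentiate)
        g_binary = update_binary(g_binary, potentiate)
    print(g_analog, g_binary)

Under this toy rule, the analog device accumulates intermediate conductance values, whereas the binary device only ever occupies the LCS or the HCS; the on/off ratio and conductance resolution studied in the poster would correspond to HCS/LCS and to the effective number of distinguishable intermediate states, respectively.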
000907337 536__ $$0G:(DE-HGF)POF3-574$$a574 - Theory, modelling and simulation (POF3-574)$$cPOF3-574$$fPOF III$$x0
000907337 536__ $$0G:(DE-HGF)POF4-5232$$a5232 - Computational Principles (POF4-523)$$cPOF4-523$$fPOF IV$$x1
000907337 536__ $$0G:(DE-Juel1)aca_20190115$$aAdvanced Computing Architectures (aca_20190115)$$caca_20190115$$fAdvanced Computing Architectures$$x2
000907337 536__ $$0G:(EU-Grant)945539$$aHBP SGA3 - Human Brain Project Specific Grant Agreement 3 (945539)$$c945539$$fH2020-SGA-FETFLAG-HBP-2019$$x3
000907337 536__ $$0G:(EU-Grant)785907$$aHBP SGA2 - Human Brain Project Specific Grant Agreement 2 (785907)$$c785907$$fH2020-SGA-FETFLAG-HBP-2017$$x4
000907337 588__ $$aDataset connected to DataCite
000907337 7001_ $$0P:(DE-Juel1)174486$$aSiegel, Sebastian$$b1$$ufzj
000907337 7001_ $$0P:(DE-Juel1)145211$$aTetzlaff, Tom$$b2$$ufzj
000907337 7001_ $$0P:(DE-Juel1)144174$$aDiesmann, Markus$$b3$$ufzj
000907337 7001_ $$0P:(DE-Juel1)131022$$aWaser, R.$$b4$$ufzj
000907337 7001_ $$0P:(DE-HGF)0$$aWouters, Dirk J.$$b5
000907337 909CO $$ooai:juser.fz-juelich.de:907337$$pec_fundedresources$$pVDB$$popenaire
000907337 9101_ $$0I:(DE-588b)5008462-8$$6P:(DE-Juel1)176778$$aForschungszentrum Jülich$$b0$$kFZJ
000907337 9101_ $$0I:(DE-588b)5008462-8$$6P:(DE-Juel1)174486$$aForschungszentrum Jülich$$b1$$kFZJ
000907337 9101_ $$0I:(DE-588b)5008462-8$$6P:(DE-Juel1)145211$$aForschungszentrum Jülich$$b2$$kFZJ
000907337 9101_ $$0I:(DE-588b)5008462-8$$6P:(DE-Juel1)144174$$aForschungszentrum Jülich$$b3$$kFZJ
000907337 9101_ $$0I:(DE-588b)5008462-8$$6P:(DE-Juel1)131022$$aForschungszentrum Jülich$$b4$$kFZJ
000907337 9130_ $$0G:(DE-HGF)POF3-574$$1G:(DE-HGF)POF3-570$$2G:(DE-HGF)POF3-500$$3G:(DE-HGF)POF3$$4G:(DE-HGF)POF$$aDE-HGF$$bKey Technologies$$lDecoding the Human Brain$$vTheory, modelling and simulation$$x0
000907337 9131_ $$0G:(DE-HGF)POF4-523$$1G:(DE-HGF)POF4-520$$2G:(DE-HGF)POF4-500$$3G:(DE-HGF)POF4$$4G:(DE-HGF)POF$$9G:(DE-HGF)POF4-5232$$aDE-HGF$$bKey Technologies$$lNatural, Artificial and Cognitive Information Processing$$vNeuromorphic Computing and Network Dynamics$$x0
000907337 9141_ $$y2022
000907337 920__ $$lyes
000907337 9201_ $$0I:(DE-Juel1)INM-6-20090406$$kINM-6$$lComputational and Systems Neuroscience$$x0
000907337 9201_ $$0I:(DE-Juel1)IAS-6-20130828$$kIAS-6$$lTheoretical Neuroscience$$x1
000907337 9201_ $$0I:(DE-Juel1)INM-10-20170113$$kINM-10$$lJARA-Institut Brain structure-function relationships$$x2
000907337 9201_ $$0I:(DE-Juel1)PGI-7-20110106$$kPGI-7$$lElektronische Materialien$$x3
000907337 9201_ $$0I:(DE-Juel1)PGI-10-20170113$$kPGI-10$$lJARA-Institut Green IT$$x4
000907337 980__ $$aposter
000907337 980__ $$aVDB
000907337 980__ $$aI:(DE-Juel1)INM-6-20090406
000907337 980__ $$aI:(DE-Juel1)IAS-6-20130828
000907337 980__ $$aI:(DE-Juel1)INM-10-20170113
000907337 980__ $$aI:(DE-Juel1)PGI-7-20110106
000907337 980__ $$aI:(DE-Juel1)PGI-10-20170113
000907337 980__ $$aUNRESTRICTED
000907337 981__ $$aI:(DE-Juel1)IAS-6-20130828