000866218 001__ 866218 000866218 005__ 20210130003323.0 000866218 0247_ $$2Handle$$a2128/23250 000866218 037__ $$aFZJ-2019-05385 000866218 1001_ $$0P:(DE-Juel1)161462$$aYegenoglu, Alper$$b0$$eCorresponding author$$ufzj 000866218 1112_ $$aSociety for Neuroscience Meeting 2019$$cChicago$$d2019-10-19 - 2019-10-23$$wUSA 000866218 245__ $$aLearning to Learn on High Performance Computing 000866218 260__ $$c2019 000866218 3367_ $$033$$2EndNote$$aConference Paper 000866218 3367_ $$2BibTeX$$aINPROCEEDINGS 000866218 3367_ $$2DRIVER$$aconferenceObject 000866218 3367_ $$2ORCID$$aCONFERENCE_POSTER 000866218 3367_ $$2DataCite$$aOutput Types/Conference Poster 000866218 3367_ $$0PUB:(DE-HGF)24$$2PUB:(DE-HGF)$$aPoster$$bposter$$mposter$$s1595507729_6840$$xAfter Call 000866218 520__ $$aThe simulation of biological neural networks (BNNs) is essential to neuroscience. The complexity of the brain's structure and activity, combined with the practical limits of in-vivo measurements, has led to the development of computational models that allow us to decompose, analyze and understand its elements and their interactions. Impressive progress has recently been made in non-spiking but brain-like learning capabilities in artificial neural networks (ANNs) [1, 3]. A substantial part of this progress arises from computing-intensive learning-to-learn (L2L) [2, 4, 5] or meta-learning methods. L2L is an algorithm for acquiring constraints that improve learning performance. L2L can be decomposed into an optimizee program (such as a Kalman filter), which learns specific tasks, and an optimizer algorithm, which searches for generalized hyperparameters for the optimizee. The optimizer learns to improve the optimizee's performance over distinct tasks as measured by a fitness function (Fig. 1). We have developed an implementation of L2L on High Performance Computing (HPC) [6] for hyperparameter optimization of spiking BNNs as well as hyperparameter search for general neuroscientific analytics. 
This tool exploits large-scale parallelization by deploying an ensemble of optimizees to understand and analyze mathematical models of BNNs. Improved performance for structural plasticity has been found in NEST simulations comparing several optimization techniques, including gradient descent, cross-entropy, and evolutionary strategies. 000866218 536__ $$0G:(DE-HGF)POF3-511$$a511 - Computational Science and Mathematical Methods (POF3-511)$$cPOF3-511$$fPOF III$$x0 000866218 536__ $$0G:(EU-Grant)785907$$aHBP SGA2 - Human Brain Project Specific Grant Agreement 2 (785907)$$c785907$$fH2020-SGA-FETFLAG-HBP-2017$$x1 000866218 536__ $$0G:(DE-Juel1)HGF-SMHB-2013-2017$$aSMHB - Supercomputing and Modelling for the Human Brain (HGF-SMHB-2013-2017)$$cHGF-SMHB-2013-2017$$fSMHB$$x2 000866218 536__ $$0G:(DE-Juel1)CSD-SSD-20190612$$aCSD-SSD - Center for Simulation and Data Science (CSD) - School for Simulation and Data Science (SSD) (CSD-SSD-20190612)$$cCSD-SSD-20190612$$x3 000866218 536__ $$0G:(DE-Juel1)Helmholtz-SLNS$$aSLNS - SimLab Neuroscience (Helmholtz-SLNS)$$cHelmholtz-SLNS$$x4 000866218 536__ $$0G:(DE-Juel1)PHD-NO-GRANT-20170405$$aPhD no Grant - Doktorand ohne besondere Förderung (PHD-NO-GRANT-20170405)$$cPHD-NO-GRANT-20170405$$x5 000866218 7001_ $$0P:(DE-Juel1)165859$$aDiaz, Sandra$$b1$$ufzj 000866218 7001_ $$0P:(DE-Juel1)168169$$aKlijn, Wouter$$b2$$ufzj 000866218 7001_ $$0P:(DE-Juel1)161525$$aPeyser, Alexander$$b3$$ufzj 000866218 7001_ $$0P:(DE-HGF)0$$aSubramoney, Anand$$b4 000866218 7001_ $$0P:(DE-HGF)0$$aMaass, Wolfgang$$b5 000866218 7001_ $$0P:(DE-HGF)0$$aVisconti, Giuseppe$$b6 000866218 7001_ $$0P:(DE-HGF)0$$aHerty, Michael$$b7 000866218 8564_ $$uhttps://juser.fz-juelich.de/record/866218/files/L2LSfN2019_1.pdf$$yOpenAccess 000866218 8564_ $$uhttps://juser.fz-juelich.de/record/866218/files/L2LSfN2019_2.pdf$$yOpenAccess 000866218 8564_ $$uhttps://juser.fz-juelich.de/record/866218/files/L2LSfN2019_1.pdf?subformat=pdfa$$xpdfa$$yOpenAccess 000866218 909CO 
$$ooai:juser.fz-juelich.de:866218$$pec_fundedresources$$pdriver$$pVDB$$popen_access$$popenaire 000866218 9101_ $$0I:(DE-588b)5008462-8$$6P:(DE-Juel1)161462$$aForschungszentrum Jülich$$b0$$kFZJ 000866218 9101_ $$0I:(DE-588b)5008462-8$$6P:(DE-Juel1)165859$$aForschungszentrum Jülich$$b1$$kFZJ 000866218 9101_ $$0I:(DE-588b)5008462-8$$6P:(DE-Juel1)168169$$aForschungszentrum Jülich$$b2$$kFZJ 000866218 9101_ $$0I:(DE-588b)5008462-8$$6P:(DE-Juel1)161525$$aForschungszentrum Jülich$$b3$$kFZJ 000866218 9131_ $$0G:(DE-HGF)POF3-511$$1G:(DE-HGF)POF3-510$$2G:(DE-HGF)POF3-500$$3G:(DE-HGF)POF3$$4G:(DE-HGF)POF$$aDE-HGF$$bKey Technologies$$lSupercomputing & Big Data$$vComputational Science and Mathematical Methods$$x0 000866218 9141_ $$y2019 000866218 915__ $$0StatID:(DE-HGF)0510$$2StatID$$aOpenAccess 000866218 920__ $$lyes 000866218 9201_ $$0I:(DE-Juel1)JSC-20090406$$kJSC$$lJülich Supercomputing Center$$x0 000866218 980__ $$aposter 000866218 980__ $$aVDB 000866218 980__ $$aI:(DE-Juel1)JSC-20090406 000866218 980__ $$aUNRESTRICTED 000866218 9801_ $$aFullTexts