Conference Presentation (Plenary/Keynote) FZJ-2024-05144

Large-scale network models as digital twins advance theory and neuromorphic computing



2024

Pucon Learning and AI Summit, PLENA, Pucon, Chile, 8 Apr 2024 - 12 Apr 2024

Abstract:
Large-scale brain models as digital twins advance theory and neuromorphic computing
Markus Diesmann

Computational neuroscience is entering a new era. This originates from the convergence of two developments. First, biological knowledge has expanded, enabling the construction of anatomically detailed models of one or multiple brain areas. These models are formulated at the resolution of individual nerve cells (neurons), represent the respective part of the brain with its natural number of neurons, and are multi-scale: next to the spiking activity of neurons, mesoscopic signals such as the local field potential (LFP) and fMRI signals can also be generated (e.g. [1]). Second, with the completion of the European Human Brain Project (HBP), simulation has firmly established itself in neuroscience as a third pillar alongside experiment and theory. A conceptual separation has been achieved between concrete network models and generic simulation engines [2,3]. Many different models can be simulated with the same engine, so that these codes can be continuously optimized [4] and operated as an infrastructure. Network models with millions of neurons can routinely be investigated (e.g. [4]). Neuroscientists can now work with digital twins of certain brain structures to test their ideas on brain function and to probe the validity of approximations required for analytical approaches.

However, the efficient use of this capability also requires a change in mindset. Computational neuroscience has seemed stuck at a certain level of model complexity for the last decade, and not only because anatomical data were missing or simulation technology was lacking. The field's fascination with minimal models yields explanations for individual mechanisms, but the reduction to the bare minimum of equations gives researchers few contact points to build on these works and construct larger systems with a wider explanatory scope. In addition, creating large-scale models takes longer than an individual PhD project, and an exclusive focus on hypothesis-driven research may prevent such sustained constructive work. Possibly, researchers are also simply missing the digital workflows needed to reuse large-scale models and extend them reproducibly. The change of perspective required is to view digital twins as research platforms and scientific software as infrastructure.

As a concrete example, the presentation discusses how the universality of mammalian brain structures motivates the construction of large-scale models and demonstrates how digital workflows help to reproduce results and increase confidence in such models. A digital twin promotes neuroscientific investigations, but it can also serve as a benchmark for technology. The energy consumption of present AI systems is unsustainable and undemocratic. Understanding the energy efficiency of the brain may uncover pathways out of this dilemma. The talk shows how a model of the cortical microcircuit has become a de facto standard for neuromorphic computing [5] and has sparked a constructive race in the community for ever greater computation speed and lower energy consumption.

References:
[1] Senk J, Hagen E, van Albada SJ, Diesmann M (2018) Reconciliation of weak pairwise spike-train correlations and highly coherent local field potentials across space. arXiv:1805.10235 [q-bio.NC]
[2] Einevoll GT, Destexhe A, Diesmann M, Grün S, Jirsa V, de Kamps M, Migliore M, Ness TV, Plesser HE, Schürmann F (2019) The Scientific Case for Brain Simulations. Neuron 102:735-744
[3] Senk J, Kriener B, Djurfeldt M, Voges N, Jiang HJ, Schüttler L, Gramelsberger G, Diesmann M, Plesser HE, van Albada SJ (2022) Connectivity concepts in neuronal network modeling. PLOS Comput Biol 18(9):e1010086
[4] Tiddia G, Golosio B, Albers J, Senk J, Simula F, Pronold J, Fanti V, Pastorelli E, Paolucci PS, van Albada SJ (2022) Fast Simulation of a Multi-Area Spiking Network Model of Macaque Cortex on an MPI-GPU Cluster. Front Neuroinform 16:883333
[5] Kurth AC, Senk J, Terhorst D, Finnerty J, Diesmann M (2022) Sub-realtime simulation of a neuronal network of natural density. Neuromorphic Computing and Engineering 2:021001

Keywords: simulation as third pillar, software as infrastructure, universality of cortex, cellular-resolution cortical microcircuit, multi-area model, neuromorphic computing
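The abstract's point about separating concrete network models from generic simulation engines can be illustrated with a minimal sketch: a network description (neuron counts, random connectivity, synaptic weights) is handed to an engine-style update loop of leaky integrate-and-fire neurons. This is not code from the cited works; the function name `simulate_lif_network`, all parameter values (membrane constants, the 10% connection probability, the external drive), and the simple Euler scheme are illustrative assumptions chosen for brevity.

```python
import numpy as np


def simulate_lif_network(n_exc=80, n_inh=20, t_sim=200.0, dt=0.1, seed=42):
    """Toy network of leaky integrate-and-fire neurons (illustrative only).

    The "model" part is the connectivity matrix and parameters; the
    "engine" part is the generic time-stepping loop below.
    """
    rng = np.random.default_rng(seed)
    n = n_exc + n_inh
    # Membrane parameters (illustrative values, in ms and mV)
    tau_m, v_rest, v_th, v_reset = 20.0, 0.0, 15.0, 0.0
    # Random sparse connectivity: 10% connection probability,
    # inhibitory neurons (last n_inh columns) project negative weights
    mask = rng.random((n, n)) < 0.1
    weights = np.where(mask, 1.5, 0.0)
    weights[:, n_exc:] *= -4.0
    np.fill_diagonal(weights, 0.0)  # no self-connections

    v = rng.uniform(v_rest, v_th, size=n)  # random initial potentials
    i_ext = 18.0  # constant suprathreshold drive (mV)
    syn_input = np.zeros(n)
    spikes = []  # list of (time_ms, neuron_id) pairs
    for step in range(int(t_sim / dt)):
        # Forward-Euler update of the membrane potential
        v += dt / tau_m * (-(v - v_rest) + i_ext) + syn_input
        fired = v >= v_th
        v[fired] = v_reset
        # Spikes arriving at the next time step (one-step delay)
        syn_input = weights[:, fired].sum(axis=1)
        spikes.extend((step * dt, int(i)) for i in np.flatnonzero(fired))
    return spikes


if __name__ == "__main__":
    spikes = simulate_lif_network()
    print(f"{len(spikes)} spikes from {len({i for _, i in spikes})} neurons")
```

Because the update loop never inspects anything but the weight matrix and state vectors, the same "engine" runs any network expressible in this description, which is the separation of concerns the abstract attributes to codes like those in [2,3], here reduced to a few lines.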


Contributing Institute(s):
  1. Computational and Systems Neuroscience (IAS-6)
  2. JARA-Institut Brain structure-function relationships (INM-10)
Research Program(s):
  1. 5232 - Computational Principles (POF4-523) (POF4-523)
  2. 5234 - Emerging NC Architectures (POF4-523) (POF4-523)
  3. 5235 - Digitization of Neuroscience and User-Community Building (POF4-523) (POF4-523)
  4. ACA - Advanced Computing Architectures (SO-092) (SO-092)
  5. EBRAINS 2.0 - EBRAINS 2.0: A Research Infrastructure to Advance Neuroscience and Brain Health (101147319) (101147319)
  6. HBP - The Human Brain Project (604102) (604102)
  7. Helmholtz Platform for Research Software Engineering - Preparatory Study (HiRSE_PS-20220812) (HiRSE_PS-20220812)
  8. Brain-Scale Simulations (jinb33_20220812) (jinb33_20220812)
  9. BMBF 03ZU1106CB - NeuroSys: Algorithm-Hardware Co-Design (Projekt C) - B (BMBF-03ZU1106CB) (BMBF-03ZU1106CB)
  10. JL SMHB - Joint Lab Supercomputing and Modeling for the Human Brain (JL SMHB-2021-2027) (JL SMHB-2021-2027)

Appears in the scientific report 2024

The record appears in these collections:
Document types > Presentations > Conference Presentations
Institute Collections > INM > INM-10
Institute Collections > IAS > IAS-6
Workflow collections > Public records
Publications database

 Record created 2024-08-02, last modified 2024-11-08


