| 001 | 1021120 | ||
| 005 | 20250401102818.0 | ||
| 024 | 7 | _ | |a 10.5194/gmd-17-261-2024 |2 doi |
| 024 | 7 | _ | |a 1991-959X |2 ISSN |
| 024 | 7 | _ | |a 1991-9603 |2 ISSN |
| 024 | 7 | _ | |a 10.34734/FZJ-2024-00574 |2 datacite_doi |
| 024 | 7 | _ | |a WOS:001166577100001 |2 WOS |
| 037 | _ | _ | |a FZJ-2024-00574 |
| 082 | _ | _ | |a 550 |
| 100 | 1 | _ | |a Bishnoi, Abhiraj |0 P:(DE-Juel1)187016 |b 0 |e Corresponding author |
| 245 | _ | _ | |a Earth system modeling on modular supercomputing architecture: coupled atmosphere–ocean simulations with ICON 2.6.6-rc |
| 260 | _ | _ | |a Katlenburg-Lindau |c 2024 |b Copernicus |
| 336 | 7 | _ | |a article |2 DRIVER |
| 336 | 7 | _ | |a Output Types/Journal article |2 DataCite |
| 336 | 7 | _ | |a Journal Article |b journal |m journal |0 PUB:(DE-HGF)16 |s 1707807350_20606 |2 PUB:(DE-HGF) |
| 336 | 7 | _ | |a ARTICLE |2 BibTeX |
| 336 | 7 | _ | |a JOURNAL_ARTICLE |2 ORCID |
| 336 | 7 | _ | |a Journal Article |0 0 |2 EndNote |
| 520 | _ | _ | |a The confrontation of complex Earth system model (ESM) codes with novel supercomputing architectures poses challenges to efficient modeling and job submission strategies. The modular setup of these models naturally fits a modular supercomputing architecture (MSA), which tightly integrates heterogeneous hardware resources into a larger and more flexible high-performance computing (HPC) system. While parts of the ESM codes can easily take advantage of the increased parallelism and communication capabilities of modern GPUs, others lag behind due to long development cycles or are better suited to run on classical CPUs because of their communication and memory usage patterns. To better cope with these imbalances between the development of the model components, we performed benchmark campaigns on the Jülich Wizard for European Leadership Science (JUWELS) modular HPC system. We enabled the weather and climate model Icosahedral Nonhydrostatic (ICON) to run in a coupled atmosphere–ocean setup, in which the ocean and the model I/O run on the CPU Cluster, while the atmosphere is simulated simultaneously on the GPUs of the JUWELS Booster (ICON-MSA). Both atmosphere and ocean run globally at a resolution of 5 km. In our test case, an optimal configuration in terms of model performance (core hours per simulation day) was found for the combination of 84 GPU nodes on the JUWELS Booster module to simulate the atmosphere and 80 CPU nodes on the JUWELS Cluster module, of which 63 nodes were used for the ocean simulation and the remaining 17 nodes were reserved for I/O. With this configuration, the waiting times of the coupler were minimized. Compared to a simulation performed on CPUs only, the MSA approach reduces energy consumption by 45 % with comparable runtimes. ICON-MSA is able to scale up to a significant portion of the JUWELS system, making best use of the available computing resources. A maximum throughput of 170 simulation days per day (SDPD) was achieved when running ICON on 335 JUWELS Booster nodes and 268 Cluster nodes. |
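The node counts and throughput figures quoted in the abstract can be sanity-checked with a few lines of arithmetic (a minimal illustrative sketch based only on the numbers stated above, not code from the paper; the function name is hypothetical):

```python
# Back-of-envelope check of the figures in the abstract (illustrative only).
MINUTES_PER_DAY = 24 * 60  # wall-clock minutes in one real day

def wallclock_minutes_per_sim_day(sdpd: float) -> float:
    """Wall-clock minutes needed to simulate one model day at a given
    throughput in simulation days per day (SDPD)."""
    return MINUTES_PER_DAY / sdpd

# Optimal-configuration node split on the JUWELS Cluster module:
# 63 ocean nodes + 17 I/O nodes = 80 CPU nodes, as stated in the abstract.
assert 63 + 17 == 80

# At the reported peak of 170 SDPD (on 335 Booster + 268 Cluster nodes),
# one simulated day takes roughly 8.5 wall-clock minutes.
print(round(wallclock_minutes_per_sim_day(170), 1))  # → 8.5
```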
| 536 | _ | _ | |a 5111 - Domain-Specific Simulation & Data Life Cycle Labs (SDLs) and Research Groups (POF4-511) |0 G:(DE-HGF)POF4-5111 |c POF4-511 |f POF IV |x 0 |
| 536 | _ | _ | |a 5122 - Future Computing & Big Data Systems (POF4-512) |0 G:(DE-HGF)POF4-5122 |c POF4-512 |f POF IV |x 1 |
| 536 | _ | _ | |a AIDAS - Joint Virtual Laboratory for AI, Data Analytics and Scalable Simulation (aidas_20200731) |0 G:(DE-Juel-1)aidas_20200731 |c aidas_20200731 |x 2 |
| 588 | _ | _ | |a Dataset connected to CrossRef, Journals: juser.fz-juelich.de |
| 700 | 1 | _ | |a Stein, Olaf |0 P:(DE-Juel1)3709 |b 1 |e Corresponding author |
| 700 | 1 | _ | |a Meyer, Catrin I. |0 P:(DE-Juel1)156465 |b 2 |
| 700 | 1 | _ | |a Redler, René |0 P:(DE-HGF)0 |b 3 |
| 700 | 1 | _ | |a Eicker, Norbert |0 P:(DE-Juel1)132090 |b 4 |u fzj |
| 700 | 1 | _ | |a Haak, Helmuth |0 P:(DE-HGF)0 |b 5 |
| 700 | 1 | _ | |a Hoffmann, Lars |0 P:(DE-Juel1)129125 |b 6 |
| 700 | 1 | _ | |a Klocke, Daniel |0 P:(DE-HGF)0 |b 7 |
| 700 | 1 | _ | |a Kornblueh, Luis |0 P:(DE-HGF)0 |b 8 |
| 700 | 1 | _ | |a Suarez, Estela |0 P:(DE-Juel1)142361 |b 9 |u fzj |
| 773 | _ | _ | |a 10.5194/gmd-17-261-2024 |g Vol. 17, no. 1, p. 261 - 273 |0 PERI:(DE-600)2456725-5 |n 1 |p 261 - 273 |t Geoscientific model development |v 17 |y 2024 |x 1991-959X |
| 856 | 4 | _ | |u https://juser.fz-juelich.de/record/1021120/files/Invoice_Helmholtz-PUC-2024-8.pdf |
| 856 | 4 | _ | |y OpenAccess |u https://juser.fz-juelich.de/record/1021120/files/FZJ-2024-00574.pdf |
| 856 | 4 | _ | |x icon |u https://juser.fz-juelich.de/record/1021120/files/Invoice_Helmholtz-PUC-2024-8.gif?subformat=icon |
| 856 | 4 | _ | |x icon-1440 |u https://juser.fz-juelich.de/record/1021120/files/Invoice_Helmholtz-PUC-2024-8.jpg?subformat=icon-1440 |
| 856 | 4 | _ | |x icon-180 |u https://juser.fz-juelich.de/record/1021120/files/Invoice_Helmholtz-PUC-2024-8.jpg?subformat=icon-180 |
| 856 | 4 | _ | |x icon-640 |u https://juser.fz-juelich.de/record/1021120/files/Invoice_Helmholtz-PUC-2024-8.jpg?subformat=icon-640 |
| 856 | 4 | _ | |y OpenAccess |x icon |u https://juser.fz-juelich.de/record/1021120/files/FZJ-2024-00574.gif?subformat=icon |
| 856 | 4 | _ | |y OpenAccess |x icon-1440 |u https://juser.fz-juelich.de/record/1021120/files/FZJ-2024-00574.jpg?subformat=icon-1440 |
| 856 | 4 | _ | |y OpenAccess |x icon-180 |u https://juser.fz-juelich.de/record/1021120/files/FZJ-2024-00574.jpg?subformat=icon-180 |
| 856 | 4 | _ | |y OpenAccess |x icon-640 |u https://juser.fz-juelich.de/record/1021120/files/FZJ-2024-00574.jpg?subformat=icon-640 |
| 909 | C | O | |o oai:juser.fz-juelich.de:1021120 |p openaire |p open_access |p OpenAPC |p driver |p VDB |p openCost |p dnbdelivery |
| 910 | 1 | _ | |a Forschungszentrum Jülich |0 I:(DE-588b)5008462-8 |k FZJ |b 1 |6 P:(DE-Juel1)3709 |
| 910 | 1 | _ | |a Forschungszentrum Jülich |0 I:(DE-588b)5008462-8 |k FZJ |b 2 |6 P:(DE-Juel1)156465 |
| 910 | 1 | _ | |a External Institute |0 I:(DE-HGF)0 |k Extern |b 3 |6 P:(DE-HGF)0 |
| 910 | 1 | _ | |a Forschungszentrum Jülich |0 I:(DE-588b)5008462-8 |k FZJ |b 4 |6 P:(DE-Juel1)132090 |
| 910 | 1 | _ | |a External Institute |0 I:(DE-HGF)0 |k Extern |b 5 |6 P:(DE-HGF)0 |
| 910 | 1 | _ | |a Forschungszentrum Jülich |0 I:(DE-588b)5008462-8 |k FZJ |b 6 |6 P:(DE-Juel1)129125 |
| 910 | 1 | _ | |a External Institute |0 I:(DE-HGF)0 |k Extern |b 7 |6 P:(DE-HGF)0 |
| 910 | 1 | _ | |a External Institute |0 I:(DE-HGF)0 |k Extern |b 8 |6 P:(DE-HGF)0 |
| 910 | 1 | _ | |a Forschungszentrum Jülich |0 I:(DE-588b)5008462-8 |k FZJ |b 9 |6 P:(DE-Juel1)142361 |
| 913 | 1 | _ | |a DE-HGF |b Key Technologies |l Engineering Digital Futures – Supercomputing, Data Management and Information Security for Knowledge and Action |1 G:(DE-HGF)POF4-510 |0 G:(DE-HGF)POF4-511 |3 G:(DE-HGF)POF4 |2 G:(DE-HGF)POF4-500 |4 G:(DE-HGF)POF |v Enabling Computational- & Data-Intensive Science and Engineering |9 G:(DE-HGF)POF4-5111 |x 0 |
| 913 | 1 | _ | |a DE-HGF |b Key Technologies |l Engineering Digital Futures – Supercomputing, Data Management and Information Security for Knowledge and Action |1 G:(DE-HGF)POF4-510 |0 G:(DE-HGF)POF4-512 |3 G:(DE-HGF)POF4 |2 G:(DE-HGF)POF4-500 |4 G:(DE-HGF)POF |v Supercomputing & Big Data Infrastructures |9 G:(DE-HGF)POF4-5122 |x 1 |
| 914 | 1 | _ | |y 2024 |
| 915 | p | c | |a APC keys set |0 PC:(DE-HGF)0000 |2 APC |
| 915 | p | c | |a Local Funding |0 PC:(DE-HGF)0001 |2 APC |
| 915 | p | c | |a DFG OA Publikationskosten |0 PC:(DE-HGF)0002 |2 APC |
| 915 | p | c | |a DOAJ Journal |0 PC:(DE-HGF)0003 |2 APC |
| 915 | _ | _ | |a DBCoverage |0 StatID:(DE-HGF)0160 |2 StatID |b Essential Science Indicators |d 2023-10-25 |
| 915 | _ | _ | |a Creative Commons Attribution CC BY 4.0 |0 LIC:(DE-HGF)CCBY4 |2 HGFVOC |
| 915 | _ | _ | |a DBCoverage |0 StatID:(DE-HGF)0501 |2 StatID |b DOAJ Seal |d 2022-12-20T09:29:04Z |
| 915 | _ | _ | |a DBCoverage |0 StatID:(DE-HGF)0500 |2 StatID |b DOAJ |d 2022-12-20T09:29:04Z |
| 915 | _ | _ | |a WoS |0 StatID:(DE-HGF)0113 |2 StatID |b Science Citation Index Expanded |d 2023-10-25 |
| 915 | _ | _ | |a Fees |0 StatID:(DE-HGF)0700 |2 StatID |d 2023-10-25 |
| 915 | _ | _ | |a OpenAccess |0 StatID:(DE-HGF)0510 |2 StatID |
| 915 | _ | _ | |a Article Processing Charges |0 StatID:(DE-HGF)0561 |2 StatID |d 2023-10-25 |
| 915 | _ | _ | |a Peer Review |0 StatID:(DE-HGF)0030 |2 StatID |b DOAJ : Open peer review |d 2022-12-20T09:29:04Z |
| 915 | _ | _ | |a DBCoverage |0 StatID:(DE-HGF)0200 |2 StatID |b SCOPUS |d 2024-12-21 |
| 915 | _ | _ | |a DBCoverage |0 StatID:(DE-HGF)0300 |2 StatID |b Medline |d 2024-12-21 |
| 915 | _ | _ | |a DBCoverage |0 StatID:(DE-HGF)0600 |2 StatID |b Ebsco Academic Search |d 2024-12-21 |
| 915 | _ | _ | |a Peer Review |0 StatID:(DE-HGF)0030 |2 StatID |b ASC |d 2024-12-21 |
| 915 | _ | _ | |a DBCoverage |0 StatID:(DE-HGF)0199 |2 StatID |b Clarivate Analytics Master Journal List |d 2024-12-21 |
| 915 | _ | _ | |a DBCoverage |0 StatID:(DE-HGF)1150 |2 StatID |b Current Contents - Physical, Chemical and Earth Sciences |d 2024-12-21 |
| 915 | _ | _ | |a DBCoverage |0 StatID:(DE-HGF)0150 |2 StatID |b Web of Science Core Collection |d 2024-12-21 |
| 920 | _ | _ | |l yes |
| 920 | 1 | _ | |0 I:(DE-Juel1)JSC-20090406 |k JSC |l Jülich Supercomputing Center |x 0 |
| 980 | _ | _ | |a journal |
| 980 | _ | _ | |a VDB |
| 980 | _ | _ | |a UNRESTRICTED |
| 980 | _ | _ | |a I:(DE-Juel1)JSC-20090406 |
| 980 | _ | _ | |a APC |
| 980 | 1 | _ | |a APC |
| 980 | 1 | _ | |a FullTexts |