| 001 | 171937 | ||
| 005 | 20250314084111.0 | ||
| 024 | 7 | _ | |2 doi |a 10.5194/gmdd-7-3545-2014 |
| 024 | 7 | _ | |2 ISSN |a 1991-9611 |
| 024 | 7 | _ | |2 ISSN |a 1991-962X |
| 024 | 7 | _ | |2 Handle |a 2128/8046 |
| 037 | _ | _ | |a FZJ-2014-05491 |
| 082 | _ | _ | |a 910 |
| 100 | 1 | _ | |0 P:(DE-Juel1)159138 |a Gasper, F. |b 0 |e Corresponding Author |u fzj |
| 245 | _ | _ | |a Implementation and scaling of the fully coupled Terrestrial Systems Modeling Platform (TerrSysMP) in a massively parallel supercomputing environment – a case study on JUQUEEN (IBM Blue Gene/Q) |
| 260 | _ | _ | |a Katlenburg-Lindau |b Copernicus |c 2014 |
| 336 | 7 | _ | |0 PUB:(DE-HGF)16 |2 PUB:(DE-HGF) |a Journal Article |b journal |m journal |s 1417088234_3097 |
| 336 | 7 | _ | |2 DataCite |a Output Types/Journal article |
| 336 | 7 | _ | |0 0 |2 EndNote |a Journal Article |
| 336 | 7 | _ | |2 BibTeX |a ARTICLE |
| 336 | 7 | _ | |2 ORCID |a JOURNAL_ARTICLE |
| 336 | 7 | _ | |2 DRIVER |a article |
| 520 | _ | _ | |a Continental-scale hyper-resolution simulations constitute a grand challenge in characterizing non-linear feedbacks of states and fluxes of the coupled water, energy, and biogeochemical cycles of terrestrial systems. Tackling this challenge requires advanced coupling and supercomputing technologies for earth system models, which are discussed in this study using the example of the implementation of the newly developed Terrestrial Systems Modeling Platform (TerrSysMP) on JUQUEEN (IBM Blue Gene/Q) of the Jülich Supercomputing Centre, Germany. The applied coupling strategies rely on the Multiple Program Multiple Data (MPMD) paradigm and require careful consideration of memory in the exchange of coupling fields between the different component models and of load balancing in the allocation of computational resources. These considerations can be addressed with advanced profiling and tracing tools, enabling the efficient use of massively parallel computing environments, which is then mainly determined by the parallel performance of the individual component models. However, model I/O and initialization at the peta-scale require major attention, because they constitute a true big data challenge that remains unsolved in view of future exa-scale capabilities. |
| 536 | _ | _ | |0 G:(DE-HGF)POF2-246 |a 246 - Modelling and Monitoring Terrestrial Systems: Methods and Technologies (POF2-246) |c POF2-246 |f POF II |x 0 |
| 536 | _ | _ | |0 G:(DE-HGF)POF3-255 |a 255 - Terrestrial Systems: From Observation to Prediction (POF3-255) |c POF3-255 |f POF III |x 1 |
| 536 | _ | _ | |0 G:(DE-HGF)POF2-411 |a 411 - Computational Science and Mathematical Methods (POF2-411) |c POF2-411 |f POF II |x 2 |
| 536 | _ | _ | |0 G:(DE-Juel-1)ATMLPP |a ATMLPP - ATML Parallel Performance (ATMLPP) |c ATMLPP |x 3 |
| 588 | _ | _ | |a Dataset connected to CrossRef, juser.fz-juelich.de |
| 700 | 1 | _ | |0 P:(DE-HGF)0 |a Goergen, K. |b 1 |
| 700 | 1 | _ | |0 P:(DE-Juel1)151405 |a Kollet, S. |b 2 |u fzj |
| 700 | 1 | _ | |0 P:(DE-HGF)0 |a Shrestha, P. |b 3 |
| 700 | 1 | _ | |0 P:(DE-HGF)0 |a Sulis, M. |b 4 |
| 700 | 1 | _ | |0 P:(DE-HGF)0 |a Rihani, J. |b 5 |
| 700 | 1 | _ | |0 P:(DE-Juel1)132112 |a Geimer, M. |b 6 |u fzj |
| 773 | _ | _ | |0 PERI:(DE-600)2456729-2 |a 10.5194/gmdd-7-3545-2014 |g Vol. 7, no. 3, p. 3545 - 3573 |n 3 |p 3545 - 3573 |t Geoscientific model development discussions |v 7 |x 1991-962X |y 2014 |
| 856 | 4 | _ | |u https://juser.fz-juelich.de/record/171937/files/FZJ-2014-05491.pdf |y OpenAccess |
| 909 | C | O | |o oai:juser.fz-juelich.de:171937 |p openaire |p open_access |p OpenAPC |p driver |p VDB |p openCost |p dnbdelivery |
| 910 | 1 | _ | |0 I:(DE-588b)5008462-8 |6 P:(DE-Juel1)159138 |a Forschungszentrum Jülich GmbH |b 0 |k FZJ |
| 910 | 1 | _ | |0 I:(DE-588b)5008462-8 |6 P:(DE-Juel1)151405 |a Forschungszentrum Jülich GmbH |b 2 |k FZJ |
| 910 | 1 | _ | |0 I:(DE-588b)5008462-8 |6 P:(DE-Juel1)132112 |a Forschungszentrum Jülich GmbH |b 6 |k FZJ |
| 913 | 2 | _ | |0 G:(DE-HGF)POF3-255 |1 G:(DE-HGF)POF3-250 |2 G:(DE-HGF)POF3-200 |a DE-HGF |b POF III |l Marine, Küsten- und Polare Systeme |v Terrestrische Umwelt |x 0 |
| 913 | 2 | _ | |0 G:(DE-HGF)POF3-511 |1 G:(DE-HGF)POF3-510 |2 G:(DE-HGF)POF3-500 |a DE-HGF |b POF III |l Key Technologies |v Supercomputing & Big Data |x 1 |
| 913 | 1 | _ | |0 G:(DE-HGF)POF2-246 |1 G:(DE-HGF)POF2-240 |2 G:(DE-HGF)POF2-200 |a DE-HGF |b Erde und Umwelt |l Terrestrische Umwelt |v Modelling and Monitoring Terrestrial Systems: Methods and Technologies |x 0 |4 G:(DE-HGF)POF |3 G:(DE-HGF)POF2 |
| 913 | 1 | _ | |0 G:(DE-HGF)POF3-255 |1 G:(DE-HGF)POF3-250 |2 G:(DE-HGF)POF3-200 |a DE-HGF |b Erde und Umwelt |l Terrestrische Umwelt |v Terrestrial Systems: From Observation to Prediction |x 1 |4 G:(DE-HGF)POF |3 G:(DE-HGF)POF3 |9 G:(DE-HGF)POF3-255 |
| 913 | 1 | _ | |0 G:(DE-HGF)POF2-411 |1 G:(DE-HGF)POF2-410 |2 G:(DE-HGF)POF2-400 |a DE-HGF |b Schlüsseltechnologien |l Supercomputing |v Computational Science and Mathematical Methods |x 2 |4 G:(DE-HGF)POF |3 G:(DE-HGF)POF2 |
| 914 | 1 | _ | |y 2014 |
| 915 | _ | _ | |0 LIC:(DE-HGF)CCBY3 |2 HGFVOC |a Creative Commons Attribution CC BY 3.0 |
| 915 | _ | _ | |0 StatID:(DE-HGF)0510 |2 StatID |a OpenAccess |
| 915 | _ | _ | |0 StatID:(DE-HGF)0500 |2 StatID |a DBCoverage |b DOAJ |
| 920 | 1 | _ | |0 I:(DE-Juel1)IBG-3-20101118 |k IBG-3 |l Agrosphäre |x 0 |
| 920 | 1 | _ | |0 I:(DE-Juel1)JSC-20090406 |k JSC |l Jülich Supercomputing Centre |x 1 |
| 980 | 1 | _ | |a FullTexts |
| 980 | _ | _ | |a journal |
| 980 | _ | _ | |a VDB |
| 980 | _ | _ | |a I:(DE-Juel1)IBG-3-20101118 |
| 980 | _ | _ | |a I:(DE-Juel1)JSC-20090406 |
| 980 | _ | _ | |a UNRESTRICTED |
| 980 | _ | _ | |a FullTexts |
| 980 | _ | _ | |a APC |
| 981 | _ | _ | |a I:(DE-Juel1)JSC-20090406 |