001     187139
005     20250314084112.0
024 7 _ |a 10.5194/gmd-7-2531-2014
|2 doi
024 7 _ |a 1991-959X
|2 ISSN
024 7 _ |a 1991-9603
|2 ISSN
024 7 _ |a 2128/8288
|2 Handle
024 7 _ |a WOS:000344730900041
|2 WOS
037 _ _ |a FZJ-2015-00813
082 _ _ |a 910
100 1 _ |a Gasper, F.
|0 P:(DE-Juel1)159138
|b 0
|e Corresponding Author
|u fzj
245 _ _ |a Implementation and scaling of the fully coupled Terrestrial Systems Modeling Platform (TerrSysMP v1.0) in a massively parallel supercomputing environment – a case study on JUQUEEN (IBM Blue Gene/Q)
260 _ _ |a Katlenburg-Lindau
|c 2014
|b Copernicus
336 7 _ |a article
|2 DRIVER
336 7 _ |a Output Types/Journal article
|2 DataCite
336 7 _ |a Journal Article
|b journal
|m journal
|0 PUB:(DE-HGF)16
|s 1662624876_18409
|2 PUB:(DE-HGF)
336 7 _ |a ARTICLE
|2 BibTeX
336 7 _ |a JOURNAL_ARTICLE
|2 ORCID
336 7 _ |a Journal Article
|0 0
|2 EndNote
520 _ _ |a Continental-scale hyper-resolution simulations constitute a grand challenge in characterizing nonlinear feedbacks of states and fluxes of the coupled water, energy, and biogeochemical cycles of terrestrial systems. Tackling this challenge requires advanced coupling and supercomputing technologies for earth system models that are discussed in this study, utilizing the example of the implementation of the newly developed Terrestrial Systems Modeling Platform (TerrSysMP v1.0) on JUQUEEN (IBM Blue Gene/Q) of the Jülich Supercomputing Centre, Germany. The applied coupling strategies rely on the Multiple Program Multiple Data (MPMD) paradigm using the OASIS suite of external couplers, and require memory and load balancing considerations in the exchange of the coupling fields between different component models and the allocation of computational resources, respectively. Using the advanced profiling and tracing tool Scalasca to determine an optimum load balancing leads to a 19% speedup. In massively parallel supercomputer environments, the coupler OASIS-MCT is recommended, which resolves memory limitations that may be significant in the case of very large computational domains and exchange fields as they occur in these specific test cases and in many applications in terrestrial research. However, model I/O and initialization in the petascale range still require major attention, as they constitute true big data challenges in light of future exascale computing resources. Based on a factor-two speedup due to compiler optimizations, a refactored coupling interface using OASIS-MCT, and an optimum load balancing, the problem size in a weak scaling study can be increased by a factor of 64 from 512 to 32 768 processes while maintaining parallel efficiencies above 80% for the component models.
536 _ _ |a 246 - Modelling and Monitoring Terrestrial Systems: Methods and Technologies (POF2-246)
|0 G:(DE-HGF)POF2-246
|c POF2-246
|f POF II
|x 0
536 _ _ |a 255 - Terrestrial Systems: From Observation to Prediction (POF3-255)
|0 G:(DE-HGF)POF3-255
|c POF3-255
|f POF III
|x 1
536 _ _ |a Scalable Performance Analysis of Large-Scale Parallel Applications (jzam11_20091101)
|0 G:(DE-Juel1)jzam11_20091101
|c jzam11_20091101
|f Scalable Performance Analysis of Large-Scale Parallel Applications
|x 2
536 _ _ |0 G:(DE-Juel-1)ATMLPP
|a ATMLPP - ATML Parallel Performance (ATMLPP)
|c ATMLPP
|x 3
588 _ _ |a Dataset connected to CrossRef, juser.fz-juelich.de
700 1 _ |a Goergen, K.
|0 P:(DE-HGF)0
|b 1
700 1 _ |a Shrestha, P.
|0 P:(DE-HGF)0
|b 2
700 1 _ |a Sulis, M.
|0 P:(DE-HGF)0
|b 3
700 1 _ |a Rihani, J.
|0 P:(DE-HGF)0
|b 4
700 1 _ |a Geimer, M.
|0 P:(DE-Juel1)132112
|b 5
|u fzj
700 1 _ |a Kollet, S.
|0 P:(DE-Juel1)151405
|b 6
|u fzj
773 _ _ |a 10.5194/gmd-7-2531-2014
|g Vol. 7, no. 5, p. 2531 - 2543
|0 PERI:(DE-600)2456725-5
|n 5
|p 2531 - 2543
|t Geoscientific model development
|v 7
|y 2014
|x 1991-9603
856 4 _ |u https://juser.fz-juelich.de/record/187139/files/FZJ-2015-00813.pdf
|y OpenAccess
856 4 _ |u https://juser.fz-juelich.de/record/187139/files/FZJ-2015-00813.jpg?subformat=icon-144
|x icon-144
|y OpenAccess
856 4 _ |u https://juser.fz-juelich.de/record/187139/files/FZJ-2015-00813.jpg?subformat=icon-180
|x icon-180
|y OpenAccess
856 4 _ |u https://juser.fz-juelich.de/record/187139/files/FZJ-2015-00813.jpg?subformat=icon-640
|x icon-640
|y OpenAccess
909 C O |o oai:juser.fz-juelich.de:187139
|p openaire
|p open_access
|p driver
|p VDB:Earth_Environment
|p VDB
|p dnbdelivery
910 1 _ |a Forschungszentrum Jülich GmbH
|0 I:(DE-588b)5008462-8
|k FZJ
|b 0
|6 P:(DE-Juel1)159138
910 1 _ |a Forschungszentrum Jülich GmbH
|0 I:(DE-588b)5008462-8
|k FZJ
|b 5
|6 P:(DE-Juel1)132112
910 1 _ |a Forschungszentrum Jülich GmbH
|0 I:(DE-588b)5008462-8
|k FZJ
|b 6
|6 P:(DE-Juel1)151405
913 1 _ |a DE-HGF
|b Erde und Umwelt
|l Terrestrische Umwelt
|1 G:(DE-HGF)POF2-240
|0 G:(DE-HGF)POF2-246
|3 G:(DE-HGF)POF2
|2 G:(DE-HGF)POF2-200
|4 G:(DE-HGF)POF
|v Modelling and Monitoring Terrestrial Systems: Methods and Technologies
|x 0
913 1 _ |a DE-HGF
|b Erde und Umwelt
|l Terrestrische Umwelt
|1 G:(DE-HGF)POF3-250
|0 G:(DE-HGF)POF3-255
|3 G:(DE-HGF)POF3
|2 G:(DE-HGF)POF3-200
|4 G:(DE-HGF)POF
|v Terrestrial Systems: From Observation to Prediction
|x 1
913 2 _ |a DE-HGF
|b Erde und Umwelt
|l Terrestrische Umwelt
|1 G:(DE-HGF)POF3-250
|0 G:(DE-HGF)POF3-255
|2 G:(DE-HGF)POF3-200
|v Terrestrial Systems: From Observation to Prediction
|x 0
914 1 _ |y 2014
915 _ _ |a Creative Commons Attribution CC BY 3.0
|0 LIC:(DE-HGF)CCBY3
|2 HGFVOC
915 _ _ |a DBCoverage
|0 StatID:(DE-HGF)0150
|2 StatID
|b Web of Science Core Collection
915 _ _ |a JCR
|0 StatID:(DE-HGF)0100
|2 StatID
915 _ _ |a IF >= 5
|0 StatID:(DE-HGF)9905
|2 StatID
915 _ _ |a DBCoverage
|0 StatID:(DE-HGF)0500
|2 StatID
|b DOAJ
915 _ _ |a WoS
|0 StatID:(DE-HGF)0111
|2 StatID
|b Science Citation Index Expanded
915 _ _ |a OpenAccess
|0 StatID:(DE-HGF)0510
|2 StatID
915 _ _ |a DBCoverage
|0 StatID:(DE-HGF)1150
|2 StatID
|b Current Contents - Physical, Chemical and Earth Sciences
915 _ _ |a DBCoverage
|0 StatID:(DE-HGF)0199
|2 StatID
|b Thomson Reuters Master Journal List
920 1 _ |0 I:(DE-Juel1)IBG-3-20101118
|k IBG-3
|l Agrosphäre
|x 0
920 1 _ |0 I:(DE-Juel1)NIC-20090406
|k NIC
|l John von Neumann - Institut für Computing
|x 1
920 1 _ |0 I:(DE-82)080012_20140620
|k JARA-HPC
|l JARA - HPC
|x 2
980 _ _ |a journal
980 _ _ |a VDB
980 _ _ |a I:(DE-Juel1)IBG-3-20101118
980 _ _ |a I:(DE-Juel1)NIC-20090406
980 _ _ |a I:(DE-82)080012_20140620
980 _ _ |a UNRESTRICTED
980 1 _ |a FullTexts