000863863 001__ 863863
000863863 005__ 20220930130215.0
000863863 0247_ $$2doi$$a10.5194/gmd-12-3001-2019
000863863 0247_ $$2ISSN$$a1991-959X
000863863 0247_ $$2ISSN$$a1991-9603
000863863 0247_ $$2Handle$$a2128/22500
000863863 0247_ $$2WOS$$aWOS:000475466700004
000863863 0247_ $$2altmetric$$aaltmetric:63824760
000863863 037__ $$aFZJ-2019-03834
000863863 082__ $$a550
000863863 1001_ $$0P:(DE-Juel1)132190$$aMemon, Shahbaz$$b0$$eCorresponding author$$ufzj
000863863 245__ $$aScientific workflows applied to the coupling of a continuum (Elmer v8.3) and a discrete element (HiDEM v1.0) ice dynamic model
000863863 260__ $$aKatlenburg-Lindau$$bCopernicus$$c2019
000863863 3367_ $$2DRIVER$$aarticle
000863863 3367_ $$2DataCite$$aOutput Types/Journal article
000863863 3367_ $$0PUB:(DE-HGF)16$$2PUB:(DE-HGF)$$aJournal Article$$bjournal$$mjournal$$s1579529916_30829
000863863 3367_ $$2BibTeX$$aARTICLE
000863863 3367_ $$2ORCID$$aJOURNAL_ARTICLE
000863863 3367_ $$00$$2EndNote$$aJournal Article
000863863 520__ $$aScientific computing applications involving complex simulations and data-intensive processing are often composed of multiple tasks forming a workflow of computing jobs. Scientific communities running such applications on computing resources often find it cumbersome to manage and monitor the execution of these tasks and their associated data. Such ad hoc workflow implementations usually add overhead by introducing unnecessary input/output (I/O) for coupling the models and can lead to sub-optimal CPU utilization. Furthermore, running these workflow implementations in different environments requires significant adaptation effort, which can hinder the reproducibility of the underlying science. High-level scientific workflow management systems (WMS) can be used to automate and simplify complex task structures by providing tooling for the composition and execution of workflows, even across distributed and heterogeneous computing environments. The WMS approach allows users to focus on the underlying high-level workflow and to avoid low-level pitfalls that would lead to non-optimal resource usage, while still keeping the workflow portable between different computing environments. As a case study, we apply the UNICORE workflow management system to enable the coupling of a glacier flow model and a calving model, a workflow comprising many tasks and dependencies, ranging from pre-processing and data management to repetitive executions in heterogeneous high-performance computing (HPC) resource environments. Using the UNICORE workflow management system, the composition, management, and execution of the glacier modelling workflow become easier with respect to usage, monitoring, maintenance, reusability, portability, and reproducibility in different environments and by different user groups. Finally, the workflow speeds up the runs by reducing model-coupling I/O overhead, and it optimizes CPU utilization by avoiding idle CPU cores and by running each model, in a distributed way, on the HPC cluster that best fits its characteristics.
000863863 536__ $$0G:(DE-HGF)POF3-512$$a512 - Data-Intensive Science and Federated Computing (POF3-512)$$cPOF3-512$$fPOF III$$x0
000863863 536__ $$0G:(DE-Juel1)PHD-NO-GRANT-20170405$$aPhD no Grant - Doktorand ohne besondere Förderung (PHD-NO-GRANT-20170405)$$cPHD-NO-GRANT-20170405$$x1
000863863 588__ $$aDataset connected to CrossRef
000863863 7001_ $$0P:(DE-HGF)0$$aVallot, Dorothée$$b1
000863863 7001_ $$00000-0003-3360-4401$$aZwinger, Thomas$$b2
000863863 7001_ $$0P:(DE-HGF)0$$aÅström, Jan$$b3$$eCollaboration author
000863863 7001_ $$0P:(DE-Juel1)169980$$aNeukirchen, Helmut$$b4
000863863 7001_ $$0P:(DE-Juel1)132239$$aRiedel, Morris$$b5$$ufzj
000863863 7001_ $$00000-0003-2472-5201$$aBook, Matthias$$b6
000863863 773__ $$0PERI:(DE-600)2456725-5$$a10.5194/gmd-12-3001-2019$$gVol. 12, no. 7, p. 3001 - 3015$$n7$$p3001 - 3015$$tGeoscientific model development$$v12$$x1991-9603$$y2019
000863863 8564_ $$uhttps://juser.fz-juelich.de/record/863863/files/invoice_Helmholtz-PUC-2019-62.pdf
000863863 8564_ $$uhttps://juser.fz-juelich.de/record/863863/files/gmd-12-3001-2019.pdf$$yOpenAccess
000863863 8564_ $$uhttps://juser.fz-juelich.de/record/863863/files/invoice_Helmholtz-PUC-2019-62.pdf?subformat=pdfa$$xpdfa
000863863 8564_ $$uhttps://juser.fz-juelich.de/record/863863/files/gmd-12-3001-2019.pdf?subformat=pdfa$$xpdfa$$yOpenAccess
000863863 8767_ $$8Helmholtz-PUC-2019-62$$92019-10-01$$d2019-10-07$$eAPC$$jPayment completed$$pgmd-2018-158
000863863 909CO $$ooai:juser.fz-juelich.de:863863$$popenCost$$pVDB$$pdriver$$pOpenAPC$$popen_access$$popenaire$$pdnbdelivery
000863863 9101_ $$0I:(DE-588b)5008462-8$$6P:(DE-Juel1)132190$$aForschungszentrum Jülich$$b0$$kFZJ
000863863 9101_ $$0I:(DE-588b)5008462-8$$6P:(DE-Juel1)132239$$aForschungszentrum Jülich$$b5$$kFZJ
000863863 9131_ $$0G:(DE-HGF)POF3-512$$1G:(DE-HGF)POF3-510$$2G:(DE-HGF)POF3-500$$3G:(DE-HGF)POF3$$4G:(DE-HGF)POF$$aDE-HGF$$bKey Technologies$$lSupercomputing & Big Data$$vData-Intensive Science and Federated Computing$$x0
000863863 9141_ $$y2019
000863863 915__ $$0StatID:(DE-HGF)0200$$2StatID$$aDBCoverage$$bSCOPUS
000863863 915__ $$0LIC:(DE-HGF)CCBY4$$2HGFVOC$$aCreative Commons Attribution CC BY 4.0
000863863 915__ $$0StatID:(DE-HGF)0600$$2StatID$$aDBCoverage$$bEbsco Academic Search
000863863 915__ $$0StatID:(DE-HGF)0100$$2StatID$$aJCR$$bGEOSCI MODEL DEV : 2017
000863863 915__ $$0StatID:(DE-HGF)0501$$2StatID$$aDBCoverage$$bDOAJ Seal
000863863 915__ $$0StatID:(DE-HGF)0500$$2StatID$$aDBCoverage$$bDOAJ
000863863 915__ $$0StatID:(DE-HGF)0111$$2StatID$$aWoS$$bScience Citation Index Expanded
000863863 915__ $$0StatID:(DE-HGF)0150$$2StatID$$aDBCoverage$$bWeb of Science Core Collection
000863863 915__ $$0StatID:(DE-HGF)9900$$2StatID$$aIF < 5
000863863 915__ $$0StatID:(DE-HGF)0510$$2StatID$$aOpenAccess
000863863 915__ $$0StatID:(DE-HGF)0030$$2StatID$$aPeer Review$$bASC
000863863 915__ $$0StatID:(DE-HGF)1150$$2StatID$$aDBCoverage$$bCurrent Contents - Physical, Chemical and Earth Sciences
000863863 915__ $$0StatID:(DE-HGF)0199$$2StatID$$aDBCoverage$$bClarivate Analytics Master Journal List
000863863 920__ $$lyes
000863863 9201_ $$0I:(DE-Juel1)JSC-20090406$$kJSC$$lJülich Supercomputing Centre$$x0
000863863 980__ $$ajournal
000863863 980__ $$aVDB
000863863 980__ $$aI:(DE-Juel1)JSC-20090406
000863863 980__ $$aAPC
000863863 980__ $$aUNRESTRICTED
000863863 9801_ $$aAPC
000863863 9801_ $$aFullTexts