000004749 001__ 4749
000004749 005__ 20180208194919.0
000004749 0247_ $$2DOI$$a10.1007/s10586-009-0102-2
000004749 0247_ $$2WOS$$aWOS:000271723100002
000004749 037__ $$aPreJuSER-4749
000004749 041__ $$aeng
000004749 082__ $$a004
000004749 084__ $$2WoS$$aComputer Science, Information Systems
000004749 084__ $$2WoS$$aComputer Science, Theory & Methods
000004749 1001_ $$0P:(DE-Juel1)132239$$aRiedel, M.$$b0$$uFZJ
000004749 245__ $$aResearch Advances by using Interoperable e-Science Infrastructures - The Infrastructure Interoperability Reference Model applied in e-Science
000004749 260__ $$aDordrecht [u.a.]$$bSpringer Science + Business Media B.V$$c2009
000004749 300__ $$a357 - 372
000004749 3367_ $$0PUB:(DE-HGF)16$$2PUB:(DE-HGF)$$aJournal Article
000004749 3367_ $$2DataCite$$aOutput Types/Journal article
000004749 3367_ $$00$$2EndNote$$aJournal Article
000004749 3367_ $$2BibTeX$$aARTICLE
000004749 3367_ $$2ORCID$$aJOURNAL_ARTICLE
000004749 3367_ $$2DRIVER$$aarticle
000004749 440_0 $$021481$$aCluster Computing : The Journal of Networks, Software Tools and Applications$$v12$$x1386-7857$$y4
000004749 500__ $$aThe work presented here is related to the efforts of many talented computer scientists and e-scientists who are funded by various public funding organizations, which we thank and acknowledge, because this work would not have been possible without their continuous and sustainable support. We also want to express our gratitude to the members of the OGF Grid Interoperation Now community group and the OGF Production Grid Infrastructure working group. The work and discussions in these groups significantly supported the progress and adoption of the IIRM in many different ways. We also especially thank the e-scientists of the WISDOM team, who include J. Salzemann, A. Da Costa, V. Bloch, V. Breton, M. Hofmann-Apitius, and, most notably, V. Kasam. In the context of the scientific use case of Grid-enabled neurosurgical imaging we are deeply thankful to P. Coveney, S. Manos, and S. Zasada. This work is partly funded via the European project DEISA-II, which is funded by the European Commission in FP7 under grant agreement RI-222919. We also thank the application support team in Jülich for supporting our efforts in demonstrating the IIRM use cases at Supercomputing 2007 and 2008. Our final thanks go to the Forschungszentrum Jülich of the Helmholtz Association in general and the Jülich Supercomputing Centre in particular.
000004749 520__ $$aComputational simulations, and thus scientific computing, are the third pillar alongside theory and experiment in today's science. The term e-science evolved as a new research field that focuses on collaboration in key areas of science using next-generation computing infrastructures (i.e. so-called e-science infrastructures) to extend the potential of scientific computing. During the past years, significant international and broader interdisciplinary research has increasingly been carried out by global collaborations that often share a single e-science infrastructure. More recently, the increasing complexity of e-science applications that embrace multiple physical models (i.e. multi-physics) and consider a larger range of scales (i.e. multi-scale) is creating a steadily growing demand for world-wide interoperable infrastructures that allow for new innovative types of e-science by jointly using different kinds of e-science infrastructures. But interoperable infrastructures are still not seamlessly provided today, and we argue that this is due to the absence of a realistically implementable infrastructure reference model. Therefore, the fundamental goal of this paper is to provide insights into our proposed infrastructure reference model, which represents a trimmed-down version of OGSA in terms of functionality and complexity, while on the other hand being more specific and thus easier to implement. The proposed reference model is underpinned with experiences gained from e-science applications that achieve research advances by using interoperable e-science infrastructures.
000004749 536__ $$0G:(DE-Juel1)FUEK411$$2G:(DE-HGF)$$aScientific Computing$$cP41$$x0
000004749 536__ $$0G:(EU-Grant)222919$$aDEISA2 - Distributed European Infrastructure for Supercomputing Applications 2 (222919)$$c222919$$fFP7-INFRASTRUCTURES-2007-2$$x1
000004749 588__ $$aDataset connected to Web of Science
000004749 650_7 $$2WoSType$$aJ
000004749 65320 $$2Author$$ae-Science Infrastructures
000004749 65320 $$2Author$$aHPC
000004749 65320 $$2Author$$aHTC
000004749 65320 $$2Author$$aInteroperability
000004749 65320 $$2Author$$aReference Model
000004749 65320 $$2Author$$ae-Health
000004749 7001_ $$0P:(DE-Juel1)VDB1927$$aWolf, F.$$b1$$uFZJ
000004749 7001_ $$0P:(DE-HGF)0$$aKranzlmüller, D.$$b2
000004749 7001_ $$0P:(DE-Juel1)VDB52599$$aStreit, A.$$b3$$uFZJ
000004749 7001_ $$0P:(DE-Juel1)132179$$aLippert, T.$$b4$$uFZJ
000004749 773__ $$0PERI:(DE-600)2012757-1$$a10.1007/s10586-009-0102-2$$gVol. 12, p. 357 - 372$$p357 - 372$$q12<357 - 372$$tCluster computing$$v12$$x1386-7857$$y2009
000004749 8567_ $$uhttp://dx.doi.org/10.1007/s10586-009-0102-2
000004749 909CO $$ooai:juser.fz-juelich.de:4749$$pec_fundedresources$$pVDB$$popenaire
000004749 9131_ $$0G:(DE-Juel1)FUEK411$$aDE-HGF$$bSchlüsseltechnologien$$kP41$$lSupercomputing$$vScientific Computing$$x0
000004749 9141_ $$y2009
000004749 915__ $$0StatID:(DE-HGF)0010$$aJCR/ISI refereed
000004749 9201_ $$0I:(DE-Juel1)JSC-20090406$$gJSC$$kJSC$$lJülich Supercomputing Centre$$x0
000004749 9201_ $$0I:(DE-82)080012_20140620$$gJARA$$kJARA-HPC$$lJülich Aachen Research Alliance - High-Performance Computing$$x1
000004749 970__ $$aVDB:(DE-Juel1)111953
000004749 980__ $$aVDB
000004749 980__ $$aConvertedRecord
000004749 980__ $$ajournal
000004749 980__ $$aI:(DE-Juel1)JSC-20090406
000004749 980__ $$aI:(DE-82)080012_20140620
000004749 980__ $$aUNRESTRICTED
000004749 981__ $$aI:(DE-Juel1)VDB1346