001     4749
005     20180208194919.0
024 7 _ |2 DOI
|a 10.1007/s10586-009-0102-2
024 7 _ |2 WOS
|a WOS:000271723100002
037 _ _ |a PreJuSER-4749
041 _ _ |a eng
082 _ _ |a 004
084 _ _ |2 WoS
|a Computer Science, Information Systems
084 _ _ |2 WoS
|a Computer Science, Theory & Methods
100 1 _ |a Riedel, M.
|b 0
|u FZJ
|0 P:(DE-Juel1)132239
245 _ _ |a Research Advances by using Interoperable e-Science Infrastructures - The Infrastructure Interoperability Reference Model applied in e-Science
260 _ _ |a Dordrecht [u.a.]
|b Springer Science + Business Media B.V
|c 2009
300 _ _ |a 357 - 372
336 7 _ |a Journal Article
|0 PUB:(DE-HGF)16
|2 PUB:(DE-HGF)
336 7 _ |a Output Types/Journal article
|2 DataCite
336 7 _ |a Journal Article
|0 0
|2 EndNote
336 7 _ |a ARTICLE
|2 BibTeX
336 7 _ |a JOURNAL_ARTICLE
|2 ORCID
336 7 _ |a article
|2 DRIVER
440 _ 0 |a Cluster Computing : The Journal of Networks, Software Tools and Applications
|x 1386-7857
|0 21481
|y 4
|v 12
500 _ _ |a The work presented here is related to the efforts of many talented computer scientists and e-scientists who are funded by various public funding organizations, whom we thank and acknowledge, because this work would not have been possible without their continuous and sustained support. We also want to express our gratitude to the members of the OGF Grid Interoperation Now community group and the OGF Production Grid Infrastructure working group. The work and discussions in these groups significantly supported the progress and adoption of the IIRM in many different ways. We also especially thank the e-scientists of the WISDOM team, including J. Salzemann, A. Da Costa, V. Bloch, V. Breton, M. Hofmann-Apitius, and, most notably, V. Kasam. In the context of the scientific use case of Grid-enabled neurosurgical imaging, we are deeply thankful to P. Coveney, S. Manos, and S. Zasada. This work is partly funded via the European project DEISA-II, which is funded by the European Commission in FP7 under grant agreement RI-222919. We also thank the application support team in Jülich for supporting our efforts in demonstrating the IIRM use cases at the Supercomputing 2007 and 2008 conferences. Our final thanks go to the Forschungszentrum Jülich of the Helmholtz Association in general and the Jülich Supercomputing Centre in particular.
520 _ _ |a Computational simulations, and thus scientific computing, constitute the third pillar alongside theory and experiment in today's science. The term e-science evolved to describe a new research field that focuses on collaboration in key areas of science, using next-generation computing infrastructures (i.e. so-called e-science infrastructures) to extend the potential of scientific computing. During the past years, significant international and broadly interdisciplinary research has increasingly been carried out by global collaborations that often share a single e-science infrastructure. More recently, the increasing complexity of e-science applications that embrace multiple physical models (i.e. multi-physics) and consider a larger range of scales (i.e. multi-scale) is creating a steadily growing demand for world-wide interoperable infrastructures that allow for new, innovative types of e-science by jointly using different kinds of e-science infrastructures. Interoperable infrastructures are, however, still not seamlessly provided today, and we argue that this is due to the absence of a realistically implementable infrastructure reference model. The fundamental goal of this paper is therefore to provide insights into our proposed infrastructure reference model, which represents a trimmed-down version of OGSA in terms of functionality and complexity, while on the other hand being more specific and thus easier to implement. The proposed reference model is underpinned with experiences gained from e-science applications that achieve research advances by using interoperable e-science infrastructures.
536 _ _ |a Scientific Computing
|0 G:(DE-Juel1)FUEK411
|c P41
|2 G:(DE-HGF)
|x 0
536 _ _ |a DEISA2 - Distributed European Infrastructure for Supercomputing Applications 2 (222919)
|0 G:(EU-Grant)222919
|c 222919
|x 1
|f FP7-INFRASTRUCTURES-2007-2
588 _ _ |a Dataset connected to Web of Science
650 _ 7 |a J
|2 WoSType
653 2 0 |2 Author
|a e-Science Infrastructures
653 2 0 |2 Author
|a HPC
653 2 0 |2 Author
|a HTC
653 2 0 |2 Author
|a Interoperability
653 2 0 |2 Author
|a Reference Model
653 2 0 |2 Author
|a e-Health
700 1 _ |a Wolf, F.
|b 1
|u FZJ
|0 P:(DE-Juel1)VDB1927
700 1 _ |a Kranzlmüller, D.
|b 2
|0 P:(DE-HGF)0
700 1 _ |a Streit, A.
|b 3
|u FZJ
|0 P:(DE-Juel1)VDB52599
700 1 _ |a Lippert, T.
|b 4
|u FZJ
|0 P:(DE-Juel1)132179
773 _ _ |a 10.1007/s10586-009-0102-2
|g Vol. 12, p. 357 - 372
|p 357 - 372
|q 12<357 - 372
|0 PERI:(DE-600)2012757-1
|t Cluster computing
|v 12
|y 2009
|x 1386-7857
856 7 _ |u http://dx.doi.org/10.1007/s10586-009-0102-2
909 C O |o oai:juser.fz-juelich.de:4749
|p openaire
|p VDB
|p ec_fundedresources
913 1 _ |a DE-HGF
|b Schlüsseltechnologien
|k P41
|l Supercomputing
|0 G:(DE-Juel1)FUEK411
|v Scientific Computing
|x 0
914 1 _ |y 2009
915 _ _ |0 StatID:(DE-HGF)0010
|a JCR/ISI refereed
920 1 _ |0 I:(DE-Juel1)JSC-20090406
|k JSC
|l Jülich Supercomputing Centre
|g JSC
|x 0
920 1 _ |0 I:(DE-82)080012_20140620
|k JARA-HPC
|l Jülich Aachen Research Alliance - High-Performance Computing
|g JARA
|x 1
970 _ _ |a VDB:(DE-Juel1)111953
980 _ _ |a VDB
980 _ _ |a ConvertedRecord
980 _ _ |a journal
980 _ _ |a I:(DE-Juel1)JSC-20090406
980 _ _ |a I:(DE-82)080012_20140620
980 _ _ |a UNRESTRICTED
981 _ _ |a I:(DE-Juel1)VDB1346

