001     857588
005     20210129235607.0
020 _ _ |a 978-1-5386-7150-4
024 7 _ |a 10.1109/IGARSS.2018.8519364
|2 doi
024 7 _ |a 2128/20139
|2 Handle
037 _ _ |a FZJ-2018-06573
100 1 _ |a Memon, Mohammad Shahbaz
|0 P:(DE-Juel1)132190
|b 0
|e Corresponding author
|u fzj
111 2 _ |a 2018 IEEE International Geoscience and Remote Sensing Symposium
|g IGARSS 2018
|c Valencia
|d 2018-07-22 - 2018-07-27
|w Spain
245 _ _ |a Automated Analysis of Remotely Sensed Images Using the UNICORE Workflow Management System
260 _ _ |c 2018
|b IEEE
300 _ _ |a 1128 - 1131
336 7 _ |a CONFERENCE_PAPER
|2 ORCID
336 7 _ |a Conference Paper
|0 33
|2 EndNote
336 7 _ |a INPROCEEDINGS
|2 BibTeX
336 7 _ |a conferenceObject
|2 DRIVER
336 7 _ |a Output Types/Conference Paper
|2 DataCite
336 7 _ |a Contribution to a conference proceedings
|b contrib
|m contrib
|0 PUB:(DE-HGF)8
|s 1542797823_6274
|2 PUB:(DE-HGF)
336 7 _ |a Contribution to a book
|0 PUB:(DE-HGF)7
|2 PUB:(DE-HGF)
|m contb
520 _ _ |a The progress of remote sensing technologies leads to an increased supply of high-resolution image data. However, solutions for processing large volumes of data are lagging behind: desktop computers can no longer cope with the requirements of macro-scale remote sensing applications; therefore, parallel methods running in High-Performance Computing (HPC) environments are essential. Managing an HPC processing pipeline is non-trivial for a scientist, especially when the computing environment is heterogeneous and the set of tasks has complex dependencies. This paper proposes an end-to-end scientific workflow approach based on the UNICORE workflow management system for automating the full chain of Support Vector Machine (SVM)-based classification of remotely sensed images. The high-level nature of UNICORE workflows makes it possible to deal with the heterogeneity of HPC computing environments and offers powerful workflow operations, such as those needed for parameter sweeps. As a result, the remote sensing workflow for SVM-based classification becomes re-usable across different computing environments, thus increasing usability and reducing the effort required of a scientist.
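As a minimal sketch of the approach summarised in the abstract, the following Python snippet shows how an SVM hyper-parameter sweep might be submitted as individual UNICORE jobs through the PyUNICORE REST client. The site URL, credentials, classifier script name (svm_classify.py) and resource requests are illustrative placeholders, not values from the paper, which itself uses UNICORE's declarative workflow system rather than a client-side loop like this one.

    import pyunicore.client as uc_client
    import pyunicore.credentials as uc_credentials

    # Placeholder UNICORE/X site and credentials (assumptions, not from the paper)
    BASE_URL = "https://hpc.example.org:8080/SITE/rest/core"
    credential = uc_credentials.UsernamePassword("demouser", "secret")
    client = uc_client.Client(credential, BASE_URL)

    # Parameter sweep over SVM hyper-parameters: one UNICORE job per combination
    jobs = []
    for c in (1, 10, 100):                 # SVM regularisation parameter C
        for gamma in (0.01, 0.1, 1.0):     # RBF kernel width
            job_description = {
                "Executable": "python",
                "Arguments": ["svm_classify.py",   # hypothetical classifier script
                              "--C", str(c), "--gamma", str(gamma),
                              "--image", "scene.tif"],
                "Resources": {"Nodes": "1", "Runtime": "1h"},
            }
            jobs.append(client.new_job(job_description))

    # Wait for each job to finish and report its final status and outputs
    for job in jobs:
        job.poll()
        print(job.properties["status"], sorted(job.working_dir.listdir()))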
536 _ _ |a 512 - Data-Intensive Science and Federated Computing (POF3-512)
|0 G:(DE-HGF)POF3-512
|c POF3-512
|f POF III
|x 0
536 _ _ |0 G:(DE-Juel1)PHD-NO-GRANT-20170405
|x 1
|c PHD-NO-GRANT-20170405
|a PhD no Grant - Doktorand ohne besondere Förderung (PHD-NO-GRANT-20170405)
588 _ _ |a Dataset connected to CrossRef Conference
700 1 _ |a Cavallaro, Gabriele
|0 P:(DE-Juel1)171343
|b 1
|u fzj
700 1 _ |a Hagemeier, Björn
|0 P:(DE-Juel1)132123
|b 2
|u fzj
700 1 _ |a Riedel, Morris
|0 P:(DE-Juel1)132239
|b 3
|u fzj
700 1 _ |a Neukirchen, Helmut
|0 P:(DE-HGF)0
|b 4
773 _ _ |a 10.1109/IGARSS.2018.8519364
856 4 _ |y OpenAccess
|u https://juser.fz-juelich.de/record/857588/files/igarss-2018-urc.pdf
856 4 _ |y OpenAccess
|x pdfa
|u https://juser.fz-juelich.de/record/857588/files/igarss-2018-urc.pdf?subformat=pdfa
909 C O |o oai:juser.fz-juelich.de:857588
|p openaire
|p open_access
|p VDB
|p driver
|p dnbdelivery
910 1 _ |a Forschungszentrum Jülich
|0 I:(DE-588b)5008462-8
|k FZJ
|b 0
|6 P:(DE-Juel1)132190
910 1 _ |a Forschungszentrum Jülich
|0 I:(DE-588b)5008462-8
|k FZJ
|b 1
|6 P:(DE-Juel1)171343
910 1 _ |a Forschungszentrum Jülich
|0 I:(DE-588b)5008462-8
|k FZJ
|b 2
|6 P:(DE-Juel1)132123
910 1 _ |a Forschungszentrum Jülich
|0 I:(DE-588b)5008462-8
|k FZJ
|b 3
|6 P:(DE-Juel1)132239
913 1 _ |a DE-HGF
|b Key Technologies
|1 G:(DE-HGF)POF3-510
|0 G:(DE-HGF)POF3-512
|2 G:(DE-HGF)POF3-500
|v Data-Intensive Science and Federated Computing
|x 0
|4 G:(DE-HGF)POF
|3 G:(DE-HGF)POF3
|l Supercomputing & Big Data
914 1 _ |y 2018
915 _ _ |a OpenAccess
|0 StatID:(DE-HGF)0510
|2 StatID
920 1 _ |0 I:(DE-Juel1)JSC-20090406
|k JSC
|l Jülich Supercomputing Centre
|x 0
980 _ _ |a contrib
980 _ _ |a VDB
980 _ _ |a UNRESTRICTED
980 _ _ |a contb
980 _ _ |a I:(DE-Juel1)JSC-20090406
980 1 _ |a FullTexts

