001     1017949
005     20240403082757.0
024 7 _ |a 10.1109/IGARSS52108.2023.10283284
|2 doi
024 7 _ |a 10.34734/FZJ-2023-04454
|2 datacite_doi
024 7 _ |a WOS:001098971605148
|2 WOS
037 _ _ |a FZJ-2023-04454
100 1 _ |a Sedona, Rocco
|0 P:(DE-Juel1)178695
|b 0
|e Corresponding author
|u fzj
111 2 _ |a IEEE International Geoscience and Remote Sensing Symposium (IGARSS)
|c Pasadena
|d 2023-07-16 - 2023-07-21
|w CA
245 _ _ |a Enhancing Training Set Through Multi-Temporal Attention Analysis in Transformers for Multi-Year Land Cover Mapping
260 _ _ |c 2023
|b IEEE
300 _ _ |a 5411-5414
336 7 _ |a CONFERENCE_PAPER
|2 ORCID
336 7 _ |a Conference Paper
|0 33
|2 EndNote
336 7 _ |a INPROCEEDINGS
|2 BibTeX
336 7 _ |a conferenceObject
|2 DRIVER
336 7 _ |a Output Types/Conference Paper
|2 DataCite
336 7 _ |a Contribution to a conference proceedings
|b contrib
|m contrib
|0 PUB:(DE-HGF)8
|s 1704984848_2197
|2 PUB:(DE-HGF)
520 _ _ |a The continuous stream of high-spatial-resolution satellite data offers the opportunity to regularly produce land cover (LC) maps. To this end, Transformer deep learning (DL) models have recently proven effective at accurately classifying long time series (TS) of satellite images. The continual generation of regularly updated LC maps can be used to analyze dynamic phenomena and extract multi-temporal information. However, several challenges need to be addressed. This paper studies how the performance of a Transformer model changes when it classifies TS of satellite images acquired in years after those covered by the training set. In particular, the behavior of the attention mechanism in the Transformer model is analyzed to determine when the information provided by the initial training set needs to be updated in order to keep generating accurate LC products. Preliminary results show that: (i) the choice of positional encoding strategy in the Transformer has a significant impact on the classification accuracy obtained with multi-year TS, and (ii) the most affected classes are the seasonal ones.
536 _ _ |a 5111 - Domain-Specific Simulation & Data Life Cycle Labs (SDLs) and Research Groups (POF4-511)
|0 G:(DE-HGF)POF4-5111
|c POF4-511
|f POF IV
|x 0
536 _ _ |a RAISE - Research on AI- and Simulation-Based Engineering at Exascale (951733)
|0 G:(EU-Grant)951733
|c 951733
|f H2020-INFRAEDI-2019-1
|x 1
536 _ _ |a EUROCC-2 (DEA02266)
|0 G:(DE-Juel-1)DEA02266
|c DEA02266
|x 2
588 _ _ |a Dataset connected to CrossRef Conference
700 1 _ |a Ebert, Jan
|0 P:(DE-Juel1)187002
|b 1
|u fzj
700 1 _ |a Paris, Claudia
|0 P:(DE-HGF)0
|b 2
700 1 _ |a Riedel, Morris
|0 P:(DE-Juel1)132239
|b 3
|u fzj
700 1 _ |a Cavallaro, Gabriele
|0 P:(DE-Juel1)171343
|b 4
|u fzj
773 _ _ |a 10.1109/IGARSS52108.2023.10283284
856 4 _ |y OpenAccess
|u https://juser.fz-juelich.de/record/1017949/files/Enhancing_Training_Set_through_Multi_temporal_Attention_Analysis_in_Transformers_for_multi_year_Land_Cover_Mapping.pdf
856 4 _ |y OpenAccess
|x icon
|u https://juser.fz-juelich.de/record/1017949/files/Enhancing_Training_Set_through_Multi_temporal_Attention_Analysis_in_Transformers_for_multi_year_Land_Cover_Mapping.gif?subformat=icon
856 4 _ |y OpenAccess
|x icon-1440
|u https://juser.fz-juelich.de/record/1017949/files/Enhancing_Training_Set_through_Multi_temporal_Attention_Analysis_in_Transformers_for_multi_year_Land_Cover_Mapping.jpg?subformat=icon-1440
856 4 _ |y OpenAccess
|x icon-180
|u https://juser.fz-juelich.de/record/1017949/files/Enhancing_Training_Set_through_Multi_temporal_Attention_Analysis_in_Transformers_for_multi_year_Land_Cover_Mapping.jpg?subformat=icon-180
856 4 _ |y OpenAccess
|x icon-640
|u https://juser.fz-juelich.de/record/1017949/files/Enhancing_Training_Set_through_Multi_temporal_Attention_Analysis_in_Transformers_for_multi_year_Land_Cover_Mapping.jpg?subformat=icon-640
909 C O |o oai:juser.fz-juelich.de:1017949
|p openaire
|p open_access
|p driver
|p VDB
|p ec_fundedresources
|p dnbdelivery
910 1 _ |a Forschungszentrum Jülich
|0 I:(DE-588b)5008462-8
|k FZJ
|b 0
|6 P:(DE-Juel1)178695
910 1 _ |a Forschungszentrum Jülich
|0 I:(DE-588b)5008462-8
|k FZJ
|b 1
|6 P:(DE-Juel1)187002
910 1 _ |a Forschungszentrum Jülich
|0 I:(DE-588b)5008462-8
|k FZJ
|b 3
|6 P:(DE-Juel1)132239
910 1 _ |a Forschungszentrum Jülich
|0 I:(DE-588b)5008462-8
|k FZJ
|b 4
|6 P:(DE-Juel1)171343
913 1 _ |a DE-HGF
|b Key Technologies
|l Engineering Digital Futures – Supercomputing, Data Management and Information Security for Knowledge and Action
|1 G:(DE-HGF)POF4-510
|0 G:(DE-HGF)POF4-511
|3 G:(DE-HGF)POF4
|2 G:(DE-HGF)POF4-500
|4 G:(DE-HGF)POF
|v Enabling Computational- & Data-Intensive Science and Engineering
|9 G:(DE-HGF)POF4-5111
|x 0
914 1 _ |y 2023
915 _ _ |a OpenAccess
|0 StatID:(DE-HGF)0510
|2 StatID
920 1 _ |0 I:(DE-Juel1)JSC-20090406
|k JSC
|l Jülich Supercomputing Center
|x 0
980 _ _ |a contrib
980 _ _ |a VDB
980 _ _ |a UNRESTRICTED
980 _ _ |a I:(DE-Juel1)JSC-20090406
980 1 _ |a FullTexts

