001041779 001__ 1041779
001041779 005__ 20250610131452.0
001041779 0247_ $$2doi$$a10.3389/frsen.2025.1555887
001041779 0247_ $$2datacite_doi$$a10.34734/FZJ-2025-02419
001041779 0247_ $$2WOS$$aWOS:001488134000001
001041779 037__ $$aFZJ-2025-02419
001041779 082__ $$a600
001041779 1001_ $$0P:(DE-Juel1)186635$$aPatnala, Ankit$$b0$$eCorresponding author$$ufzj
001041779 245__ $$aBERT Bi-modal self-supervised learning for crop classification using Sentinel-2 and Planetscope
001041779 260__ $$aLausanne$$bFrontiers Media$$c2025
001041779 3367_ $$2DRIVER$$aarticle
001041779 3367_ $$2DataCite$$aOutput Types/Journal article
001041779 3367_ $$0PUB:(DE-HGF)16$$2PUB:(DE-HGF)$$aJournal Article$$bjournal$$mjournal$$s1747026523_12850
001041779 3367_ $$2BibTeX$$aARTICLE
001041779 3367_ $$2ORCID$$aJOURNAL_ARTICLE
001041779 3367_ $$00$$2EndNote$$aJournal Article
001041779 520__ $$aCrop identification and monitoring of crop dynamics are essential for agricultural planning, environmental monitoring, and ensuring food security. Recent advancements in remote sensing technology and state-of-the-art machine learning have enabled large-scale automated crop classification. However, these methods rely on labeled training data, which requires skilled human annotators or extensive field campaigns, making the process expensive and time-consuming. Self-supervised learning techniques have demonstrated promising results in leveraging large unlabeled datasets across domains. Yet, self-supervised representation learning for crop classification from remote sensing time series remains under-explored due to challenges in curating suitable pretext tasks. While bimodal self-supervised approaches combining data from Sentinel-2 and Planetscope sensors have facilitated pre-training, existing methods primarily exploit the distinct spectral properties of these complementary data sources. In this work, we propose novel self-supervised pre-training strategies inspired by BERT that leverage both the spectral and temporal resolution of Sentinel-2 and Planetscope imagery. We carry out extensive experiments comparing our approach to existing baseline setups across nine test cases, in which our method outperforms the baselines in eight instances. This pre-training thus offers an effective representation of crops for tasks such as crop classification.
001041779 536__ $$0G:(DE-HGF)POF4-5111$$a5111 - Domain-Specific Simulation & Data Life Cycle Labs (SDLs) and Research Groups (POF4-511)$$cPOF4-511$$fPOF IV$$x0
001041779 536__ $$0G:(DE-Juel-1)ESDE$$aEarth System Data Exploration (ESDE)$$cESDE$$x1
001041779 536__ $$0G:(DE-Juel1)kiste_20200501$$aAI Strategy for Earth system data (kiste_20200501)$$ckiste_20200501$$fAI Strategy for Earth system data$$x2
001041779 588__ $$aDataset connected to CrossRef, Journals: juser.fz-juelich.de
001041779 7001_ $$0P:(DE-Juel1)6952$$aSchultz, Martin$$b1$$ufzj
001041779 7001_ $$0P:(DE-HGF)0$$aGall, Juergen$$b2
001041779 773__ $$0PERI:(DE-600)3091289-1$$a10.3389/frsen.2025.1555887$$gVol. 6, p. 1555887$$p1555887$$tFrontiers in remote sensing$$v6$$x2673-6187$$y2025
001041779 8564_ $$uhttps://juser.fz-juelich.de/record/1041779/files/frsen-1-1555887.pdf$$yOpenAccess
001041779 8767_ $$d2025-05-12$$eAPC$$jDeposit$$zCHF 2167,5
001041779 909CO $$ooai:juser.fz-juelich.de:1041779$$pdnbdelivery$$popenCost$$pVDB$$pdriver$$pOpenAPC$$popen_access$$popenaire
001041779 9101_ $$0I:(DE-588b)5008462-8$$6P:(DE-Juel1)186635$$aForschungszentrum Jülich$$b0$$kFZJ
001041779 9101_ $$0I:(DE-588b)5008462-8$$6P:(DE-Juel1)6952$$aForschungszentrum Jülich$$b1$$kFZJ
001041779 9131_ $$0G:(DE-HGF)POF4-511$$1G:(DE-HGF)POF4-510$$2G:(DE-HGF)POF4-500$$3G:(DE-HGF)POF4$$4G:(DE-HGF)POF$$9G:(DE-HGF)POF4-5111$$aDE-HGF$$bKey Technologies$$lEngineering Digital Futures – Supercomputing, Data Management and Information Security for Knowledge and Action$$vEnabling Computational- & Data-Intensive Science and Engineering$$x0
001041779 9141_ $$y2025
001041779 915__ $$0StatID:(DE-HGF)0200$$2StatID$$aDBCoverage$$bSCOPUS$$d2024-12-13
001041779 915__ $$0LIC:(DE-HGF)CCBY4$$2HGFVOC$$aCreative Commons Attribution CC BY 4.0
001041779 915__ $$0StatID:(DE-HGF)0501$$2StatID$$aDBCoverage$$bDOAJ Seal$$d2024-01-17T19:06:03Z
001041779 915__ $$0StatID:(DE-HGF)0112$$2StatID$$aWoS$$bEmerging Sources Citation Index$$d2024-12-13
001041779 915__ $$0StatID:(DE-HGF)0500$$2StatID$$aDBCoverage$$bDOAJ$$d2024-01-17T19:06:03Z
001041779 915__ $$0StatID:(DE-HGF)0150$$2StatID$$aDBCoverage$$bWeb of Science Core Collection$$d2024-12-13
001041779 915__ $$0StatID:(DE-HGF)0510$$2StatID$$aOpenAccess
001041779 915__ $$0StatID:(DE-HGF)0030$$2StatID$$aPeer Review$$bDOAJ : Anonymous peer review$$d2024-01-17T19:06:03Z
001041779 915__ $$0StatID:(DE-HGF)0561$$2StatID$$aArticle Processing Charges$$d2024-12-13
001041779 915__ $$0StatID:(DE-HGF)0700$$2StatID$$aFees$$d2024-12-13
001041779 915__ $$0StatID:(DE-HGF)0199$$2StatID$$aDBCoverage$$bClarivate Analytics Master Journal List$$d2024-12-13
001041779 915pc $$0PC:(DE-HGF)0000$$2APC$$aAPC keys set
001041779 915pc $$0PC:(DE-HGF)0001$$2APC$$aLocal Funding
001041779 915pc $$0PC:(DE-HGF)0002$$2APC$$aDFG OA Publikationskosten
001041779 915pc $$0PC:(DE-HGF)0003$$2APC$$aDOAJ Journal
001041779 920__ $$lyes
001041779 9201_ $$0I:(DE-Juel1)JSC-20090406$$kJSC$$lJülich Supercomputing Centre$$x0
001041779 9801_ $$aFullTexts
001041779 980__ $$ajournal
001041779 980__ $$aVDB
001041779 980__ $$aUNRESTRICTED
001041779 980__ $$aI:(DE-Juel1)JSC-20090406
001041779 980__ $$aAPC