001035267 001__ 1035267
001035267 005__ 20250203103430.0
001035267 0247_ $$2doi$$a10.5194/egusphere-egu24-18331
001035267 0247_ $$2datacite_doi$$a10.34734/FZJ-2025-00343
001035267 037__ $$aFZJ-2025-00343
001035267 1001_ $$0P:(DE-Juel1)180790$$aLangguth, Michael$$b0$$eCorresponding author
001035267 1112_ $$aEuropean Geosciences Union General Assembly 2024$$cVienna$$d2024-04-14 - 2024-04-19$$gEGU 2024$$wAustria
001035267 245__ $$aDownscaling with the foundation model AtmoRep
001035267 260__ $$c2024
001035267 3367_ $$033$$2EndNote$$aConference Paper
001035267 3367_ $$2BibTeX$$aINPROCEEDINGS
001035267 3367_ $$2DRIVER$$aconferenceObject
001035267 3367_ $$2ORCID$$aCONFERENCE_POSTER
001035267 3367_ $$2DataCite$$aOutput Types/Conference Poster
001035267 3367_ $$0PUB:(DE-HGF)24$$2PUB:(DE-HGF)$$aPoster$$bposter$$mposter$$s1736488169_12429$$xAfter Call
001035267 520__ $$aIn recent years, deep neural networks (DNNs) for enhancing the resolution of meteorological data, a task known as statistical downscaling, have surpassed previously developed classical statistical methods with respect to several validation metrics. The prevailing approach is to train deep learning models for downscaling in an end-to-end manner. However, foundation models trained on very large datasets in a self-supervised way have been shown to provide new state-of-the-art results for various applications in natural language processing and computer vision. To investigate the benefit of foundation models in Earth science applications, we deploy AtmoRep (Lessig et al., 2023), a large-scale representation model for atmospheric dynamics, for statistical downscaling of the 2 m temperature over Central Europe. AtmoRep has been trained on almost 40 years of ERA5 data from 1979 to 2017 and has shown promising skill in several intrinsic and downstream applications. By extending AtmoRep’s encoder-decoder with a tail network for downscaling, we super-resolve the coarse-grained 2 m temperature field from ERA5 data (Δx = 25 km) to attain the high spatial resolution (Δx = 6 km) of the COSMO-REA6 dataset. Different coupling approaches between the core and tail network (e.g. with and without fine-tuning the core model) are tested and analyzed in terms of accuracy and computational efficiency. Preliminary results show that downscaling with a task-specific extension of the foundation model AtmoRep can improve the downscaled product in terms of standard evaluation metrics such as the RMSE compared to a purely task-specific deep learning model. However, deficiencies in the spatial variability of the downscaled product are also revealed, highlighting the need for future work to focus especially on target data that exhibit a high degree of spatial variability and intrinsic uncertainty, such as precipitation.
001035267 536__ $$0G:(DE-HGF)POF4-5111$$a5111 - Domain-Specific Simulation & Data Life Cycle Labs (SDLs) and Research Groups (POF4-511)$$cPOF4-511$$fPOF IV$$x0
001035267 536__ $$0G:(EU-Grant)955513$$aMAELSTROM - MAchinE Learning for Scalable meTeoROlogy and cliMate (955513)$$c955513$$fH2020-JTI-EuroHPC-2019-1$$x1
001035267 536__ $$0G:(BMBF)16HPC029$$aVerbundprojekt: MAELSTROM - Skalierbarkeit von Anwendungen des Maschinellen Lernens in den Bereichen Wetter und Klimawissenschaften für das zukünftige Supercomputing (16HPC029)$$c16HPC029$$x2
001035267 536__ $$0G:(DE-Juel-1)ESDE$$aEarth System Data Exploration (ESDE)$$cESDE$$x3
001035267 588__ $$aDataset connected to CrossRef
001035267 7001_ $$0P:(DE-HGF)0$$aLessig, Christian$$b1
001035267 7001_ $$0P:(DE-Juel1)6952$$aSchultz, Martin$$b2
001035267 7001_ $$0P:(DE-HGF)0$$aLuise, Ilaria$$b3
001035267 773__ $$a10.5194/egusphere-egu24-18331
001035267 8564_ $$uhttps://juser.fz-juelich.de/record/1035267/files/Downscaling_with%20_AtmoRep_Langguth.pdf$$yOpenAccess
001035267 909CO $$ooai:juser.fz-juelich.de:1035267$$pec_fundedresources$$pdriver$$pVDB$$popen_access$$popenaire
001035267 9101_ $$0I:(DE-588b)5008462-8$$6P:(DE-Juel1)180790$$aForschungszentrum Jülich$$b0$$kFZJ
001035267 9101_ $$0I:(DE-588b)5008462-8$$6P:(DE-Juel1)6952$$aForschungszentrum Jülich$$b2$$kFZJ
001035267 9131_ $$0G:(DE-HGF)POF4-511$$1G:(DE-HGF)POF4-510$$2G:(DE-HGF)POF4-500$$3G:(DE-HGF)POF4$$4G:(DE-HGF)POF$$9G:(DE-HGF)POF4-5111$$aDE-HGF$$bKey Technologies$$lEngineering Digital Futures – Supercomputing, Data Management and Information Security for Knowledge and Action$$vEnabling Computational- & Data-Intensive Science and Engineering$$x0
001035267 9141_ $$y2024
001035267 915__ $$0StatID:(DE-HGF)0510$$2StatID$$aOpenAccess
001035267 920__ $$lyes
001035267 9201_ $$0I:(DE-Juel1)JSC-20090406$$kJSC$$lJülich Supercomputing Centre$$x0
001035267 980__ $$aposter
001035267 980__ $$aVDB
001035267 980__ $$aUNRESTRICTED
001035267 980__ $$aI:(DE-Juel1)JSC-20090406
001035267 9801_ $$aFullTexts