001 | 1035267 | ||
005 | 20250203103430.0 | ||
024 | 7 | _ | |a 10.5194/egusphere-egu24-18331 |2 doi |
024 | 7 | _ | |a 10.34734/FZJ-2025-00343 |2 datacite_doi |
037 | _ | _ | |a FZJ-2025-00343 |
100 | 1 | _ | |a Langguth, Michael |0 P:(DE-Juel1)180790 |b 0 |e Corresponding author |
111 | 2 | _ | |a European Geosciences Union General Assembly 2024 |g EGU 2024 |c Vienna |d 2024-04-14 - 2024-04-19 |w Austria |
245 | _ | _ | |a Downscaling with the foundation model AtmoRep |
260 | _ | _ | |c 2024 |
336 | 7 | _ | |a Conference Paper |0 33 |2 EndNote |
336 | 7 | _ | |a INPROCEEDINGS |2 BibTeX |
336 | 7 | _ | |a conferenceObject |2 DRIVER |
336 | 7 | _ | |a CONFERENCE_POSTER |2 ORCID |
336 | 7 | _ | |a Output Types/Conference Poster |2 DataCite |
336 | 7 | _ | |a Poster |b poster |m poster |0 PUB:(DE-HGF)24 |s 1736488169_12429 |2 PUB:(DE-HGF) |x After Call |
520 | _ | _ | |a In recent years, deep neural networks (DNNs) for enhancing the resolution of meteorological data, a task known as statistical downscaling, have surpassed previously developed classical statistical methods with respect to several validation metrics. The prevailing approach to DNN downscaling is to train deep learning models in an end-to-end manner. However, foundation models trained on very large datasets in a self-supervised way have been shown to provide new state-of-the-art results for various applications in natural language processing and computer vision. To investigate the benefit of foundation models for Earth science applications, we deploy AtmoRep (Lessig et al., 2023), a large-scale representation model for atmospheric dynamics, for statistical downscaling of the 2 m temperature over Central Europe. AtmoRep has been trained on almost 40 years of ERA5 data (1979 to 2017) and has shown promising skill in several intrinsic and downstream applications. By extending AtmoRep's encoder-decoder with a tail network for downscaling, we super-resolve the coarse-grained 2 m temperature field of ERA5 (Δx = 25 km) to attain the high spatial resolution (Δx = 6 km) of the COSMO-REA6 dataset. Different coupling approaches between the core and the tail network (e.g. with and without fine-tuning the core model) are tested and analyzed in terms of accuracy and computational efficiency. Preliminary results show that downscaling with a task-specific extension of the foundation model AtmoRep can improve the downscaled product in terms of standard evaluation metrics such as the RMSE compared to a task-specific deep learning model. However, deficiencies in the spatial variability of the downscaled product are also revealed, highlighting the need for future work to focus especially on target data that exhibit a high degree of spatial variability and intrinsic uncertainty, such as precipitation. |
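Editor's note: the abstract above describes coupling a pretrained encoder-decoder "core" with a task-specific tail network for super-resolution. The following is a minimal, hypothetical PyTorch sketch of that coupling pattern only; it is not the authors' code, and CoreStandIn, DownscalingTail, the grid sizes, and the 4x upsampling factor are illustrative assumptions.

import torch
import torch.nn as nn

class CoreStandIn(nn.Module):
    """Placeholder for a pretrained encoder-decoder such as AtmoRep.
    Here it is only a shallow conv stack producing a latent feature map."""
    def __init__(self, channels=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, channels, 3, padding=1), nn.GELU(),
            nn.Conv2d(channels, channels, 3, padding=1), nn.GELU(),
        )
    def forward(self, x):
        return self.net(x)

class DownscalingTail(nn.Module):
    """Tail network: upsample core features (assumed 4x here) and map them to one output field."""
    def __init__(self, channels=64, up_factor=4):
        super().__init__()
        self.net = nn.Sequential(
            nn.Upsample(scale_factor=up_factor, mode="bilinear", align_corners=False),
            nn.Conv2d(channels, channels, 3, padding=1), nn.GELU(),
            nn.Conv2d(channels, 1, 3, padding=1),
        )
    def forward(self, feats):
        return self.net(feats)

core = CoreStandIn()
tail = DownscalingTail()

# Coupling variant without fine-tuning the core: freeze its parameters,
# so only the tail network is trained on the downscaling task.
for p in core.parameters():
    p.requires_grad = False

coarse_t2m = torch.randn(8, 1, 32, 32)   # batch of coarse-resolution 2 m temperature fields
fine_t2m = tail(core(coarse_t2m))        # super-resolved output on a 4x finer grid
print(fine_t2m.shape)                    # torch.Size([8, 1, 128, 128])

# Coupling variant with fine-tuning would additionally pass core.parameters()
# to the optimizer; here only the tail is optimized.
optimizer = torch.optim.AdamW(tail.parameters(), lr=1e-4)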
536 | _ | _ | |a 5111 - Domain-Specific Simulation & Data Life Cycle Labs (SDLs) and Research Groups (POF4-511) |0 G:(DE-HGF)POF4-5111 |c POF4-511 |f POF IV |x 0 |
536 | _ | _ | |a MAELSTROM - MAchinE Learning for Scalable meTeoROlogy and cliMate (955513) |0 G:(EU-Grant)955513 |c 955513 |f H2020-JTI-EuroHPC-2019-1 |x 1 |
536 | _ | _ | |a Verbundprojekt: MAELSTROM - Skalierbarkeit von Anwendungen des Maschinellen Lernens in den Bereichen Wetter und Klimawissenschaften für das zukünftige Supercomputing (16HPC029) |0 G:(BMBF)16HPC029 |c 16HPC029 |x 2 |
536 | _ | _ | |a Earth System Data Exploration (ESDE) |0 G:(DE-Juel-1)ESDE |c ESDE |x 3 |
588 | _ | _ | |a Dataset connected to CrossRef |
700 | 1 | _ | |a Lessig, Christian |0 P:(DE-HGF)0 |b 1 |
700 | 1 | _ | |a Schultz, Martin |0 P:(DE-Juel1)6952 |b 2 |
700 | 1 | _ | |a Luise, Ilaria |0 P:(DE-HGF)0 |b 3 |
773 | _ | _ | |a 10.5194/egusphere-egu24-18331 |
856 | 4 | _ | |u https://juser.fz-juelich.de/record/1035267/files/Downscaling_with%20_AtmoRep_Langguth.pdf |y OpenAccess |
909 | C | O | |o oai:juser.fz-juelich.de:1035267 |p openaire |p open_access |p VDB |p driver |p ec_fundedresources |
910 | 1 | _ | |a Forschungszentrum Jülich |0 I:(DE-588b)5008462-8 |k FZJ |b 0 |6 P:(DE-Juel1)180790 |
910 | 1 | _ | |a Forschungszentrum Jülich |0 I:(DE-588b)5008462-8 |k FZJ |b 2 |6 P:(DE-Juel1)6952 |
913 | 1 | _ | |a DE-HGF |b Key Technologies |l Engineering Digital Futures – Supercomputing, Data Management and Information Security for Knowledge and Action |1 G:(DE-HGF)POF4-510 |0 G:(DE-HGF)POF4-511 |3 G:(DE-HGF)POF4 |2 G:(DE-HGF)POF4-500 |4 G:(DE-HGF)POF |v Enabling Computational- & Data-Intensive Science and Engineering |9 G:(DE-HGF)POF4-5111 |x 0 |
914 | 1 | _ | |y 2024 |
915 | _ | _ | |a OpenAccess |0 StatID:(DE-HGF)0510 |2 StatID |
920 | _ | _ | |l yes |
920 | 1 | _ | |0 I:(DE-Juel1)JSC-20090406 |k JSC |l Jülich Supercomputing Centre |x 0 |
980 | _ | _ | |a poster |
980 | _ | _ | |a VDB |
980 | _ | _ | |a UNRESTRICTED |
980 | _ | _ | |a I:(DE-Juel1)JSC-20090406 |
980 | 1 | _ | |a FullTexts |