Computer Science > Machine Learning

Title: Analyzing and Exploring Training Recipes for Large-Scale Transformer-Based Weather Prediction

Abstract: The rapid rise of deep learning (DL) in numerical weather prediction (NWP) has led to a proliferation of models that forecast atmospheric variables with skill comparable or superior to traditional physics-based NWP. However, these leading DL models vary widely in both their training settings and their architectures, and the lack of thorough ablation studies makes it hard to discern which components are most critical to their success. In this work, we show that it is possible to attain high forecast skill even with relatively off-the-shelf architectures, simple training procedures, and moderate compute budgets. Specifically, we train a minimally modified SwinV2 transformer on ERA5 data and find that it attains superior forecast skill when compared against IFS. We present ablations of key aspects of the training pipeline, exploring the effects of different loss functions, model sizes and depths, and multi-step fine-tuning. We also evaluate model performance with metrics beyond the typical ACC and RMSE, and we investigate how performance scales with model size.
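
For reference, the latitude-weighted RMSE and anomaly correlation coefficient (ACC) mentioned in the abstract are conventionally defined as below in the DL weather-prediction literature (e.g. WeatherBench); these standard formulas are an assumption here, since the abstract does not reproduce the paper's exact definitions. For a forecast f, verifying analysis o, and climatology c on an N_lat x N_lon grid with latitude weights w(i):

  w(i) = \cos(\mathrm{lat}_i) \Big/ \frac{1}{N_{\mathrm{lat}}} \sum_{i'} \cos(\mathrm{lat}_{i'})

  \mathrm{RMSE} = \sqrt{\frac{1}{N_{\mathrm{lat}} N_{\mathrm{lon}}} \sum_{i,j} w(i)\,(f_{i,j} - o_{i,j})^2}

  \mathrm{ACC} = \frac{\sum_{i,j} w(i)\,(f_{i,j} - c_{i,j})(o_{i,j} - c_{i,j})}{\sqrt{\sum_{i,j} w(i)\,(f_{i,j} - c_{i,j})^2 \; \sum_{i,j} w(i)\,(o_{i,j} - c_{i,j})^2}}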
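
The multi-step fine-tuning the abstract refers to is commonly implemented as an autoregressive rollout in which the model's own predictions are fed back in and the loss is accumulated over several forecast steps. The sketch below is a generic illustration under that assumption, not the authors' actual pipeline; model, x0, and targets are hypothetical placeholders.

  import torch

  def multistep_loss(model, x0, targets, loss_fn=torch.nn.functional.mse_loss):
      """Autoregressive rollout loss: feed the model's own prediction
      back in for several steps and average the per-step losses.
      `targets` is a sequence of ground-truth future states, one per step."""
      state = x0
      total = 0.0
      for target in targets:
          state = model(state)              # predict the next atmospheric state
          total = total + loss_fn(state, target)
      return total / len(targets)

  # Hypothetical usage inside a fine-tuning loop:
  # loss = multistep_loss(swin_model, batch_initial_state, batch_future_states)
  # loss.backward(); optimizer.step()

Backpropagating through the full rollout in this way penalizes the compounding of single-step errors, which is the usual motivation for this fine-tuning stage.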
Comments: 9 pages, 6 figures
Subjects: Machine Learning (cs.LG)
MSC classes: 68T07, 86A10
ACM classes: J.2; I.2.6
Journal reference: 23rd Conference on Artificial Intelligence for Environmental Science. Jan 2024. Abstract #437874
Cite as: arXiv:2404.19630 [cs.LG]
  (or arXiv:2404.19630v1 [cs.LG] for this version)

Submission history

From: Jared Willard
[v1] Tue, 30 Apr 2024 15:30:14 GMT (3709kb,D)
