

Poster

WeatherGFM: Learning a Weather Generalist Foundation Model via In-context Learning

Xiangyu Zhao · Zhiwang Zhou · Wenlong Zhang · Yihao Liu · Xiangyu Chen · Junchao Gong · Hao Chen · Ben Fei · Shiqi Chen · Wanli Ouyang · Xiao-Ming Wu · Lei Bai

Hall 3 + Hall 2B #605
[ Project Page ]
Thu 24 Apr midnight PDT — 2:30 a.m. PDT

Abstract:

The Earth's weather system involves intricate weather data modalities and diverse weather understanding tasks, which hold significant value to human life. Existing data-driven models focus on single weather understanding tasks (e.g., weather forecasting). While these models have achieved promising results, they fail to tackle various complex tasks within a single and unified model. Moreover, the paradigm that relies on limited real observations for a single scenario hinders the model's performance upper bound. Inspired by the in-context learning paradigm from visual foundation models and large language models, in this paper we introduce the first weather generalist foundation model (WeatherGFM) to address weather understanding tasks in a unified manner. Specifically, we first unify the representation and definition of diverse weather understanding tasks. Subsequently, we design weather prompt formats to handle different weather data modalities, including single, multiple, and temporal modalities. Finally, we adopt a visual prompting question-answering paradigm to train on the unified weather understanding tasks. Extensive experiments indicate that WeatherGFM can effectively handle up to 12 weather understanding tasks, including weather forecasting, super-resolution, weather image translation, and post-processing. Our method also shows generalization ability on unseen tasks. The source code is available at github.com/xiangyu-mm/WeatherGFM.
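The abstract's visual prompting question-answering paradigm can be illustrated with a minimal sketch. The exact prompt layout and function name below are assumptions (a common visual in-context format tiles an example input/output pair with a query and a masked answer region into a 2x2 grid), not the authors' released implementation:

```python
import numpy as np

def build_weather_prompt(example_input, example_output, query_input):
    """Assemble a hypothetical visual in-context prompt for a weather task.

    Assumed layout (not from the paper's code): the task-example pair fills
    the top row, the query and a masked answer region fill the bottom row;
    the model would be trained to fill in the masked quadrant.
    All arrays are (H, W) normalized weather fields (e.g., radar frames).
    """
    h, w = query_input.shape
    assert example_input.shape == example_output.shape == (h, w)
    masked = np.zeros((h, w), dtype=query_input.dtype)  # prediction target placeholder
    top = np.concatenate([example_input, example_output], axis=1)
    bottom = np.concatenate([query_input, masked], axis=1)
    return np.concatenate([top, bottom], axis=0)  # (2H, 2W) prompt canvas
```

Under this sketch, switching tasks (forecasting, super-resolution, image translation, post-processing) only changes which example pair is placed in the top row; the model architecture stays fixed.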
