
Downscaling Extreme Precipitation with Wasserstein Regularized Diffusion


This is the official implementation of the paper Downscaling Extreme Precipitation with Wasserstein Regularized Diffusion. To quickly experiment with the proposed diffusion downscaling model (called WassDiff), we recommend trying this Colab demo.

Demo screenshot

Install dependencies

To create a conda environment with all the required packages, run:

conda env create -f environment.yml

Compile datasets

We present two options to obtain the dataset required for this project.

Option A: mini dataset

If you would like to quickly test-run the pre-trained model, download the mini dataset from Hugging Face, which contains all required historical inputs from 1990 to 2000 over CONUS. Keep in mind that there are no ground-truth radar measurements for this time period, so supervised training is not possible with this dataset.
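
If you prefer to fetch the mini dataset programmatically, here is a minimal sketch using the huggingface_hub client; the repository ID and local directory below are placeholders, so substitute the actual dataset ID from the Hugging Face page.

# Minimal sketch: download the mini dataset with huggingface_hub.
# The repo_id is a placeholder -- replace it with the dataset ID from the Hugging Face page.
from huggingface_hub import snapshot_download

local_path = snapshot_download(
    repo_id="<HF_USER>/<WASSDIFF_MINI_DATASET>",  # placeholder dataset ID
    repo_type="dataset",
    local_dir="data/wassdiff_mini",               # any local directory
)
print(f"Mini dataset downloaded to {local_path}")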

Option B: full dataset

If you would like to train the model from scratch, follow these instructions to obtain the required training and validation data:

CPC Unified Gauge-Based Analysis of Daily Precipitation: navigate to the NOAA data page and download the .nc files listed under precipitation, choosing the appropriate years. The gauge density files are stored separately on NOAA's FTP server; download the .gz files from there.

ERA5 and MRMS: download instructions can be found in this repository.
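
After downloading, a quick sanity check is to open one of the CPC NetCDF files with xarray. This is only a convenience sketch; the file path and variable name are placeholders and may differ depending on the years and product version you downloaded.

# Minimal sketch: inspect a downloaded CPC precipitation file with xarray.
# The path and the "precip" variable name are assumptions -- adjust to your download.
import xarray as xr

ds = xr.open_dataset("data/cpc/precip.V1.0.1990.nc")
print(ds)                                        # variables, dimensions, coordinates
print(ds["precip"].isel(time=0).mean().values)   # mean precipitation on the first day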

Setup dataset configs

Once all the data is downloaded, navigate to configs/local/default.yaml and update these entries according to where the data are stored on your local machine:

# @package _global_
# This file is not tracked by git and is specific to YOUR MACHINE
# note: do not append / at the end of the path
local:
  data_dir: # PATH TO ROOT DATA
  log_dir: # PATH TO WANDB LOG
  eval_set_root_dir: # PATH TO EVAL OUTPUT
  specified_eval_root_dir: # PATH TO MATPLOTLIB OUTPUT
  model_root_dir: # PATH TO MODEL WEIGHTS
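
Before launching a run, you can optionally confirm that every entry resolves to an existing directory. The sketch below simply loads the YAML with OmegaConf (the config library Hydra builds on); it is not part of the training pipeline.

# Minimal sketch: check that the paths in configs/local/default.yaml are set and exist.
import os
from omegaconf import OmegaConf

cfg = OmegaConf.load("configs/local/default.yaml")
for key, path in cfg.local.items():
    exists = os.path.isdir(str(path)) if path else False
    print(f"{key}: {path} (exists: {exists})")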

Training

To train the proposed model (WassDiff), run:

python src/train.py trainer=gpu model=wassdiff experiment=det_val_sampler

To train the baseline diffusion model (SBDM, which is trained without Wasserstein Distance Regularization), run:

python src/train.py trainer=gpu model=ablation_wdr experiment=det_val_sampler

Evaluation

Obtain metrics on full test set

Download the model weights from HuggingFace. Quantitative evaluation on the full test set can be done by running:

python ./src/eval.py trainer=gpu model=wassdiff experiment=eval_val_set ckpt_path=PATH_TO_MODEL_WEIGHTS name=NAME_OF_DIRECTORY

Set ckpt_path to the path of the model weights you want to evaluate, and name to the directory name for the evaluation results. Note that the results are written under the eval_set_root_dir specified in configs/local/default.yaml.
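
If you prefer to fetch the checkpoint programmatically instead of through the browser, a minimal sketch with hf_hub_download is shown below; the repository ID and checkpoint filename are placeholders, so substitute the values from the Hugging Face model page.

# Minimal sketch: download a checkpoint from the Hugging Face Hub.
# repo_id and filename are placeholders -- use the values from the model page.
from huggingface_hub import hf_hub_download

ckpt_path = hf_hub_download(
    repo_id="<HF_USER>/<WASSDIFF_WEIGHTS>",  # placeholder model repo ID
    filename="<CHECKPOINT_NAME>.ckpt",       # placeholder checkpoint filename
)
print(f"Use this path as ckpt_path={ckpt_path}")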

Downscale any targets

Alternatively, to generate a single sample or an ensemble for a specified region and date (this requires the input data to be downloaded), run:

python ./src/eval.py model=wassdiff experiment=specified_eval ckpt_path=PATH_TO_MODEL_WEIGHTS name=NAME_OF_DIRECTORY

Set ckpt_path to the path of the model weights you want to evaluate, and name to the directory name for the outputs. Note that the results are written under the specified_eval_root_dir set in configs/local/default.yaml.

To adjust the ensemble size, append model.num_samples=ENSEMBLE_SIZE to the command above. To choose the region and date of interest, modify the lon, lat, and date parameters in configs/experiment/specified_eval.yaml.

Continental-scale generation

You can optionally use tiled_diffusion to generate larger images (such as the entire CONUS region). To do so, use model=wassdiff_tiled in place of model=wassdiff:

python ./src/eval.py model=wassdiff_tiled experiment=specified_eval ckpt_path=PATH_TO_MODEL_WEIGHTS name=NAME_OF_DIRECTORY
