
DiffusionPDE: Generative PDE-Solving Under Partial Observation | NeurIPS 2024

Official PyTorch implementation.
DiffusionPDE: Generative PDE-Solving Under Partial Observation
Jiahe Huang, Guandao Yang, Zichen Wang, Jeong Joon Park
University of Michigan
Stanford University
DiffusionPDE

Requirements

Python libraries: See environment.yml for library dependencies. The conda environment can be set up using these commands:

conda env create -f environment.yml -n DiffusionPDE
conda activate DiffusionPDE

Data Generation

All training datasets can be downloaded from here, and all test datasets can be downloaded from here. Unzip training.zip and testing.zip into the data/ directory. You can also directly access the data files here.

Datasets for Darcy Flow, the Poisson equation, and the Helmholtz equation have shape [N, X, Y], where N is the number of instances and X, Y are the spatial resolutions. Datasets for the non-bounded and bounded Navier-Stokes equations have shape [N, X, Y, T], where T is the number of time steps. Datasets for Burgers' equation have shape [N, X, T].
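
As a quick sanity check of these shapes, a minimal sketch is shown below; it assumes the raw arrays are stored as .npy files, and the file names are hypothetical placeholders for whatever you unzipped into data/.

import numpy as np

# Hypothetical file names; substitute the actual files unzipped into data/.
darcy = np.load("data/training/darcy_train.npy")      # expected shape [N, X, Y]
burgers = np.load("data/training/burgers_train.npy")  # expected shape [N, X, T]
print(darcy.shape, burgers.shape)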

Data generation code for the bounded Navier-Stokes equation is derived from 2D Fluid Simulator, and code for the other PDEs is available in the dataset_generation folder. Specifically, we implemented our data generation on top of FNO and modified the code to introduce more finite difference methods for the Poisson equation and the Helmholtz equation.
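
For intuition only, the sketch below shows a generic 5-point finite-difference solve of the Poisson equation on the unit square with zero Dirichlet boundaries; it is not the repository's dataset_generation code, and the grid size and source term are arbitrary examples.

import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

n = 64                      # interior grid points per dimension (example value)
h = 1.0 / (n + 1)           # grid spacing

# 1D second-difference matrix, then 2D Laplacian via Kronecker sums.
main = 2.0 * np.ones(n)
off = -np.ones(n - 1)
T = sp.diags([off, main, off], offsets=[-1, 0, 1], format="csr")
I = sp.identity(n, format="csr")
A = (sp.kron(I, T) + sp.kron(T, I)) / h**2

# Example source term f(x, y) = sin(pi x) sin(pi y).
x = np.linspace(h, 1 - h, n)
X, Y = np.meshgrid(x, x, indexing="ij")
f = np.sin(np.pi * X) * np.sin(np.pi * Y)

# Solve -Laplacian(u) = f and reshape back to the grid.
u = spla.spsolve(A.tocsc(), f.ravel()).reshape(n, n)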

Train Diffusion Models

All pre-trained models can be downloaded from here. Unzip pretrained-models.zip in the root directory.

Our training script is derived from EDM. To train a new diffusion model on the joint distribution, use, e.g.,

# Prepare the .npy files for training. 
# Raw data in the datasets should be scaled to (-1, 1).
python3 merge_data.py # Darcy Flow

# Train the diffusion model.
torchrun --standalone --nproc_per_node=3 train.py --outdir=pretrained-darcy-new --data=/data/Darcy-merged/ --cond=0 --arch=ddpmpp --batch=60 --batch-gpu=20 --tick=10 --snap=50 --dump=100 --duration=20 --ema=0.05
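
The (-1, 1) scaling mentioned in the comments above can be done with a simple min-max map; the sketch below only illustrates that requirement, it is not the repository's merge_data.py, and the file paths are hypothetical placeholders.

import numpy as np

def scale_to_unit_range(a):
    """Linearly map an array to [-1, 1] using its global min and max."""
    a_min, a_max = a.min(), a.max()
    return 2.0 * (a - a_min) / (a_max - a_min) - 1.0

# Hypothetical paths; substitute the actual raw and merged data locations.
coeff = np.load("data/training/darcy_coefficients.npy")   # [N, X, Y]
np.save("data/Darcy-merged/coefficients.npy", scale_to_unit_range(coeff))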

Solve Forward Problem

To solve the forward problem with sparse observation on the coefficient (or initial state) space, use, e.g.,

python3 generate_pde.py --config configs/darcy-forward.yaml
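
Here, "sparse observation" means that only a small subset of grid points in the coefficient field is visible to the solver. The sketch below illustrates that idea with a random stand-in field and an arbitrary sensor count; it does not reflect the actual settings in configs/darcy-forward.yaml.

import numpy as np

rng = np.random.default_rng(0)
coeff = rng.standard_normal((128, 128))   # stand-in coefficient field [X, Y]

num_obs = 500                             # arbitrary number of observed sensors
mask = np.zeros(coeff.shape, dtype=bool)
idx = rng.choice(coeff.size, size=num_obs, replace=False)
mask.flat[idx] = True

observed_values = coeff[mask]             # the only values the solver sees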

Solve Inverse Problem

To solve the inverse problem with sparse observation on the solution (or final state) space, use, e.g.,

python3 generate_pde.py --config configs/darcy-inverse.yaml

Recover Both Spaces With Observations on Both Sides

To simultaneously recover the coefficient (initial state) space and the solution (final state) space from sparse observations on both sides, use, e.g.,

python3 generate_pde.py --config configs/darcy.yaml

Solve Solution Over Time

To recover the solution throughout a time interval with sparse sensors, use, e.g.,

python3 generate_pde.py --config configs/burgers.yaml

License

DiffusionPDE: Generative PDE-Solving Under Partial Observation by Jiahe Huang, Guandao Yang, Zichen Wang, Jeong Joon Park is licensed under Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International.

The dnnlib, torch_utils, and training folders, as well as train.py, are derived from code by Tero Karras, Miika Aittala, Timo Aila, and Samuli Laine. That code was originally shared under the Attribution-NonCommercial-ShareAlike 4.0 International license.

Data generation code for Darcy Flow, Burgers' equation, and the non-bounded Navier-Stokes equation is derived from code by Zongyi Li, Nikola Kovachki, Kamyar Azizzadenesheli, Burigede Liu, Kaushik Bhattacharya, Andrew Stuart, and Anima Anandkumar. That code was originally shared under the MIT license.

Citation

@misc{huang2024diffusionpdegenerativepdesolvingpartial,
      title={DiffusionPDE: Generative PDE-Solving Under Partial Observation}, 
      author={Jiahe Huang and Guandao Yang and Zichen Wang and Jeong Joon Park},
      year={2024},
      eprint={2406.17763},
      archivePrefix={arXiv},
      primaryClass={cs.LG},
      url={https://arxiv.org/abs/2406.17763}, 
}
