Implementation of the Denoising Diffusion Probabilistic Model (DDPM) in MindSpore, adapted from lucidrains's denoising-diffusion-pytorch.
Trained for 50k steps with EMA.
pip install denoising-diffusion-mindspore
# GitHub repo (overseas)
pip install git+https://github.com/lvyufeng/denoising-diffusion-mindspore
# OpenI repo (in China)
pip install git+https://openi.pcl.ac.cn/lvyufeng/denoising-diffusion-mindspore
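As a quick sanity check after installation (the ddm import matches the usage examples below; the print is purely illustrative):

python -c "from ddm import Unet, GaussianDiffusion; print('ddm imported successfully')"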
from ddm import Unet, GaussianDiffusion, value_and_grad
from ddm.ops import randn
model = Unet(
    dim = 64,
    dim_mults = (1, 2, 4, 8)
)
diffusion = GaussianDiffusion(
    model,
    image_size = 128,
    timesteps = 1000,    # number of steps
    loss_type = 'l1'     # L1 or L2
)
training_images = randn((1, 3, 128, 128))   # stand-in for real training images, normalized to [0, 1]
grad_fn = value_and_grad(diffusion, None, diffusion.trainable_params())
loss, grads = grad_fn(training_images)
# after a lot of training
sampled_images = diffusion.sample(batch_size = 1)
print(sampled_images.shape) # (1, 3, 128, 128)
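A minimal training-loop sketch around the value_and_grad call above; the Adam optimizer, the step count, and the random stand-in batches are assumptions for illustration (a real run would iterate over an image dataset):

from mindspore import nn
from ddm import Unet, GaussianDiffusion, value_and_grad
from ddm.ops import randn

model = Unet(dim = 64, dim_mults = (1, 2, 4, 8))
diffusion = GaussianDiffusion(model, image_size = 128, timesteps = 1000, loss_type = 'l1')

optimizer = nn.Adam(diffusion.trainable_params(), learning_rate = 8e-5)   # assumed optimizer choice
grad_fn = value_and_grad(diffusion, None, diffusion.trainable_params())

for step in range(10):                  # replace with the real number of training steps
    batch = randn((1, 3, 128, 128))     # stand-in for a batch of real images (normalized to [0, 1])
    loss, grads = grad_fn(batch)
    optimizer(grads)                    # apply the gradients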
Or, if you simply want to pass in a folder name and the desired image dimensions, you can use the Trainer class to easily train a model.
from download import download
from ddm import Unet, GaussianDiffusion, Trainer
url = 'https://www.robots.ox.ac.uk/~vgg/data/flowers/102/102flowers.tgz'
path = download(url, './102flowers', 'tar.gz')
model = Unet(
    dim = 64,
    dim_mults = (1, 2, 4, 8)
)
diffusion = GaussianDiffusion(
    model,
    image_size = 64,
    timesteps = 10,             # number of steps
    sampling_timesteps = 5,     # number of sampling timesteps (using DDIM for faster inference)
    loss_type = 'l1'            # L1 or L2
)
trainer = Trainer(
    diffusion,
    path,
    train_batch_size = 1,
    train_lr = 8e-5,
    train_num_steps = 1000,          # total training steps
    gradient_accumulate_every = 2,   # gradient accumulation steps
    ema_decay = 0.995,               # exponential moving average decay
    amp_level = 'O1',                # turn on mixed precision
)
trainer.train()
The amp_level of Trainer will automatically be set to O1 on Ascend.
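Once training finishes, sampling works the same way as in the first example. Below is a hedged sketch for saving a sampled batch to PNG files; the asnumpy conversion and the [0, 1] value range are assumptions about the returned MindSpore tensor, and numpy/PIL are not dependencies of this repo:

import numpy as np
from PIL import Image

sampled_images = diffusion.sample(batch_size = 4)    # (4, 3, 64, 64) for the config above
arr = sampled_images.asnumpy()                       # MindSpore Tensor -> numpy array
for i, img in enumerate(arr):
    img = (img.transpose(1, 2, 0) * 255).clip(0, 255).astype(np.uint8)   # CHW -> HWC, [0, 1] -> uint8
    Image.fromarray(img).save(f'sample_{i}.png')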
@inproceedings{NEURIPS2020_4c5bcfec,
author = {Ho, Jonathan and Jain, Ajay and Abbeel, Pieter},
booktitle = {Advances in Neural Information Processing Systems},
editor = {H. Larochelle and M. Ranzato and R. Hadsell and M.F. Balcan and H. Lin},
pages = {6840--6851},
publisher = {Curran Associates, Inc.},
title = {Denoising Diffusion Probabilistic Models},
url = {https://proceedings.neurips.cc/paper/2020/file/4c5bcfec8584af0d967f1ab10179ca4b-Paper.pdf},
volume = {33},
year = {2020}
}