This repo contains the official PyTorch implementation of the NeurIPS 2022 paper "Maximum Likelihood Training of Implicit Nonlinear Diffusion Model".
Dongjun Kim *, Byeonghu Na *, Se Jung Kwon, Dongsoo Lee, Wanmo Kang, and Il-Chul Moon
* Equal contribution
This paper introduces the Implicit Nonlinear Diffusion Model (INDM), which learns a nonlinear diffusion process by combining a normalizing flow with a linear diffusion process.
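At a high level, INDM runs a linear diffusion on the latent variable z = flow(x) of a normalizing flow, which induces a nonlinear diffusion on the data x, and trains both networks jointly by maximum likelihood. Below is a minimal sketch of one training step, with hypothetical stand-ins (`flow`, `score_net`, `sde`) for the repo's actual modules, not its real API:

```python
import torch

def indm_step(flow, score_net, sde, x):
    """Schematic INDM training objective (hypothetical module names).

    A linear forward SDE on z = flow(x) induces a nonlinear diffusion
    on x; the flow's log-Jacobian keeps the likelihood bound exact.
    """
    z, log_det = flow(x)                      # data -> flow latent space

    # Perturb z with the linear forward SDE at a random time t.
    t = torch.rand(z.shape[0], device=z.device)
    mean, std = sde.marginal_prob(z, t)       # closed-form marginals
    noise = torch.randn_like(z)
    z_t = mean + std[:, None, None, None] * noise

    # Denoising score matching in the flow's latent space.
    score = score_net(z_t, t)
    dsm = ((score * std[:, None, None, None] + noise) ** 2).sum(dim=(1, 2, 3)).mean()

    # NELBO = latent DSM term minus log|det df/dx|; minimized over both nets.
    return dsm - log_det.mean()
```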
This code was tested with CUDA 11.1 and Python 3.8.
pip install -r requirements.txt
We release our checkpoints here.
Download the stats files and save them to ./assets/stats/.
- For CIFAR-10, we use the stats file from yang-song/score_sde_pytorch.
- For CelebA, we provide the stats file celeba_stats.npz (see the sketch below for how these stats enter the FID computation).
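For reference, the stats files store pre-computed Inception pool_3 features of the dataset; this is the format score_sde_pytorch uses, and an assumption here. FID is then the Fréchet distance between Gaussian fits of the reference and generated features:

```python
import numpy as np
from scipy import linalg

def fid_from_stats(stats_path, sample_features):
    """FID between saved dataset stats and generated-sample features.

    Assumes the .npz stores raw Inception pool_3 features, as the
    score_sde_pytorch stats files do.
    """
    ref = np.load(stats_path)["pool_3"]
    mu1, sigma1 = ref.mean(axis=0), np.cov(ref, rowvar=False)
    mu2, sigma2 = sample_features.mean(axis=0), np.cov(sample_features, rowvar=False)

    diff = mu1 - mu2
    covmean = linalg.sqrtm(sigma1 @ sigma2).real  # drop tiny imaginary parts
    return diff @ diff + np.trace(sigma1 + sigma2 - 2.0 * covmean)

# e.g. fid_from_stats("./assets/stats/celeba_stats.npz", features)
```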
- To run a deep model, change the value of model.num_res_blocks from 4 to 8 in the config file (or add --config.model.num_res_blocks 8 to your command), as in the sketch below.
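The configs are Python files whose fields the --config.* flags override, so the deep-model change is a one-line edit. A sketch of the relevant fragment, assuming the ml_collections layout that score_sde_pytorch-style repos use:

```python
# Fragment of e.g. configs/ve/CIFAR10/indm.py (field names assumed to
# follow the score_sde_pytorch convention that the flags above suggest).
import ml_collections

def get_config():
    config = ml_collections.ConfigDict()
    config.model = model = ml_collections.ConfigDict()
    model.num_res_blocks = 8   # default is 4; set 8 for the deep model
    # ... remaining fields omitted ...
    return config
```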
CIFAR-10 (VE):
- Training
python main.py --mode train --config configs/ve/CIFAR10/indm.py --workdir <work_dir>
- Evaluation (NLL/NELBO evaluation and sampling)
python main.py --mode eval --config configs/ve/CIFAR10/indm.py --workdir <work_dir> --config.sampling.pc_denoise=True --config.sampling.pc_denoise_time -1 --config.sampling.begin_snr 0.14 --config.sampling.end_snr 0.14 --config.eval.data_mean=True
CIFAR-10 (VP, FID config):
- Training
python main.py --mode train --config configs/vp/CIFAR10/indm_fid.py --workdir <work_dir>
- Evaluation (NLL/NELBO evaluation and sampling)
python main.py --mode eval --config configs/vp/CIFAR10/indm_fid.py --workdir <work_dir> --config.sampling.temperature 1.05
CIFAR-10 (VP, NLL config):
- Training
python main.py --mode train --config configs/vp/CIFAR10/indm_nll.py --workdir <work_dir>
- Evaluation (NLL/NELBO evaluation and sampling)
python main.py --mode eval --config configs/vp/CIFAR10/indm_nll.py --workdir <work_dir>
CelebA (VE):
- Training
python main.py --mode train --config configs/ve/CELEBA/indm.py --workdir <work_dir>
- Evaluation (NLL/NELBO evaluation and sampling)
python main.py --mode eval --config configs/ve/CELEBA/indm.py --workdir <work_dir>
CelebA (VP, FID config):
- Training
python main.py --mode train --config configs/vp/CELEBA/indm_fid.py --workdir <work_dir>
- Evaluation (NLL/NELBO evaluation and sampling)
python main.py --mode eval --config configs/vp/CELEBA/indm_fid.py --workdir <work_dir>
CelebA (VP, NLL config):
- Training
python main.py --mode train --config configs/vp/CELEBA/indm_nll.py --workdir <work_dir>
- Evaluation (NLL/NELBO evaluation and sampling)
python main.py --mode eval --config configs/vp/CELEBA/indm_nll.py --workdir <work_dir>
This work builds heavily on code from
- Song, Yang, et al. "Score-Based Generative Modeling through Stochastic Differential Equations." International Conference on Learning Representations (ICLR), 2021.
- Song, Yang, et al. "Maximum Likelihood Training of Score-Based Diffusion Models." Advances in Neural Information Processing Systems (NeurIPS), 2021.
- Ma, Xuezhe, et al. "Decoupling Global and Local Representations via Invertible Generative Flows." International Conference on Learning Representations (ICLR), 2021.
If you find this code useful for your research, please consider citing:
@inproceedings{kim2022maximum,
title={Maximum Likelihood Training of Implicit Nonlinear Diffusion Model},
author={Dongjun Kim and Byeonghu Na and Se Jung Kwon and Dongsoo Lee and Wanmo Kang and Il-Chul Moon},
booktitle={Advances in Neural Information Processing Systems},
year={2022}
}