This repository provides the source code for the experiments in our paper, "Efficient subsampling of realistic images from GANs conditional on a class or a continuous variable" (Neurocomputing, 2023).
If you use this code, please cite:

```
@article{ding2023efficient,
  title={Efficient subsampling of realistic images from GANs conditional on a class or a continuous variable},
  author={Ding, Xin and Wang, Yongwei and Wang, Z Jane and Welch, William J},
  journal={Neurocomputing},
  volume={517},
  pages={188--200},
  year={2023},
  publisher={Elsevier}
}
```
```
├── CIFAR-10
│   ├── cDR-RS
│   ├── DRE-F-SP+RS
│   ├── DRS
│   ├── Collab
│   ├── DDLS
│   ├── GOLD
│   ├── GANs
│   └── eval_and_gan_ckpts
│
├── CIFAR-100
│   ├── cDR-RS
│   ├── DRE-F-SP+RS
│   ├── DRS
│   ├── Collab
│   ├── DDLS
│   ├── GANs
│   └── eval_and_gan_ckpts
│
├── ImageNet-100
│   ├── cDR-RS
│   ├── DRE-F-SP+RS
│   ├── DRS
│   ├── Collab
│   ├── DDLS
│   ├── GANs
│   └── eval_and_gan_ckpts
│
├── UTKFace
│   ├── cDR-RS
│   ├── DRE-F-SP+RS
│   ├── DRS
│   ├── Collab
│   ├── DDLS
│   └── eval_and_gan_ckpts
│
└── RC-49
    ├── cDR-RS
    ├── DRS
    ├── Collab
    └── eval_and_gan_ckpts
```
(Figure) The overall workflow of cDR-RS.
(Figure) Effectiveness and efficiency comparison on ImageNet-100 (two NVIDIA V100 GPUs).
(Figure) Effectiveness and efficiency comparison on UTKFace (one NVIDIA V100 GPU).
Item | Version |
---|---|
Python | 3.9.5 |
argparse | 1.1 |
CUDA | 11.4 |
cuDNN | 8.2 |
numpy | 1.14 |
torch | 1.9.0 |
torchvision | 0.10.0 |
Pillow | 8.2.0 |
matplotlib | 3.4.2 |
tqdm | 4.61.1 |
h5py | 3.3.0 |
Matlab | 2020a |
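One possible way to set up a matching Python environment with pip is sketched below; this is not an official requirements file, and the exact PyTorch wheel depends on your CUDA installation.

```bash
# Sketch of an environment setup based on the version table above (not an
# official requirements file). Choose torch/torchvision wheels that match
# your local CUDA 11.x installation if the default pip build does not.
pip install torch==1.9.0 torchvision==0.10.0 \
            Pillow==8.2.0 matplotlib==3.4.2 tqdm==4.61.1 h5py==3.3.0 numpy
```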
The unprocessed ImageNet-100 dataset (`imagenet.tar.gz`) can be downloaded from here. After unzipping `imagenet.tar.gz`, put `image` in `./datasets/ImageNet-100`. Then run `python make_dataset.py` in `./datasets/ImageNet-100`. This produces the h5 file of the processed ImageNet-100 dataset, named `ImageNet_128x128_100Class.h5`.
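The shell sketch below summarizes these preprocessing steps, assuming `imagenet.tar.gz` has already been downloaded from the link above; the extracted folder name (`image`) follows the text.

```bash
# Sketch of the ImageNet-100 preprocessing steps described above.
tar -xzf imagenet.tar.gz                  # unpack the raw dataset
mv image ./datasets/ImageNet-100/         # move the extracted folder into place
cd ./datasets/ImageNet-100
python make_dataset.py                    # writes ImageNet_128x128_100Class.h5
```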
Please refer to https://github.com/UBCDingXin/improved_CcGAN for the download links of the RC-49 and preprocessed UTKFace datasets. Download the RC-49 (64x64) and UTKFace (64x64) h5 files and put them in `./datasets/RC-49` and `./datasets/UTKFace`, respectively.
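As an illustration, placing the downloaded files might look like the following; the h5 file names shown here are hypothetical, so keep whatever names the downloads actually use.

```bash
# Illustrative placement of the downloaded h5 files; the file names below are
# placeholders, not the exact names of the released files.
mkdir -p ./datasets/RC-49 ./datasets/UTKFace
mv RC-49_images_64x64.h5 ./datasets/RC-49/
mv UTKFace_64x64.h5 ./datasets/UTKFace/
```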
Remember to set the correct root, data, and checkpoint paths in each script, and to download the necessary checkpoints for each experiment.
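As a purely hypothetical illustration (the actual variable or flag names differ from script to script), the kind of edit this usually means is:

```bash
# Hypothetical example: open the .sh script you plan to run and point its
# path settings at your local copies; the names below are placeholders.
ROOT_PATH="/path/to/this/repo/CIFAR-10/cDR-RS"
DATA_PATH="/path/to/datasets"
CKPT_PATH="/path/to/CIFAR-10/eval_and_gan_ckpts"
```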
Download `eval_and_gan_ckpts.zip`. Unzip it to get `eval_and_gan_ckpts`, and move `eval_and_gan_ckpts` to `./CIFAR-10`. This folder includes the Inception-V3 checkpoint used for evaluation.
- Train three GANs: ACGAN, SNGAN, and BigGAN. The checkpoints used in our experiments are also provided in `eval_and_gan_ckpts`, so training these GANs is not necessary to reproduce our results.
  - ACGAN: run `./CIFAR-10/GANs/ACGAN/scripts/run_train.sh`.
  - SNGAN: run `./CIFAR-10/GANs/SNGAN/scripts/run_train.sh`.
  - BigGAN: run `./CIFAR-10/GANs/BigGAN/scripts/launch_cifar10_ema.sh`.
- Implement each sampling method by running the `.sh` script(s) in that method's folder; an end-to-end example is sketched after this list.
  - cDR-RS and DRE-F-SP+RS: run `./scripts/run_exp_acgan.sh` for ACGAN, `./scripts/run_exp_sngan.sh` for SNGAN, and `./scripts/run_exp_biggan.sh` for BigGAN.
  - DRS, DDLS, and Collab: run `./scripts/run_sngan.sh` for SNGAN and `./scripts/run_biggan.sh` for BigGAN.
  - GOLD: run `./scripts/run_acgan.sh` for ACGAN.
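For example, a full SNGAN pipeline on CIFAR-10 might look like the sketch below; it assumes the method folders sit directly under `./CIFAR-10` as in the layout above, and the training step can be skipped if you use the provided checkpoints.

```bash
# Example CIFAR-10 pipeline with SNGAN (assumes the folder layout shown above).
cd ./CIFAR-10/GANs/SNGAN
sh ./scripts/run_train.sh          # optional: train SNGAN (a checkpoint is provided)
cd ../../cDR-RS
sh ./scripts/run_exp_sngan.sh      # subsample the trained SNGAN with cDR-RS
```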
Download `eval_and_gan_ckpts.zip`. Unzip it to get `eval_and_gan_ckpts`, and move `eval_and_gan_ckpts` to `./CIFAR-100`. This folder includes the Inception-V3 checkpoint used for evaluation.
- Train BigGAN. The checkpoint used in our experiments is also provided in `eval_and_gan_ckpts`, so training BigGAN is not necessary to reproduce our results.
  - BigGAN: run `./CIFAR-100/GANs/BigGAN/scripts/launch_cifar100_ema.sh`.
- Implement each sampling method by running the `.sh` script(s) in that method's folder; see the example after this list.
  - cDR-RS and DRE-F-SP+RS: run `./scripts/run_exp_biggan.sh` for BigGAN.
  - DRS, DDLS, and Collab: run `./scripts/run_biggan.sh` for BigGAN.
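Analogously, a BigGAN run on CIFAR-100 might look like the following sketch (same layout assumptions as for CIFAR-10):

```bash
# Example CIFAR-100 pipeline with BigGAN (assumes the folder layout shown above).
cd ./CIFAR-100/GANs/BigGAN
sh ./scripts/launch_cifar100_ema.sh   # optional: train BigGAN (a checkpoint is provided)
cd ../../cDR-RS
sh ./scripts/run_exp_biggan.sh        # subsample BigGAN with cDR-RS
```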
Download `eval_and_gan_ckpts.zip`. Unzip it to get `eval_and_gan_ckpts`, and move `eval_and_gan_ckpts` to `./ImageNet-100`. This folder includes the Inception-V3 checkpoint used for evaluation.
- Train BigGAN-deep. The checkpoint used in our experiments is also provided in `eval_and_gan_ckpts`, so training BigGAN-deep is not necessary to reproduce our results.
  - BigGAN-deep: run `./ImageNet-100/GANs/BigGAN/scripts/launch_imagenet-100_deep.sh`.
- Implement each sampling method by running the `.sh` script(s) in that method's folder; see the example after this list.
  - cDR-RS and DRE-F-SP+RS: run `./scripts/run_exp_biggan.sh` for BigGAN.
  - DRS, DDLS, and Collab: run `./scripts/run_biggan.sh` for BigGAN.
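A BigGAN-deep run on ImageNet-100 might look like this sketch (same layout assumptions as above):

```bash
# Example ImageNet-100 pipeline with BigGAN-deep (assumes the folder layout shown above).
cd ./ImageNet-100/GANs/BigGAN
sh ./scripts/launch_imagenet-100_deep.sh   # optional: train BigGAN-deep (a checkpoint is provided)
cd ../../cDR-RS
sh ./scripts/run_exp_biggan.sh             # subsample with cDR-RS
```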
Download `eval_and_gan_ckpts.zip`. Unzip it to get `eval_and_gan_ckpts`, and move `eval_and_gan_ckpts` to `./UTKFace`. This folder includes the AE and ResNet-34 checkpoints used for evaluation, as well as the checkpoint of CcGAN (SVDL+ILI).
Run `./scripts/run_train.sh` in the folder of each sampling method; see the example below.
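For instance, running cDR-RS on UTKFace might look like this (assuming the layout above):

```bash
# Example: run cDR-RS on UTKFace (assumes the folder layout shown above).
cd ./UTKFace/cDR-RS
sh ./scripts/run_train.sh
```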
Download `eval_and_gan_ckpts.zip`. Unzip it to get `eval_and_gan_ckpts`, and move `eval_and_gan_ckpts` to `./RC-49`. This folder includes the AE and ResNet-34 checkpoints used for evaluation, as well as the checkpoint of CcGAN (SVDL+ILI).
Run `./scripts/run_train.sh` in the folder of each sampling method; see the example below.
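Similarly for RC-49 (assuming the layout above):

```bash
# Example: run cDR-RS on RC-49 (assumes the folder layout shown above).
cd ./RC-49/cDR-RS
sh ./scripts/run_train.sh
```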
Please refer to https://github.com/UBCDingXin/improved_CcGAN.
Some code is borrowed from the following repositories:
- To implement ACGAN, we refer to https://github.com/sangwoomo/GOLD.
- To implement SNGAN, we refer to https://github.com/christiancosgrove/pytorch-spectral-normalization-gan and https://github.com/pfnet-research/sngan_projection.
- To implement BigGAN, we refer to https://github.com/ajbrock/BigGAN-PyTorch.
- To implement CcGANs, we refer to https://github.com/UBCDingXin/improved_CcGAN.
- To implement GOLD, we refer to https://github.com/sangwoomo/GOLD.
- To implement Collab, we refer to https://github.com/YuejiangLIU/pytorch-collaborative-gan-sampling.
- To implement DRS and DRE-F-SP+RS, we refer to https://github.com/UBCDingXin/DDRE_Sampling_GANs.
- To implement DDLS, we refer to https://github.com/JHpark1677/CGAN-DDLS and https://github.com/Daniil-Selikhanovych/ebm-wgan/blob/master/notebook/EBM_GAN.ipynb.