PyTorch implementation of ProGAN as explained in the [original paper](https://research.nvidia.com/publication/2017-10_Progressive-Growing-of).
The current code is designed to train the GAN on the CelebA dataset.
These are some of the faces I've generated. They're 32x32 faces. My GPU ran out of memory in the transition to 64x64.
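For background, progressive growing adds higher-resolution blocks one at a time, and each new block is first faded in by blending its output with an upsampled copy of the previous stage's output. Here is a minimal sketch of that blend (the layer names are illustrative, not this repo's actual modules):

```python
import torch.nn.functional as F

def faded_forward(features, old_to_rgb, new_block, new_to_rgb, alpha):
    """Blend the freshly added block with the upsampled old output.

    alpha ramps from 0 to 1 during the fade stage; at alpha=1 the
    network behaves as if the new block had always been there.
    """
    old_rgb = F.interpolate(old_to_rgb(features), scale_factor=2, mode="nearest")
    new_rgb = new_to_rgb(new_block(features))
    return alpha * new_rgb + (1.0 - alpha) * old_rgb
```

Those extra higher-resolution stages are also why memory usage climbs as training progresses.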
The `main.py` file is the entry point to the code. You can use it both to train a model and to resume training if you need to. A sample run looks like this:
```bash
python3 main.py \
    --data-path=<path to the folder with all the CelebA images> \
    --alternating-step=5000 \
    --debug-step=250 \
    --save-step=1000 \
    --max-checkpoints=10 \
    --batch-size=24 \
    --final-size=256 \
    --crop-size=256
```

Here `--alternating-step` is the number of iterations between the fade and stabilise stages, `--debug-step` controls how often debug images are saved, `--save-step` how often the networks are saved, and `--max-checkpoints` caps the number of saved networks.
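One way a schedule like this could be derived from the global iteration count (a hypothetical helper, assuming the stages simply alternate; `main.py` may compute this differently):

```python
def stage_for_iteration(it, alternating_step):
    """Map a global iteration count to (phase, depth).

    Assumes fade and stabilise stages alternate and each one lasts
    `alternating_step` iterations.
    """
    stage = it // alternating_step
    phase = "fade" if stage % 2 == 0 else "stabilise"
    depth = stage // 2 + 1  # how many times the resolution has doubled
    return phase, depth
```

During a fade stage the blending factor would ramp linearly, e.g. `alpha = (it % alternating_step) / alternating_step`.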
The command creates a folder called `proGAN_<date>_<time>` that contains 4 elements:

- `img/`, the folder where debugging images are stored
- `log/`, unused for now
- `models/`, where checkpoints are saved
- `config.json`, a file that contains all configuration options, used to resume training
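The exact schema of `config.json` isn't spelled out here, but conceptually it's a dump of the parsed command-line options. A sketch of how such a file could be written and read back (the field names are whatever argparse produced, not a documented format):

```python
import json

def save_config(args, path):
    # Persist every parsed option so a later run can restore it.
    with open(path, "w") as f:
        json.dump(vars(args), f, indent=2)

def load_config(path):
    with open(path) as f:
        return json.load(f)
```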
It's quite possible that something goes wrong, that you need to stop your run, or that it breaks because your GPU runs out of memory (a progressive GAN grows in stages, so it uses more GPU memory over time). In those cases it's useful to resume training from the last checkpoint:
```bash
python3 main.py \
    --data-path=<this option is mandatory but you can leave it empty> \
    --resume-training=<path to>/config.json \
    --batch-size=12
```
The `--data-path` option is mandatory for now, but you can leave it empty; it must be the first option. `--resume-training` must point to the `config.json` file of the training run you want to resume.
Under the hood, the `config.json` file is read and the configuration options are imported. You can still override a specific option, for example `--batch-size`, by setting it after `--resume-training`.
Order matters: any option you specify after `--resume-training` will override the value stored in `config.json`. That may or may not be what you want.
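This ordering behaviour is exactly what you get if `--resume-training` is implemented as an argparse action that loads `config.json` into the namespace the moment it is parsed, since anything parsed later overwrites the restored values. A sketch of that mechanism (an assumption about the implementation, not the repo's actual code):

```python
import argparse
import json

class ResumeAction(argparse.Action):
    """Restore stored options as soon as --resume-training is parsed.

    Options appearing later on the command line are parsed after this
    runs, so they overwrite the restored values.
    """
    def __call__(self, parser, namespace, values, option_string=None):
        with open(values) as f:
            for key, stored in json.load(f).items():
                setattr(namespace, key, stored)
        setattr(namespace, self.dest, values)

parser = argparse.ArgumentParser()
parser.add_argument("--data-path", required=True)
parser.add_argument("--batch-size", type=int, default=24)
parser.add_argument("--resume-training", action=ResumeAction)
```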