- Generative Adversarial Network
- Weight regularization losses and dropout
  - I'm not sure yet whether they help or hurt...
  - At which places should I put dropout? (a rough sketch of one option follows below)
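A minimal sketch of one way to place dropout in the discriminator, assuming PyTorch; the layer sizes, input resolution and dropout rate are my own assumptions, not a recommendation:

```python
import torch.nn as nn

# Hypothetical discriminator for 3x32x32 images: dropout after the activation
# of each hidden conv layer, but not on the input image or the final logit.
discriminator = nn.Sequential(
    nn.Conv2d(3, 64, kernel_size=4, stride=2, padding=1),   # -> 64 x 16 x 16
    nn.LeakyReLU(0.2),
    nn.Dropout2d(0.3),                                       # rate is a guess, tune it
    nn.Conv2d(64, 128, kernel_size=4, stride=2, padding=1),  # -> 128 x 8 x 8
    nn.BatchNorm2d(128),
    nn.LeakyReLU(0.2),
    nn.Dropout2d(0.3),
    nn.Conv2d(128, 1, kernel_size=8),                        # -> 1 x 1 x 1 logit
    nn.Flatten(),
)
```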
- Conditioning on data attributes (labels etc.)
  - Usually the conditioning vector is just the one-hot label.
  - It can also be a dense vector computed from several additional data attributes.
  - Not sure where it's best to include it: at each layer, or just appended once to the input? (sketch below)
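A rough sketch of the "append once" variant in PyTorch, assuming the conditioning vector is a one-hot label; noise dimension, class count and layer sizes are my own choices:

```python
import torch
import torch.nn as nn

class ConditionalGenerator(nn.Module):
    """Appends the one-hot label once, to the noise vector (assumed:
    noise dim 100, 10 classes, 32x32 RGB output)."""
    def __init__(self, noise_dim=100, num_classes=10):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(noise_dim + num_classes, 256),
            nn.ReLU(),
            nn.Linear(256, 3 * 32 * 32),
            nn.Tanh(),
        )

    def forward(self, z, onehot_labels):
        # the conditioning vector is simply concatenated with the noise
        x = torch.cat([z, onehot_labels], dim=1)
        return self.net(x).view(-1, 3, 32, 32)

# usage
z = torch.randn(16, 100)
labels = nn.functional.one_hot(torch.randint(0, 10, (16,)), 10).float()
fake = ConditionalGenerator()(z, labels)  # (16, 3, 32, 32)
```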
- One-sided label smoothing
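The idea: smooth only the real targets (e.g. 1.0 -> 0.9) and leave the fake targets at exactly 0. A minimal sketch of the discriminator loss, assuming PyTorch and BCE-with-logits:

```python
import torch
import torch.nn.functional as F

def d_loss_one_sided_smoothing(real_logits, fake_logits, smooth=0.9):
    # real targets are smoothed down to 0.9, fake targets stay at 0
    real_targets = torch.full_like(real_logits, smooth)
    fake_targets = torch.zeros_like(fake_logits)
    loss_real = F.binary_cross_entropy_with_logits(real_logits, real_targets)
    loss_fake = F.binary_cross_entropy_with_logits(fake_logits, fake_targets)
    return loss_real + loss_fake
```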
- Feature Matching
  - I hope my code here is correct (a rough sketch of the idea follows below).
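For reference, a minimal sketch of the feature matching generator loss as I understand it from Salimans et al. (2016): match the mean activations of an intermediate discriminator layer between real and generated batches. `extract_features` is a hypothetical helper, not an existing API:

```python
import torch

def feature_matching_loss(discriminator, real_images, fake_images):
    """Generator loss: || E[f(x_real)] - E[f(G(z))] ||^2 over the batch,
    where f is some intermediate discriminator layer.
    `discriminator.extract_features` is a hypothetical method returning
    those intermediate activations."""
    real_feats = discriminator.extract_features(real_images).mean(dim=0)
    fake_feats = discriminator.extract_features(fake_images).mean(dim=0)
    # real features are a fixed target: no gradient flows through them
    return torch.mean((real_feats.detach() - fake_feats) ** 2)
```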
- Default architecture as in DCGAN (see the generator sketch after this sub-list)
  - ReLU in the generator (Tanh for the final layer)
  - Leaky ReLU in the discriminator
  - No pooling layers (strided convolutions instead)
  - Batch Normalization
  - No fully connected layers
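A compact sketch of a DCGAN-style generator following these rules (strided transposed convolutions instead of pooling, batch norm, no fully connected layers); the channel counts and 64x64 output size are my own choices:

```python
import torch.nn as nn

# Generator: noise (N, 100, 1, 1) -> image (N, 3, 64, 64)
generator = nn.Sequential(
    nn.ConvTranspose2d(100, 512, 4, 1, 0, bias=False),  # -> 512 x 4 x 4
    nn.BatchNorm2d(512),
    nn.ReLU(),                        # ReLU everywhere in the generator...
    nn.ConvTranspose2d(512, 256, 4, 2, 1, bias=False),  # -> 256 x 8 x 8
    nn.BatchNorm2d(256),
    nn.ReLU(),
    nn.ConvTranspose2d(256, 128, 4, 2, 1, bias=False),  # -> 128 x 16 x 16
    nn.BatchNorm2d(128),
    nn.ReLU(),
    nn.ConvTranspose2d(128, 64, 4, 2, 1, bias=False),   # -> 64 x 32 x 32
    nn.BatchNorm2d(64),
    nn.ReLU(),
    nn.ConvTranspose2d(64, 3, 4, 2, 1, bias=False),     # -> 3 x 64 x 64
    nn.Tanh(),                        # ...except Tanh for the final layer
)
```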
- Minibatch discrimination
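A minimal sketch of a minibatch discrimination layer following Salimans et al. (2016), in PyTorch; the output and kernel dimensions are hyperparameters I picked arbitrarily:

```python
import torch
import torch.nn as nn

class MinibatchDiscrimination(nn.Module):
    """Appends, for each sample, statistics about its distance to the other
    samples in the batch (helps the discriminator detect mode collapse)."""
    def __init__(self, in_features, out_features=32, kernel_dim=16):
        super().__init__()
        # tensor T from the paper, shape (A, B, C)
        self.T = nn.Parameter(torch.randn(in_features, out_features, kernel_dim) * 0.1)

    def forward(self, x):                               # x: (N, A)
        M = torch.einsum('na,abc->nbc', x, self.T)      # (N, B, C)
        # pairwise L1 distances between samples, per output feature b
        diffs = M.unsqueeze(0) - M.unsqueeze(1)         # (N, N, B, C)
        l1 = diffs.abs().sum(dim=3)                     # (N, N, B)
        o = torch.exp(-l1).sum(dim=1) - 1               # (N, B), minus the self term
        return torch.cat([x, o], dim=1)                 # (N, A + B)
```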
- Historical Averaging
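My understanding of historical averaging: add a penalty ||theta - (1/t) * sum theta_i||^2 to each player's cost, where the sum runs over past parameter values. A rough sketch with a running average kept in a small helper class; the class and method names are mine:

```python
import torch

class HistoricalAverage:
    """Keeps a running average of the model parameters and returns the
    penalty ||theta - avg(theta)||^2 to be added to the loss."""
    def __init__(self, model, weight=1.0):
        self.model = model
        self.weight = weight
        self.step = 0
        self.avg = [p.detach().clone() for p in model.parameters()]

    def penalty(self):
        self.step += 1
        loss = 0.0
        for p, avg in zip(self.model.parameters(), self.avg):
            # update the running average of this parameter in place
            avg += (p.detach() - avg) / self.step
            loss = loss + ((p - avg) ** 2).sum()
        return self.weight * loss
```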
- Virtual Batch Normalization
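VBN as I understand it: each example is normalized with statistics from a fixed reference batch chosen once at the start of training, instead of from its own minibatch. A very rough sketch for 2D activations; it simplifies the paper, which also mixes the current example into the reference statistics:

```python
import torch
import torch.nn as nn

class VirtualBatchNorm(nn.Module):
    """Normalizes activations with mean/var taken from a fixed reference
    batch instead of the current minibatch (simplified sketch)."""
    def __init__(self, num_features, eps=1e-5):
        super().__init__()
        self.eps = eps
        self.gamma = nn.Parameter(torch.ones(1, num_features))
        self.beta = nn.Parameter(torch.zeros(1, num_features))
        self.ref_mean = None
        self.ref_var = None

    def set_reference(self, ref_batch):
        # call once with the fixed reference batch, e.g. at training start;
        # reference statistics are detached here for simplicity
        self.ref_mean = ref_batch.mean(dim=0, keepdim=True).detach()
        self.ref_var = ref_batch.var(dim=0, keepdim=True).detach()

    def forward(self, x):                      # x: (N, num_features)
        x_hat = (x - self.ref_mean) / torch.sqrt(self.ref_var + self.eps)
        return self.gamma * x_hat + self.beta
```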