This repository has been archived by the owner on Nov 17, 2023. It is now read-only.
Spectral normalization GAN seems to be seriously broken #13847
Comments
Thank you for submitting the issue! I'm labeling it so the MXNet community members can help resolve it. @mxnet-label-bot add [Python, Gluon, Example]
@stu1130 Can you take a look?
The problem is related to the update of the u matrix.
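For context on what the u update does: spectral normalization estimates the largest singular value of a weight matrix with power iteration, where the vector u must be carried over (and updated) between training steps. The sketch below is a minimal NumPy illustration of that technique, not the example's actual Gluon code; the function name and signature are assumptions for illustration.

```python
import numpy as np

def spectral_normalize(W, u, n_iter=1, eps=1e-12):
    """One power-iteration step to estimate the spectral norm of W,
    then rescale W by it. u is the persistent left-singular-vector
    estimate that must be updated and reused across training steps."""
    for _ in range(n_iter):
        v = W.T @ u
        v = v / (np.linalg.norm(v) + eps)   # right singular vector estimate
        u = W @ v
        u = u / (np.linalg.norm(u) + eps)   # left singular vector estimate
    sigma = float(u @ W @ v)                # estimated largest singular value
    return W / sigma, u                      # normalized W and updated u
```

If u is not carried over correctly, sigma is a poor estimate of the spectral norm and the normalization silently stops constraining the discriminator, which is consistent with the behavior reported in this issue.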
PR #13868 |
This was referenced Jan 14, 2019
Description
The SN_GAN example in the gluon/examples directory does not seem to work. I wonder whether the spectral normalization code in it is correct at all.
1.) The example does not use HybridBlocks, even though converting it should not be hard.
2.) Some of the default values do not match the description (latent_z defaults to 10, not 100 as described).
3.) The results I am getting are terrible: even after training for over 80 epochs, the latent space seems to collapse to a single class or a specific input image. This is weird, because according to the cited research papers spectral normalization should cause the opposite to happen.
(See linked output example for what I am seeing)
Output example after 86 epochs
Environment info (Required)
MXNet 1.3, Anaconda Python 3.4.1, Windows 10, NVIDIA GeForce 1080 Ti