Thanks for the awesome repo! I had a couple of questions about the implementation of the loss function.
In the paper you multiply the entire GAN loss by the adaptive weight, lambda, and the adaptive weight is itself a function of the entire GAN loss (L_GAN). Two questions:
1. In the code, only the generator loss is multiplied by the adaptive weight (taming-transformers/taming/modules/losses/vqperceptual.py, line 107 at 9d17ea6). For me, the adaptive weight gets fairly high, so "loss" ends up far higher than "d_loss". I think this is why some people are not seeing the discriminator loss decrease during training (see e.g. "Has anyone succeeded in reproducing the results?" #44).
2. In the code, the adaptive weight, lambda, is only a function of the generator loss, whereas I thought the GAN loss was a function of both the generator and the discriminator losses.
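For context, my understanding of `calculate_adaptive_weight` in `vqperceptual.py` is that it takes the ratio of gradient norms of the reconstruction loss and the generator loss with respect to the decoder's last layer, then clamps it. The real code uses `torch.autograd.grad`; below is a minimal plain-Python paraphrase where the gradient norms are passed in as hypothetical numbers, just to show why lambda blows up when the generator-loss gradient is small:

```python
def calculate_adaptive_weight(rec_grad_norm, gan_grad_norm,
                              eps=1e-4, max_weight=1e4):
    """Sketch of the adaptive weight: ratio of gradient norms.

    rec_grad_norm / gan_grad_norm are stand-ins for
    ||grad(rec_loss, last_layer)|| and ||grad(g_loss, last_layer)||.
    """
    # Large when the GAN gradient is small relative to the
    # reconstruction gradient.
    d_weight = rec_grad_norm / (gan_grad_norm + eps)
    # Clamp to keep the weight in a bounded range.
    return min(max(d_weight, 0.0), max_weight)

# Hypothetical norms: a small GAN gradient inflates lambda,
# which then inflates "loss" relative to "d_loss".
print(calculate_adaptive_weight(2.0, 0.01))  # large weight
print(calculate_adaptive_weight(1e6, 1.0))   # clamped to 1e4
```

Note that only the generator-side gradient enters this ratio, which is what question 2 is about.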
Could you offer any advice here?
Thank you