I followed this repo, the paper, and the Glow paper. My loss looks like this:

There are two terms added together: logp, which is always negative, and logdets, which also starts out negative during training. Because of the minus sign in -obj, the loss is positive at first, but after training for around 100k steps both logp and logdets increase. logp eventually becomes a positive number, so the final loss turns negative, around -3.xxx. I just wanted to know whether this is expected behaviour?
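For concreteness, here is a minimal sketch of the sign behaviour I mean, assuming a Glow-style objective obj = logp + logdets with loss = -obj; the function name flow_loss and the numeric values are purely illustrative, not the repo's actual code:

```python
import torch

def flow_loss(log_p: torch.Tensor, logdet: torch.Tensor) -> torch.Tensor:
    # obj = log_p + logdet is the log-likelihood of the batch;
    # negating it gives the loss that is minimized.
    obj = log_p + logdet
    return -obj.mean()

# Early in training: both terms are negative, so the loss is positive.
print(flow_loss(torch.tensor([-5.0]), torch.tensor([-2.0])).item())  # 7.0
# Later: log_p has grown positive and outweighs logdet, so the loss goes negative.
print(flow_loss(torch.tensor([4.1]), torch.tensor([-1.0])).item())   # -3.1
```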