A bug occurs in SAGAN: running with `check_for_unused_update_ops=True` raises a ValueError. It appears the check treats all of the EMA buffers in batch norm as unused update ops.

However, the current SAGAN example exposes no API to disable this check, since it is encapsulated. So I set the kwarg in the original TF-GAN implementation (https://github.com/tensorflow/gan/blob/master/tensorflow_gan/python/train.py#L974) to False, and the code then runs normally.

My TF version is 2.0.0 and tfgan is the current version.
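For context, the failing check behaves roughly like the pure-Python sketch below. The function name and op names here are illustrative, not TF-GAN's actual internals: the real check lives in `tensorflow_gan/python/train.py` and compares the update ops in the graph against those consumed by the train op, raising a ValueError when some (such as batch-norm EMA updates) are left over.

```python
def check_for_unused_updates(all_update_ops, used_update_ops, check_enabled=True):
    """Illustrative stand-in for TF-GAN's unused-update-ops check.

    Raises ValueError if some update ops (e.g. batch-norm EMA updates)
    were never applied and the check is enabled; otherwise returns the
    set of unused ops so the caller can inspect or ignore them.
    """
    unused = set(all_update_ops) - set(used_update_ops)
    if check_enabled and unused:
        raise ValueError("Unused update ops: %s" % sorted(unused))
    return unused


# With the check disabled, the batch-norm EMA updates are silently
# ignored and training proceeds -- mirroring the workaround of setting
# check_for_unused_update_ops=False in train.py.
leftover = check_for_unused_updates(
    ["bn/ema_mean", "bn/ema_var", "gen/train_step"],
    ["gen/train_step"],
    check_enabled=False)
```

Disabling the check only suppresses the error; if the EMA buffers genuinely never update, batch-norm statistics may be stale at inference time, so it would be worth confirming which behavior is intended.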
Update: the bug occurs on a 1080 Ti with the built-in GPU estimator. When I run the code on TPUs, everything is fine.

Hope this issue helps!
Do you experience this with TF 1.X?