Add gradient monitor callback #144
Conversation
Do you have an example? I tried to run this command, but I didn't see gradient logs in TensorBoard.
CUDA_VISIBLE_DEVICES=0 \
python -m mart \
experiment=CIFAR10_CNN_Adv \
fit=false \
trainer=gpu \
+trainer.limit_test_batches=1 \
+callbacks@model.modules.input_adv_test.callbacks=gradient_monitor \
+model.modules.input_adv_test.callbacks.gradient_monitor.frequency=1
Update: patch and example in #164
There are no parameters in an Adversary ;) This is why I think it's better to modify the state_dict rather than playing tricks by hiding parameters in lists.
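As a hypothetical illustration of the distinction being made here (this is not MART code, and the class names and shapes are made up): a tensor hidden inside a plain Python list is invisible to named_parameters() and state_dict(), whereas state registered on the module shows up in state_dict() and can be saved, loaded, and inspected normally.

```python
# Hypothetical sketch only -- not how MART's Adversary is implemented.
import torch
import torch.nn as nn


class HiddenInList(nn.Module):
    """Hides the perturbation inside a plain Python list, so PyTorch never
    registers it: it is missing from named_parameters() and state_dict()."""

    def __init__(self, shape=(3, 32, 32)):
        super().__init__()
        self._hidden = [nn.Parameter(torch.zeros(shape))]

    def forward(self, x):
        return x + self._hidden[0]


class ExposedViaStateDict(nn.Module):
    """Registers the perturbation as a buffer instead, so it appears in
    state_dict() without being reported as a trainable parameter."""

    def __init__(self, shape=(3, 32, 32)):
        super().__init__()
        self.register_buffer("perturbation", torch.zeros(shape))

    def forward(self, x):
        return x + self.perturbation


# Quick check of the difference:
hidden, exposed = HiddenInList(), ExposedViaStateDict()
assert len(hidden.state_dict()) == 0
assert "perturbation" in exposed.state_dict()
```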
Resolved in #160
LGTM
What does this PR do?
This adds a gradient monitor callback so that one can monitor the norms of different gradients.
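For a rough idea of what such a callback might look like (a minimal sketch, not the implementation in this PR; only the frequency option is taken from the command above, and the class name and norm_type argument are assumptions), a PyTorch Lightning Callback can log per-parameter gradient norms right after the backward pass:

```python
# Minimal sketch of a gradient monitor callback (assumed, not MART's actual code).
import torch
from pytorch_lightning import Callback


class GradientMonitor(Callback):
    def __init__(self, frequency: int = 1, norm_type: float = 2.0):
        self.frequency = frequency
        self.norm_type = norm_type

    def on_after_backward(self, trainer, pl_module):
        # Only log every `frequency` steps to keep logging overhead low.
        if trainer.global_step % self.frequency != 0:
            return
        for name, param in pl_module.named_parameters():
            if param.grad is not None:
                norm = param.grad.detach().norm(self.norm_type)
                pl_module.log(f"grad_norm/{name}", norm)
```

The logged values then show up in whatever logger the Trainer is configured with (e.g. TensorBoard), grouped under grad_norm/.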
Type of change
Please check all relevant options.
Testing
Please describe the tests that you ran to verify your changes. Consider listing any relevant details of your test configuration.
Before submitting
Run the pre-commit run -a command without errors.
Did you have fun?
Make sure you had fun coding 🙃