Make training args fully immutable #25435
Conversation
src/transformers/training_args.py (Outdated)

```python
def __setattr__(self, name, value):
    # Once fully through the `__post_init__`, `TrainingArguments` are immutable
    if getattr(self, "_frozen", False):
```
Since adding `_frozen` to the dataclass args earlier would make it show up as an available option to pass through, we don't set it until the very end of `__post_init__`, hence the `getattr` instead of setting it earlier.
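The freeze-after-init pattern described above can be sketched as a minimal standalone example. The class and field names here are hypothetical stand-ins, not the actual `TrainingArguments` implementation:

```python
from dataclasses import dataclass


@dataclass
class FrozenAfterInit:
    # Hypothetical stand-in field for illustration
    learning_rate: float = 5e-5

    def __post_init__(self):
        # ... validation and derived fields would run here ...
        # Set the sentinel last, and never declare it as a dataclass
        # field: declaring it would surface it as a constructor argument.
        self._frozen = True

    def __setattr__(self, name, value):
        # During __init__ the sentinel does not exist yet, hence the
        # getattr with a False default: early assignments still succeed.
        if getattr(self, "_frozen", False):
            raise AttributeError(f"Cannot set `{name}`; instance is frozen after init")
        super().__setattr__(name, value)


args = FrozenAfterInit()
try:
    args.learning_rate = 1e-4  # raises AttributeError: instance is frozen
except AttributeError as e:
    print(e)
```

Setting `self._frozen = True` inside `__post_init__` itself passes through `__setattr__`, but at that moment `getattr(self, "_frozen", False)` still returns `False`, so the assignment is allowed; every assignment afterwards raises.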
The documentation is not available anymore as the PR was closed or merged.
Nice! Now to see what breaks with this 😅
Nice ❄️ !
```diff
@@ -205,7 +205,7 @@ def compute_metrics(p: EvalPrediction) -> Dict:
         logger.error(p.metrics)
         exit(1)

-    trainer.args.eval_accumulation_steps = 2
+    trainer.args._set_value("eval_accumulation_steps", 2)
```
It's better to create a new set of training args here. People sometimes look for inspiration in our tests and we definitely don't want to advertise that method.
Done. There's also a method in `dataclasses` that lets us change frozen params the proper way, by creating a new set of args with the overrides applied.
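The `dataclasses` helper alluded to here is presumably `dataclasses.replace`, which builds a new instance with selected fields overridden instead of mutating the frozen one. A minimal sketch, with illustrative field names rather than the real `TrainingArguments`:

```python
import dataclasses


@dataclasses.dataclass(frozen=True)
class Args:
    # Illustrative fields, not the real TrainingArguments
    eval_accumulation_steps: int = 1
    learning_rate: float = 5e-5


args = Args()
# dataclasses.replace calls the class constructor again (including
# __post_init__ if defined), so any validation is re-applied to the
# new instance; the original stays untouched.
new_args = dataclasses.replace(args, eval_accumulation_steps=2)
print(new_args.eval_accumulation_steps)  # 2
print(args.eval_accumulation_steps)      # 1 (unchanged)
```

This is why it reads better in tests than a private setter: it produces a fresh, fully validated args object rather than advertising a backdoor around immutability.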
Force-pushed from 3e3e1ee to 02c2b62
@sgugger can you give it one final look please 😄
Thanks, looking great!
* Make training args fully immutable
* Working tests, PyTorch
* In test_trainer
* during testing
* Use proper dataclass way
* Fix test
* Another one
* Fix tf
* Lingering slow
* Exception
* Clean
What does this PR do?
This PR ensures that `TrainingArguments` is a fully immutable dataclass after `__post_init__` has been run. We'll find that the tests suddenly fail now 😉 Should be merged after #25390.
Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
@amyeroberts @sgugger