src/transformers/training_args.py
```python
def __setattr__(self, name, value):
    # Once fully through the `__post_init__`, `TrainingArguments` are immutable
    if getattr(self, "_frozen", False):
```
Since adding `_frozen` to the dataclass args earlier would make it show up as an available option to pass through, we don't set it until the very end of `__post_init__`, hence the `getattr` instead of setting it earlier.
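The pattern described above can be sketched as follows. This is a minimal illustration, not the actual `transformers` implementation; the `Args` class and its `lr` field are made up for the example:

```python
from dataclasses import dataclass


@dataclass
class Args:
    lr: float = 1e-3

    def __post_init__(self):
        # Validation and derived fields would run here first.
        # `_frozen` is set last (and not declared as a dataclass field),
        # so the assignments above still go through.
        self._frozen = True

    def __setattr__(self, name, value):
        # Once fully through `__post_init__`, the instance is immutable.
        # `getattr` with a default is needed because `_frozen` does not
        # exist yet while `__init__` is still assigning fields.
        if getattr(self, "_frozen", False):
            raise AttributeError(f"cannot assign to field '{name}'")
        super().__setattr__(name, value)


args = Args()
try:
    args.lr = 5e-4
except AttributeError:
    print("frozen")  # mutation after __post_init__ is rejected
```

Because `_frozen` is an instance attribute rather than a declared field, it never appears in the generated `__init__` signature, which is the point of the comment above.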
The documentation is not available anymore as the PR was closed or merged.
sgugger left a comment
Nice! Now to see what breaks with this 😅
```diff
     exit(1)
-    trainer.args.eval_accumulation_steps = 2
+    trainer.args._set_value("eval_accumulation_steps", 2)
```
It's better to create a new set of training args here. People sometimes look for inspiration in our tests and we definitely don't want to advertise that method.
Done. There's also a method in `dataclasses` that lets us change frozen params the proper way, creating a new set of args with the overrides applied.
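For reference, here is a small sketch of the standard-library approach mentioned above, using `dataclasses.replace` to derive a new instance from a frozen one instead of mutating it. The `Config` class and its fields are illustrative, not the real `TrainingArguments`:

```python
from dataclasses import dataclass, replace


@dataclass(frozen=True)
class Config:
    eval_accumulation_steps: int = 1
    lr: float = 1e-3


base = Config()

# `replace` builds a brand-new frozen instance with the given overrides,
# leaving the original untouched.
updated = replace(base, eval_accumulation_steps=2)

print(updated.eval_accumulation_steps)  # 2
print(base.eval_accumulation_steps)     # 1
```

This is the pattern tests can use: construct a fresh set of args with the overrides rather than assigning to fields on an existing instance.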
Force-pushed from 3e3e1ee to 02c2b62
@sgugger can you give it one final look please 😄
What does this PR do?
This PR ensures that the `TrainingArguments` are a fully immutable dataclass after `__post_init__` has run. We'll find that the tests suddenly fail now 😉 Should be merged after #25390
Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
@amyeroberts @sgugger