precision parameter in Trainer does not accept "16" passed as str #16011
Comments
I guess this is due to this line:
It should probably be something like:
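To illustrate the inconsistency the report describes, here is a minimal, hypothetical sketch (not Lightning's actual source) of a membership check that would accept both the int and str spellings:

```python
# Hypothetical sketch of a precision check that accepts both int and
# str spellings (not Lightning's actual source code).
def normalize_precision(precision):
    allowed = {"16": 16, "32": 32, "64": 64, "bf16": "bf16"}
    key = str(precision)  # coerce so 16 and "16" hit the same entry
    if key not in allowed:
        raise ValueError(
            f"precision must be one of {sorted(allowed)}, got {precision!r}"
        )
    return allowed[key]

print(normalize_precision(16))      # 16
print(normalize_precision("16"))    # 16
print(normalize_precision("bf16"))  # bf16
```

Coercing with `str()` keeps a single canonical table instead of listing every int/str pair in the allowed set.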
Hi @pkubik The error message can be misleading, I agree. We can consider changing it to remove the quotes.
What library are you working with? Another related issue that proposes an overhaul: #9956
for consistency I would extend support for
We actually tried to use this with hydra, with the schema defined in a dataclass. We could probably find some nasty workaround. I do not claim that my case is sufficient justification to make a decision here :). I just saw this inconsistency with the error message and thought that accepting
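The hydra limitation mentioned above can be sketched with a plain dataclass schema (hypothetical example, not the reporter's actual config; at the time of the report, structured-config frameworks generally required one concrete type per field, so a Union could not be declared):

```python
from dataclasses import dataclass

# Hypothetical schema illustrating the limitation: with one concrete
# type per field, precision cannot be declared as Union[int, str].
# Declared as str, the value reaches Trainer as "16" and is rejected;
# declared as int, "bf16" becomes impossible to express.
@dataclass
class TrainerConfig:
    precision: str = "16"

cfg = TrainerConfig()
print(type(cfg.precision).__name__, cfg.precision)  # str 16
```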
Yes. A contribution is welcome!
@pkubik The string value is now also supported as input (with the next release 1.9.0).
Bug description
When you try to create a Trainer with a wrong value for precision, it lists the correct values as:

This indicates that the values are supposed to be strings. However, when I later pass "16" as a value, it fails with:

Only the value 16 passed as an integer works right now. This causes problems with some configuration frameworks and hyperparameter optimization libraries, since they often do not work with unions.
How to reproduce the bug
Error messages and logs
Environment
More info
Also checked on the newest pytorch_lightning, 1.8.4.post0. It works fine for the following snippet:
cc @Borda @carmocca @justusschock @awaelchli