
[bug] Pyro is not wrapping LR schedulers for pytorch 2.0 #3200

Closed
sjfleming opened this issue Apr 23, 2023 · 2 comments

sjfleming commented Apr 23, 2023

Issue Description

With PyTorch 2.0, pyro.optim no longer appears to wrap the PyTorch learning rate schedulers in PyroLRScheduler and expose them in the pyro.optim namespace.

from pyro import optim
optim.ExponentialLR
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
AttributeError: module 'pyro.optim' has no attribute 'ExponentialLR'

The same error occurs with:

import pyro.optim
pyro.optim.ExponentialLR
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
AttributeError: module 'pyro.optim' has no attribute 'ExponentialLR'

Environment

  • OS: Mac Monterey 12.6.3
  • PyTorch version: 2.0.0
  • Pyro version: 1.8.4
  • Python 3.8 and 3.9 (did not try higher)

Code Snippet

import pyro.optim
pyro.optim.ExponentialLR
@fritzo added the bug label Apr 23, 2023
@sjfleming (Contributor, Author) commented:
Oops, looks like @eb8680 already took care of this. Thanks for pointing that out @ordabayevy .

@sjfleming (Contributor, Author) commented:
Closed by #3167
