torch.clamp issue due to update of PyTorch 1.12.0 #237
CryptoSalamander added a commit to CryptoSalamander/Swin-Transformer that referenced this issue on Jul 14, 2022.

CryptoSalamander commented: I have simply solved it as follows:
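A minimal sketch of this kind of fix, assuming the usual Swin V2 logit_scale clamp; the exact code in the referenced commit may differ, and the parameter name here is a stand-in rather than a quote from the repository:

```python
# Sketch only: not necessarily the commit referenced above.
# The idea is to keep torch.clamp's max argument compatible with the input's
# device, either by passing a plain Python number or by building the bound
# tensor on the parameter's own device.
import math

import torch
import torch.nn as nn

# Hypothetical stand-in for Swin V2's per-head logit scale parameter.
logit_scale = nn.Parameter(torch.log(10 * torch.ones(1)))

# Option 1: a Python scalar never triggers the device check.
scale = torch.clamp(logit_scale, max=math.log(1.0 / 0.01)).exp()

# Option 2: create the bound tensor directly on the parameter's device.
bound = torch.log(torch.tensor(1.0 / 0.01, device=logit_scale.device))
scale = torch.clamp(logit_scale, max=bound).exp()
```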
CryptoSalamander added further commits to CryptoSalamander/Swin-Transformer that referenced this issue on Apr 11, 2023.
Since torch.clamp was updated in 1.12.0, the latest version of PyTorch, torch.clamp's min and max arguments must be on the same device as the input tensor (see pytorch/pytorch#77035). I got an error with PyTorch 1.12.0 on this line:

Swin-Transformer/models/swin_transformer_v2.py, line 156 (commit b720b41)

Error:

In 1.11.0 this line works without problems because there was no argument type promotion before 1.12.0, but now I guess it should be fixed.
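For context, a minimal reproduction sketch of the device mismatch, assuming a CUDA device is available; the log(1. / 0.01) bound mirrors my reading of the clamp on that line and is not quoted from the issue:

```python
# Reproduction sketch of the device mismatch introduced by clamp's
# argument promotion in PyTorch 1.12 (requires a CUDA device).
import torch

x = torch.randn(4, device="cuda")            # input tensor on the GPU
bound = torch.log(torch.tensor(1.0 / 0.01))  # bound tensor created on the CPU

# PyTorch 1.11: accepted without complaint.
# PyTorch 1.12: raises a RuntimeError because min/max tensors must now live
# on the same device as the input.
try:
    y = torch.clamp(x, max=bound)
except RuntimeError as err:
    print(err)

# Moving the bound onto the input's device restores the old behaviour.
y = torch.clamp(x, max=bound.to(x.device))
```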