
fix soft inpainting on mps and xpu, torch_utils.float64 #15815

Merged
merged 3 commits into dev from torch-float64-or-float32 on Jun 8, 2024

Conversation

w-e-w
Collaborator

@w-e-w commented on May 16, 2024

Description

torch_utils.float64
Returns torch.float64 or torch.float32 based on torch.Tensor.device.type.

Not sure if this is the best way to solve this; ideally there should be a way to solve it globally.

Note: logically this should work, but as I don't have an mps or xpu device, this is technically not fully tested.
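A minimal sketch of what such a helper could look like, based on the behavior described above (the actual code in modules/torch_utils may differ in details):

```python
import torch


def float64(t: torch.Tensor) -> torch.dtype:
    """Return torch.float64, unless the tensor lives on a device without
    float64 support (mps, xpu), in which case return torch.float32."""
    if t.device.type in ("mps", "xpu"):
        return torch.float32
    return torch.float64
```

A call site in soft inpainting would then request the widest supported dtype instead of hard-coding float64, e.g. something like `mask.to(dtype=float64(mask))` (hypothetical usage example).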

Checklist:

w-e-w added 3 commits on May 16, 2024 23:16
return torch.float64 if device is not mps or xpu, else return torch.float32
@w-e-w requested a review from AUTOMATIC1111 as a code owner on May 16, 2024 14:32
@w-e-w changed the title from "Torch float64 or float32" to "fix soft inpainting on mps and xpu, torch_utils.float64" on May 16, 2024
@AUTOMATIC1111 merged commit b4723bb into dev on Jun 8, 2024
6 checks passed
@AUTOMATIC1111 deleted the torch-float64-or-float32 branch on June 8, 2024 08:07
@lawchingman mentioned this pull request on Oct 5, 2024