
Error with the operator 'aten::_linalg_solve_ex.result' in the aug_transforms function in the Data Augmentation section #588

Open

Akhil-Raj opened this issue Jun 5, 2023 · 1 comment

@Akhil-Raj

https://github.com/fastai/fastbook/blob/823b69e00aa1e1c1a45fe88bd346f11e8f89c1ff/02_production.ipynb#LL929C5-L931C58

In the lines highlighted above (see the 'Code' tab of the .ipynb), I was getting this error:

NotImplementedError: The operator 'aten::_linalg_solve_ex.result' is not currently implemented for the MPS device. If you want this op to be added in priority during the prototype phase of this feature, please comment on pytorch/pytorch#77764. As a temporary fix, you can set the environment variable PYTORCH_ENABLE_MPS_FALLBACK=1 to use the CPU as a fallback for this op. WARNING: this will be slower than running natively on MPS.

A temporary solution is also mentioned in this fast.ai forum thread: https://forums.fast.ai/t/lesson-2-troubleshoot-macbook-m1-issue/105584

Is there any permanent solution to it?
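
For reference, here is a minimal sketch of that temporary workaround, assuming the environment variable is set at the very top of the notebook, before torch or fastai is imported:

```python
import os

# Temporary workaround from the error message above: allow unsupported MPS ops
# to fall back to the CPU. This must be set before torch is first imported.
os.environ["PYTORCH_ENABLE_MPS_FALLBACK"] = "1"

from fastai.vision.all import *  # imports torch with the fallback enabled

# ...continue with the notebook's DataBlock / aug_transforms cells as usual.
```

As the error message warns, ops handled by the fallback run on the CPU and will be slower than running natively on MPS.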

Akhil-Raj added a commit to Akhil-Raj/fastbook that referenced this issue Jun 5, 2023
…transforms function in the Data Augmentation

Fixes fastai#588
@gamedevCloudy

The issue seems to be in the Warp() transform in fastai.vision.augment.

Hence the workaround: the other augmentations work if you create a custom list of batch transforms (custom_augments) that does not include Warp.

Including Warp() in the custom augmentation list gives the same error.
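
A minimal sketch of that workaround, assuming the `bears` DataBlock and `path` defined earlier in 02_production.ipynb: passing max_warp=0. asks aug_transforms not to add the Warp transform (per fastai's aug_transforms signature), which is what appears to trigger the unsupported MPS op.

```python
from fastai.vision.all import *

# Workaround sketch: keep the flip/rotate/zoom/lighting augmentations from
# aug_transforms but set max_warp=0. so the Warp transform (which hits
# aten::_linalg_solve_ex.result on MPS) is not included.
# `bears` and `path` are assumed to come from earlier cells of 02_production.ipynb.
bears = bears.new(
    item_tfms=RandomResizedCrop(224, min_scale=0.5),
    batch_tfms=aug_transforms(mult=2, max_warp=0.),
)
dls = bears.dataloaders(path)
dls.train.show_batch(max_n=8, nrows=2, unique=True)
```

This trades the perspective-warp augmentation for MPS compatibility; whether that noticeably affects accuracy on the bear classifier has not been verified here.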
