
xformers 0.0.20 memory efficient attention cutlass backward non deterministic #1128

Bavesh-B opened this issue Oct 13, 2024 · 0 comments
❓ Questions and Help

In xformers 0.0.20, I am getting the runtime error: `mem_efficient_attention_backward_cutlass does not have a deterministic implementation`. How do I solve this, as I need determinism?
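Not an official fix, but one common workaround (assuming the goal is a reproducible backward pass rather than the cutlass kernel specifically) is to route attention through PyTorch's own `scaled_dot_product_attention`, whose math backend has a deterministic backward, while keeping `torch.use_deterministic_algorithms(True)` active so any remaining non-deterministic op errors out loudly. A minimal sketch with hypothetical shapes:

```python
import torch
import torch.nn.functional as F

# Fail fast if any op in the autograd graph lacks a deterministic implementation
# (this is the same flag that triggers the RuntimeError quoted above).
torch.use_deterministic_algorithms(True)

def attn_grad(q, k, v):
    # PyTorch's built-in SDPA; on CPU (and with the math backend on GPU)
    # its backward is deterministic, unlike the xformers cutlass backward.
    out = F.scaled_dot_product_attention(q, k, v)
    out.sum().backward()
    return q.grad.clone()

torch.manual_seed(0)
q = torch.randn(2, 4, 8, 16, requires_grad=True)  # (batch, heads, seq, dim) -- arbitrary demo shapes
k = torch.randn(2, 4, 8, 16)
v = torch.randn(2, 4, 8, 16)

g1 = attn_grad(q, k, v)
q.grad = None
g2 = attn_grad(q, k, v)  # identical inputs should yield bit-identical grads
```

The trade-off is speed: the deterministic math path is slower than the fused cutlass kernel, so this is only worth it when exact reproducibility matters more than throughput.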

bertmaher pushed a commit to bertmaher/xformers that referenced this issue Dec 20, 2024
…o & check pt FA compatibility at runtime (facebookresearch#1128)

* add torch_fa_switch to metadata info

* check if efficient_attention_forward_cutlass op is defined before looking for an implementation & refactor and reuse the is_pt_flash_compatible function

* fix _cpp_lib black linter & fix flake8 for flash.py

* added back _get_flash_version check

* use os.path.join for pt_attn_compat_file_path build

* simplified code

* fix mypy linter

* check PT cutlass compatibility at import time