❓ Questions and Help

In xformers 0.0.20, I am getting the runtime error: `mem_efficient_attention_backward_cutlass does not have a deterministic implementation`. How do I solve this, as I want determinism?
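This error typically appears when `torch.use_deterministic_algorithms(True)` is active and autograd reaches the cutlass backward kernel, which has no deterministic implementation. Below is a minimal sketch of two common workarounds, assuming standard PyTorch APIs; the fallback attention function is a hypothetical helper, not part of xformers:

```python
import torch

# Option 1: keep determinism checks enabled, but downgrade ops that lack
# a deterministic kernel (such as mem_efficient_attention_backward_cutlass)
# from a hard error to a warning. Note their backward remains
# non-deterministic.
torch.use_deterministic_algorithms(True, warn_only=True)

# Option 2 (hypothetical fallback, not an xformers API): avoid the
# cutlass kernel entirely and compute attention with plain matmul/softmax,
# which only uses deterministic kernels. Slower and more memory-hungry
# than xformers' memory_efficient_attention.
def deterministic_attention(q, k, v):
    scale = q.shape[-1] ** -0.5
    attn = torch.softmax(q @ k.transpose(-2, -1) * scale, dim=-1)
    return attn @ v
```

Whether the warning in option 1 is acceptable depends on how strict your reproducibility requirements are; option 2 trades speed and memory for full determinism.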
bertmaher pushed a commit to bertmaher/xformers that referenced this issue on Dec 20, 2024:

…o & check pt FA compatibility at runtime (facebookresearch#1128)
* add torch_fa_switch to metadata info
* check if efficient_attention_forward_cutlass op is defined before looking for an implementation & refactor and reuse the is_pt_flash_compatible function
* fix _cpp_lib black linter & fix flake8 for flash.py
* added back _get_flash_version check
* use os.path.join for pt_attn_compat_file_path build
* simplified code
* fix mypy linter
* check PT cutlass compatibility at import time