I always encounter this error after upgrading flash attention. However, I noted from the wheel that you have bumped the torch version to 2.3.1.
Do you encounter this issue with flash attention on torch 2.3.1?
File "/home/remichu/miniconda3/envs/mlenv/lib/python3.11/site-packages/exllamav2/__init__.py", line 3, in <module>
from exllamav2.model import ExLlamaV2
File "/home/remichu/miniconda3/envs/mlenv/lib/python3.11/site-packages/exllamav2/model.py", line 41, in <module>
from exllamav2.attn import ExLlamaV2Attention, has_flash_attn, has_xformers
File "/home/remichu/miniconda3/envs/mlenv/lib/python3.11/site-packages/exllamav2/attn.py", line 30, in <module>
import flash_attn
File "/home/remichu/miniconda3/envs/mlenv/lib/python3.11/site-packages/flash_attn/__init__.py", line 3, in <module>
from flash_attn.flash_attn_interface import (
File "/home/remichu/miniconda3/envs/mlenv/lib/python3.11/site-packages/flash_attn/flash_attn_interface.py", line 10, in <module>
import flash_attn_2_cuda as flash_attn_cuda
ImportError: /home/remichu/miniconda3/envs/mlenv/lib/python3.11/site-packages/flash_attn_2_cuda.cpython-311-x86_64-linux-gnu.so: undefined symbol: _ZN3c104cuda9SetDeviceEi
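For context, `_ZN3c104cuda9SetDeviceEi` demangles to `c10::cuda::SetDevice(int)`, a symbol exported by libtorch. An undefined-symbol ImportError like this usually means the `flash_attn_2_cuda` extension was compiled against a different torch ABI than the torch currently installed. A minimal sketch to check whether the installed versions line up, without importing `flash_attn` itself (which would just reproduce the ImportError above); the version strings in the comments are illustrative, not from this environment:

```python
# Compare the running torch build against the installed flash-attn wheel.
import torch
from importlib.metadata import version

print("torch:", torch.__version__)           # e.g. "2.3.1+cu121" (illustrative)
print("torch CUDA:", torch.version.cuda)     # CUDA toolkit torch was built with
print("flash-attn:", version("flash-attn"))  # installed flash-attn wheel version
```

If the flash-attn wheel was built against a different torch (or CUDA) version than the one reported here, installing a wheel that matches the current torch, or rebuilding flash-attn from source against it, is the usual fix.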