
Issue with flash attention when upgrading to torch 2.3.1 #522

Closed
remichu-ai opened this issue Jun 27, 2024 · 1 comment

@remichu-ai

I always encounter this error after upgrading flash attention. However, I noticed from the wheel that you have bumped the torch version to 2.3.1.

Do you encounter this issue with flash attention on torch 2.3.1?

  File "/home/remichu/miniconda3/envs/mlenv/lib/python3.11/site-packages/exllamav2/__init__.py", line 3, in <module>
    from exllamav2.model import ExLlamaV2
  File "/home/remichu/miniconda3/envs/mlenv/lib/python3.11/site-packages/exllamav2/model.py", line 41, in <module>
    from exllamav2.attn import ExLlamaV2Attention, has_flash_attn, has_xformers
  File "/home/remichu/miniconda3/envs/mlenv/lib/python3.11/site-packages/exllamav2/attn.py", line 30, in <module>
    import flash_attn
  File "/home/remichu/miniconda3/envs/mlenv/lib/python3.11/site-packages/flash_attn/__init__.py", line 3, in <module>
    from flash_attn.flash_attn_interface import (
  File "/home/remichu/miniconda3/envs/mlenv/lib/python3.11/site-packages/flash_attn/flash_attn_interface.py", line 10, in <module>
    import flash_attn_2_cuda as flash_attn_cuda
ImportError: /home/remichu/miniconda3/envs/mlenv/lib/python3.11/site-packages/flash_attn_2_cuda.cpython-311-x86_64-linux-gnu.so: undefined symbol: _ZN3c104cuda9SetDeviceEi

@remichu-ai
Author

I resolved it by running the following commands, though I have no clue how they work:

pip uninstall flash-attn
FLASH_ATTENTION_FORCE_BUILD=TRUE pip install flash-attn

I found this instruction in this thread:
oobabooga/text-generation-webui#4182
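For context, an `undefined symbol` ImportError from `flash_attn_2_cuda` typically means the prebuilt flash-attn wheel was compiled against a different torch ABI than the one installed (the symbol here demangles to `c10::cuda::SetDevice(int)` from libtorch); `FLASH_ATTENTION_FORCE_BUILD=TRUE` forces a source build linked against your actual torch. A minimal, hypothetical sketch (not from exllamav2's source) of how a caller can degrade gracefully when the import is broken, mirroring the `has_flash_attn` flag visible in the traceback:

```python
# Guard flash-attn usage with an import check. A torch/flash-attn ABI
# mismatch surfaces here as an ImportError ("undefined symbol: ..."),
# not at install time, so the check must wrap the import itself.
try:
    import flash_attn  # noqa: F401
    has_flash_attn = True
except ImportError as exc:
    # Fall back to a default attention path instead of crashing at import.
    has_flash_attn = False
    print(f"flash-attn unavailable, falling back: {exc}")

print("flash attention enabled:", has_flash_attn)
```

Note that `importlib.util.find_spec("flash_attn")` is not a sufficient check: it only confirms the package exists on disk, while the undefined-symbol failure only occurs when the compiled extension is actually loaded.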
