
ImportError: flash_attn not found when running grounded_sam2_florence2_image_demo.py #35

XiaowenZhang-kuku opened this issue Aug 25, 2024 · 6 comments


@XiaowenZhang-kuku

Hello,

Thank you for your excellent work. When I try to run the following command:

```
python grounded_sam2_florence2_image_demo.py --pipeline object_detection_segmentation --image_path ./notebooks/images/cars.jpg
```

it gives me this error:

```
ImportError: This modeling file requires the following packages that were not found in your environment: flash_attn. Run pip install flash_attn
```

My environment is as follows:
CUDA=12.1
torch=2.3.1
transformers=4.33.2
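
In case it is useful, here is a minimal sketch that prints the exact torch/CUDA combination any flash-attn build would have to match (the comments show my values):

```python
import torch

# A flash-attn wheel/build has to match both of these.
print("torch:", torch.__version__)                     # 2.3.1 here
print("CUDA (torch built with):", torch.version.cuda)  # 12.1 here
print("CUDA available:", torch.cuda.is_available())
```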

Could you provide guidance on how to resolve this error?

@XiaowenZhang-kuku (Author)

I tried to manually install flash-attention from GitHub releases, but I couldn't find a version that is compatible with my CUDA and PyTorch setup.
I installed several versions of flash-attention, but the error persisted: ImportError: This modeling file requires the following packages that were not found in your environment: flash_attn.
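
As far as I understand, transformers' remote-code loader imports every listed dependency and reports any ImportError as "not found", so the package can be installed and still trigger this message. A minimal sketch to surface the real import failure:

```python
import traceback

try:
    import flash_attn
    print("flash_attn imported OK, version:", flash_attn.__version__)
except ImportError:
    # transformers reports this same failure as "package not found",
    # hiding the underlying cause (often a CUDA/torch ABI mismatch).
    traceback.print_exc()
```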

@rentainhe (Collaborator) commented Aug 25, 2024

Have you tried pip install flash_attn, as the error message suggests? I think it will automatically detect your environment and choose a suitable version.

@XiaowenZhang-kuku (Author)

> Have you tried pip install flash_attn, as the error message suggests? I think it will automatically detect your environment and choose a suitable version.

Yes, that was my first attempt. However, the error still exists.

@rentainhe (Collaborator)

>> Have you tried pip install flash_attn, as the error message suggests? I think it will automatically detect your environment and choose a suitable version.
>
> Yes, that was my first attempt. However, the error still exists.

Do you mean that you successfully ran pip install flash_attn but still get the same package-not-found error, or that you could not install flash_attn at all?

@XiaowenZhang-kuku (Author)

> you successfully ran pip install flash_attn but still get the same package-not-found error

I successfully ran pip install flash_attn, but I still get the same package-not-found error.
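
For what it's worth, "installed" and "importable" can differ; a minimal sketch to tell them apart (note the distribution name is flash-attn, the module name is flash_attn):

```python
from importlib import metadata, util

# Installed according to pip metadata? (raises PackageNotFoundError if not)
print("installed version:", metadata.version("flash-attn"))

# Resolvable as an importable module? (None means Python cannot find it)
print("module spec:", util.find_spec("flash_attn"))
```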

@rentainhe (Collaborator) commented Aug 26, 2024

Hi @XiaowenZhang-kuku, I think there are some issues in the official flash_attn repo describing the same situation as yours. You can refer to Dao-AILab/flash-attention#453 to see if it helps with your problem.
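
If the mismatch cannot be fixed, there is also a workaround often shared for Florence-2's flash_attn requirement: filter flash_attn out of the remote modeling file's import check and fall back to standard attention. This is only a sketch, not part of this repo; it assumes a transformers version that supports the attn_implementation kwarg, that the remote file is named modeling_florence2.py, and uses microsoft/Florence-2-large as a stand-in for whichever checkpoint the demo actually loads:

```python
from unittest.mock import patch

from transformers import AutoModelForCausalLM, AutoProcessor
from transformers.dynamic_module_utils import get_imports


def patched_get_imports(filename):
    # Drop flash_attn from the remote Florence-2 modeling file's import list
    # so that transformers' availability check does not fail on it.
    imports = get_imports(filename)
    if str(filename).endswith("modeling_florence2.py"):
        imports = [imp for imp in imports if imp != "flash_attn"]
    return imports


with patch("transformers.dynamic_module_utils.get_imports", patched_get_imports):
    # "microsoft/Florence-2-large" is an assumed checkpoint; use the one
    # the demo script actually loads.
    model = AutoModelForCausalLM.from_pretrained(
        "microsoft/Florence-2-large",
        trust_remote_code=True,
        attn_implementation="eager",  # avoid the flash-attention code path
    )
    processor = AutoProcessor.from_pretrained(
        "microsoft/Florence-2-large", trust_remote_code=True
    )
```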
