Can't install latest flashinfer for torch 2.5 #692

Open
sleepwalker2017 opened this issue Dec 23, 2024 · 3 comments
sleepwalker2017 commented Dec 23, 2024

I use the command pip install flashinfer -i https://flashinfer.ai/whl/cu124/torch2.4/ but it only installs v0.1.6. How can I install the latest version?

The torch version is forced by vllm to be 2.5.1.
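For reference, a minimal sketch of what the install could look like if wheels for torch 2.5 are published under the same index layout (the cu124/torch2.5 path below is an assumption based on the torch2.4 URL above, not something confirmed in this thread):

pip install flashinfer -i https://flashinfer.ai/whl/cu124/torch2.5/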


Narsil commented Jan 22, 2025

Is this fixed? It seems the wheels haven't been uploaded yet, no?

https://flashinfer.ai/whl/cu124/


zwhe99 commented Jan 23, 2025

Same here.


DeJayDev commented Feb 5, 2025

Looks like it was fixed by #694, and the wheels were only uploaded recently, maybe with the v0.2.0-post2 release?
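To confirm which version actually got installed after pointing pip at the index, the standard checks (plain pip/Python commands, not specific to flashinfer) are:

pip show flashinfer
python -c "import flashinfer; print(flashinfer.__version__)"  # assumes the package exposes __version__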
