[Bug]: the most recent xla nightly is breaking vllm on TPU #12451
Labels: bug (Something isn't working)

Comments
cc @lsy323

Hi @hosseinsarshar, the torch_xla nightly and torch nightly may not be compatible due to a C++ symbol issue. Please see these lines for details. If you uninstall torch 2.7 and install torch from …
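The quickest way to confirm such a mismatch is to try importing both packages: with incompatible builds, `import torch_xla` typically fails with an `ImportError` that mentions an undefined C++ symbol. The snippet below is a minimal probe sketched for illustration, not part of the original comment:

```python
# Minimal compatibility probe (sketch): importing torch_xla against a
# mismatched torch build usually raises an ImportError mentioning an
# undefined C++ symbol, which is the failure mode described above.
import torch

print("torch:", torch.__version__)

try:
    import torch_xla
    print("torch_xla:", torch_xla.__version__)
except ImportError as err:
    print("torch_xla failed to import (likely ABI mismatch):", err)
```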
Your current environment
The output of `python collect_env.py`
Model Input Dumps
No response
🐛 Describe the bug
The recent change in the 20250124 XLA nightly is causing vLLM to break.
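As a sanity check while debugging this, one can compare the date suffixes of the two installed nightlies. The sketch below assumes nightly version strings embed the build date as `.devYYYYMMDD` (e.g. `2.7.0.dev20250124`); the helper `nightly_date` is hypothetical and only for illustration:

```python
# Sketch: verify that the torch and torch_xla nightlies share a build date.
# Assumption: nightly version strings embed the date as ".devYYYYMMDD"
# (e.g. "2.7.0.dev20250124"); release builds simply won't match the pattern.
import re
from typing import Optional

import torch
import torch_xla


def nightly_date(version: str) -> Optional[str]:
    """Return the YYYYMMDD date embedded in a nightly version string, if any."""
    match = re.search(r"\.dev(\d{8})", version)
    return match.group(1) if match else None


torch_date = nightly_date(torch.__version__)
xla_date = nightly_date(torch_xla.__version__)
print(f"torch     : {torch.__version__} (nightly date: {torch_date})")
print(f"torch_xla : {torch_xla.__version__} (nightly date: {xla_date})")

if torch_date and xla_date and torch_date != xla_date:
    print("Warning: torch and torch_xla nightlies come from different dates; "
          "this is the kind of mismatch that triggers the C++ symbol error.")
```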
Before submitting a new issue...