Add Intel XPU device support to generate and serve #1361
Conversation
🔗 Helpful Links: 🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/torchchat/1361. Note: links to docs will display an error until the docs builds have been completed. ❗ 1 active SEV — if your PR is affected, please view it below. ✅ No failures as of commit 4d16351 with merge base 2cf1a17. This comment was automatically generated by Dr. CI and updates every 15 minutes.
Hi @jenniew! Thank you for your pull request and welcome to our community. Action Required: in order to merge any pull request (code, docs, etc.), we require contributors to sign our Contributor License Agreement, and we don't seem to have one on file for you. Process: in order for us to review and merge your suggested changes, please sign at https://code.facebook.com/cla. If you are contributing on behalf of someone else (e.g. your employer), the individual CLA may not be sufficient and your employer may need to sign the corporate CLA. Once the CLA is signed, our tooling will perform checks and validations, and the pull request will be tagged accordingly. If you have received this in error or have any questions, please contact us at cla@meta.com. Thanks!
Is there a way to run at least a few simple tests on an xpu to avoid inadvertent breakage?
For generate, a simple test can be run along the lines of the sketch below.
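A minimal sketch, assuming the torchchat CLI's generate subcommand and a `--device xpu` flag as added by this PR; the model alias and prompt are illustrative, not from this thread:

```bash
# Hypothetical smoke test for the generate path on an Intel XPU device
python3 torchchat.py generate llama3.1 --device xpu --prompt "write me a story about a boy and his bear"
```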
PYTORCH_NIGHTLY_VERSION=dev20241002
if [[ -x "$(command -v xpu-smi)" ]];
then
    PYTORCH_NIGHTLY_VERSION=dev20241001
Why does xpu need an older PYTORCH_NIGHTLY?
When installing torch==2.6.0.dev20241002 and torchvision==0.20.0.dev20241002+xpu, pip fails with:

ERROR: Cannot install torch==2.6.0.dev20241002 and torchvision==0.20.0.dev20241002+xpu because these package versions have conflicting dependencies.
The conflict is caused by:
    The user requested torch==2.6.0.dev20241002
    torchvision 0.20.0.dev20241002+xpu depends on torch==2.6.0.dev20241001

So for xpu, I changed the torch nightly version to dev20241001.
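For reference, a hedged sketch of the failing versus working installs; only the version pins come from the error above, and the XPU nightly index URL is an assumption:

```bash
# Fails: the +xpu torchvision nightly from 2024-10-02 pins the previous day's torch
pip install torch==2.6.0.dev20241002 torchvision==0.20.0.dev20241002+xpu \
    --index-url https://download.pytorch.org/whl/nightly/xpu

# Works: pin torch to the nightly that the +xpu torchvision actually depends on
pip install torch==2.6.0.dev20241001 torchvision==0.20.0.dev20241002+xpu \
    --index-url https://download.pytorch.org/whl/nightly/xpu
```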
Let me see if I can get you a fresher version on XPU; the torch/vision discrepancy shouldn't be a normal thing.
REQUIREMENTS_TO_INSTALL=(
    torch=="2.6.0.${PYTORCH_NIGHTLY_VERSION}"
    torchvision=="0.20.0.${VISION_NIGHTLY_VERSION}"
    torchtune=="0.3.1"
Context on the varying tune version?
The XPU nightly URL does not provide a nightly build of torchtune, so the 0.3.1 release is installed in the XPU environment instead.
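A minimal sketch of how that fallback could look in install_requirements.sh, reusing the xpu-smi detection from the fragment above; the variable name and the torchtune nightly pin in the non-XPU branch are purely illustrative assumptions:

```bash
# Hypothetical: fall back to the torchtune release on XPU, since the XPU
# nightly index does not publish torchtune nightlies
if [[ -x "$(command -v xpu-smi)" ]]; then
  TUNE_REQUIREMENT="torchtune==0.3.1"
else
  TUNE_REQUIREMENT="torchtune==0.4.0.${TUNE_NIGHTLY_VERSION}"  # illustrative nightly pin
fi
```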
Hmm, we should add support for nightly; let me ping some torchtune folks.
cc: @ebsmothers
Welcome to torchchat, and thanks for the addition, @jenniew!! Super stoked to see that it didn't require much lift to get XPU set up. I added a few questions on the versioning difference. What device did you test on, btw? Tagging a few folks whom I'm encouraging to take on a larger role.
@jenniew Also, do you mind filling out the CLA? It'll allow you to contribute to Meta repos.
Yes, I just signed the CLA.
I tested on an Intel Data Center GPU Max 1100.
Just an update:
It would be nice to run this as a test as well. It could be as easy as enabling and adding a runner for xpu to test-readme-pr.yml, if such a runner is available. Alternatively, it may require a copy of that file, because the test spec has several target-runner-related fields that would all need values representative of an xpu environment.
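Whatever shape the runner takes, a hedged sketch of a sanity check the job could run first; `torch.xpu.is_available()` is exposed by the XPU-enabled PyTorch builds, but the exact gating step here is an assumption, not part of this PR:

```bash
# Hypothetical gate: fail fast if the runner does not actually expose an XPU device
python3 -c "import torch; assert torch.xpu.is_available(), 'no XPU device visible'"
```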
Add XPU device support, excluding distributed mode, workflow, and documentation.