Adding support for Ascend NPU #372
Conversation
This PR was verified locally, and the corresponding test results are as follows:
the output log
Thanks! The PR looks good.
I left a few nits. The biggest blocker is the relatively odd cast to get the device back; if you have more details on why it's needed, that would be appreciated (I don't feel comfortable merging right now with this hack).
Force-pushed from f99398e to 38c761c
Hi @Narsil, thanks for your review. I've updated my code implementation. There are two main changes:
The updated test results are as follows:
Could you please take a second look and trigger CI? Thanks :)
LGTM, thanks for this!
What does this PR do?
Thanks for creating this library. In our effort to streamline Hugging Face support on Ascend devices, this PR is an important step.
What are Ascend NPU and torch_npu?
torch_npu is an officially recognized PyTorch integration plugin that enables the Ascend NPU to work with the PyTorch framework. Ref: Improved third-party device support.
cc @Narsil, @LysandreJik and @zhangsibo1129
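Part of the review discussion above concerns how a device argument is normalized and passed back. As a rough illustration only (not the PR's actual implementation), a loader supporting a plugin backend like torch_npu typically has to accept device specifiers such as "npu", "npu:1", or a bare integer and resolve them to a device type and index. The helper name below is hypothetical:

```python
def parse_device(device):
    # Hypothetical sketch: normalize a device argument such as "npu",
    # "npu:1", or a bare index like 0 into a (device_type, index) pair,
    # similar to how a tensor loader might dispatch to a backend.
    if isinstance(device, int):
        return ("npu", device)  # assume the plugin backend for bare indices
    if ":" in device:
        kind, idx = device.split(":", 1)
        return (kind, int(idx))
    return (device, 0)  # default to the first device of that type
```

With such a normalization step, strings coming from user code and integers coming from distributed launchers resolve to the same representation, which avoids the kind of ad-hoc cast flagged in the review.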