[Feature] No torch.sqrt support in Hidet? #386
Labels
enhancement
New feature or request
Comments
Hi @yaoyaoding, thanks for your kind reply! I will give it a try.
vadiklyutiy pushed a commit that referenced this issue on Dec 19, 2024:
In the gpt-neo model (related issue: CentML/hidet#338), torch.where accepts tensors with different dtypes. Added type casting to fix the above issue. --------- Co-authored-by: Zhumakhan <nazirzhumakhan@gmail.com>
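The fix described in the commit can be sketched independently of Hidet. The idea is to promote both branches of a `where` to a common dtype before selecting. Below is a minimal illustration using NumPy arrays as stand-ins for the tensors; the helper name `where_with_cast` is hypothetical and not part of Hidet's codebase:

```python
import numpy as np

def where_with_cast(cond, a, b):
    # Promote both branches to a common dtype before selecting,
    # mirroring the type-casting approach described in the commit.
    common = np.result_type(a, b)
    return np.where(cond, a.astype(common), b.astype(common))

cond = np.array([True, False, True])
a = np.array([1, 2, 3], dtype=np.int32)
b = np.array([0.5, 0.5, 0.5], dtype=np.float64)
out = where_with_cast(cond, a, b)
print(out.dtype)  # float64
print(out)        # [1.  0.5 3. ]
```

With explicit casting, both branches share a dtype when the selection happens, so an IR that requires matching operand types (as a compiler frontend typically does) no longer rejects the operation.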
I'm trying to optimize a `SelfAttention` module, but there is no support for the `torch.sqrt` function. The code is as follows (I use `torch.sqrt` in the `LayNorm` module):

The error information is as follows:

I'm wondering if there is any method to support the `torch.sqrt` function. I noticed that there is a relevant abstraction in `ir` for the `sqrt` function. However, the `sqrt` function in `hidet\python\hidet\ir\primitives\math.py` raises `NotImplementedError()`.
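Until `torch.sqrt` is mapped in the frontend, one common workaround is to express the square root through an operator the tracer does support, such as raising to the power 0.5. Whether Hidet traces `pow` in a given version is an assumption to verify; the equivalence itself is sketched below with NumPy arrays:

```python
import numpy as np

# sqrt(x) can be rewritten as x ** 0.5; the two are numerically equivalent
# for non-negative inputs, which holds for the variance term in LayerNorm.
x = np.array([1.0, 4.0, 9.0])
assert np.allclose(np.sqrt(x), x ** 0.5)
print(x ** 0.5)  # [1. 2. 3.]
```

In a module, this means replacing `torch.sqrt(var + eps)` with `(var + eps) ** 0.5`, which produces the same values while avoiding the unsupported function call.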