Torch ops:
layer_norm(Tensor input, SymInt[] normalized_shape, Tensor? weight=None, Tensor? bias=None, float eps=1e-05, bool cudnn_enable=True) -> Tensor
softmax.int(Tensor self, int dim, ScalarType? dtype=None) -> Tensor
Aten ops:
torch.ops.aten.layer_norm.default
torch.ops.aten._softmax.default
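For reference, here is a minimal pure-Python sketch of what the two requested ops compute, based on the schemas above. This is an illustration of the math only, not the PyTorch implementation, and it assumes a 1-D input normalized over its full length (the general ops accept arbitrary `normalized_shape` and `dim`):

```python
import math

def layer_norm(x, weight=None, bias=None, eps=1e-05):
    # Normalize over the whole (1-D) input, i.e. normalized_shape == [len(x)].
    mean = sum(x) / len(x)
    # LayerNorm uses the biased (population) variance.
    var = sum((v - mean) ** 2 for v in x) / len(x)
    y = [(v - mean) / math.sqrt(var + eps) for v in x]
    if weight is not None:  # optional elementwise affine scale
        y = [v * w for v, w in zip(y, weight)]
    if bias is not None:    # optional elementwise affine shift
        y = [v + b for v, b in zip(y, bias)]
    return y

def softmax(x):
    # Numerically stable softmax over a 1-D input (dim=0 for this sketch).
    m = max(x)
    exps = [math.exp(v - m) for v in x]
    total = sum(exps)
    return [e / total for e in exps]
```

A converter for these ops would need to reproduce exactly these semantics: mean/variance reduction plus affine transform for `aten.layer_norm.default`, and a max-shifted exponential normalization for `aten._softmax.default`.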