Softmax axis absent #466
Should it be:
- MLOperand softmax(MLOperand input);
+ MLOperand softmax(MLOperand input, unsigned long axis);
Ningxin: Indeed, I fixed my typo right before you wrote your comment 😅.
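For illustration, here is a hedged sketch of how a caller might use the proposed two-argument signature from JavaScript. Everything in it is an assumption made for this example: the variable names, the [2, 4] input shape, the axis value 1, and the `dataType`/`shape` descriptor fields (which follow the current editor's draft and may differ from the spec text at the time of this issue).

```ts
// Assumes a module context (top-level await) and WebNN support in the browser.
const context = await navigator.ml.createContext();
const builder = new MLGraphBuilder(context);

// A [2, 4] float32 input; softmax is applied along axis 1 (the proposed argument).
const logits = builder.input('logits', { dataType: 'float32', shape: [2, 4] });
const probabilities = builder.softmax(logits, 1);
const graph = await builder.build({ probabilities });
```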
inexorabletash added a commit to inexorabletash/webnn that referenced this issue on Apr 18, 2024:
Frameworks (TensorFlow, PyTorch, ONNX) all accept an axis parameter. Most backends also support an axis, or it can be emulated with a reshape. As @fdwr wrote: "So it's achievable in each backend... but it would move the pain from the caller down to where it can be handled efficiently." Fixes webmachinelearning#466
fdwr added a commit that referenced this issue on Apr 25, 2024:
* Add axis argument to softmax()

  Frameworks (TensorFlow, PyTorch, ONNX) all accept an axis parameter. Most backends also support an axis, or it can be emulated with a reshape. As @fdwr wrote: "So it's achievable in each backend... but it would move the pain from the caller down to where it can be handled efficiently." Fixes #466

* revert activation example to softmax
* validate softmax axis against inputs rank
* update TOC headers
* Update index.bs (Co-authored-by: Dwayne Robinson <dwayner@microsoft.com>)
* camelCase not snake_case
* Remove unnecessary condition
* Update index.bs (Co-authored-by: Dwayne Robinson <dwayner@microsoft.com>)
* Update index.bs (Co-authored-by: Dwayne Robinson <dwayner@microsoft.com>)
* Update index.bs (Co-authored-by: Dwayne Robinson <dwayner@microsoft.com>)
* Sketch of validation for activations
* For gru() and lstm(), calculate gate descriptor, validate activations with it
* fix some copy/pasta

Co-authored-by: Dwayne Robinson <dwayner@microsoft.com>
(raised by @Honry in review https://github.com/microsoft/onnxruntime/pull/17665/files)
TF/PT/ONNX all take an `axis` parameter:

- tf.nn.softmax
- torch.nn.Softmax
- Softmax-13

...but WebNN's `softmax` does not, making it challenging to implement a caller's softmax in terms of the function of the same name in WebNN. It is possible (see here) by bracketing it with transposes and reshapes, but those contortions are unfortunate, and they could be implemented more efficiently in the backend than in each framework.

So it's achievable in each backend, even without any changes to the DML/XNNPack APIs, but it would move the pain from the caller down to where it can be handled efficiently.
https://www.w3.org/TR/webnn/#api-mlgraphbuilder-softmax-method
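To make the transpose+reshape workaround above concrete, here is a hedged, untested sketch of emulating an axis-aware softmax on top of the 2-D-only `softmax()` using only the `transpose()` and `reshape()` builder methods. The function name `softmaxAlongAxis` and the explicit `inputShape` parameter are inventions for illustration, and WebNN IDL types are assumed to be available to TypeScript.

```ts
// Sketch: emulate softmax along an arbitrary axis of an N-D operand using
// the 2-D softmax(), transpose() and reshape() builder methods.
// `inputShape` must match `input`'s shape.
function softmaxAlongAxis(
  builder: MLGraphBuilder,
  input: MLOperand,
  inputShape: number[],
  axis: number
): MLOperand {
  const rank = inputShape.length;

  // 1. Transpose so the softmax axis becomes the innermost dimension.
  const permutation = [...Array(rank).keys()].filter((d) => d !== axis);
  permutation.push(axis);
  const moved = builder.transpose(input, { permutation });

  // 2. Flatten all other dimensions into a single batch dimension.
  const axisSize = inputShape[axis];
  const batchSize = inputShape.reduce((p, d, i) => (i === axis ? p : p * d), 1);
  const flat = builder.reshape(moved, [batchSize, axisSize]);

  // 3. Apply the existing 2-D softmax over the last dimension.
  const probs = builder.softmax(flat);

  // 4. Undo the reshape, then apply the inverse permutation.
  const movedShape = permutation.map((d) => inputShape[d]);
  const restored = builder.reshape(probs, movedShape);
  const inverse: number[] = new Array(rank);
  permutation.forEach((src, dst) => { inverse[src] = dst; });
  return builder.transpose(restored, { permutation: inverse });
}
```

Whether this dance lives in every framework or once in the backend is exactly the trade-off the issue describes; the adopted fix (the axis argument added by the commits above) removes the need for it at the call site.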