[v1.x] ONNX Fixes for some NLP models #19973
Conversation
Zha0q1 commented Mar 3, 2021
- fix the missing output issue when the output is in the initializer
- fix shape issue in zeros_ and ones_
- fix one_hot dtype issue
- fix the issue where scalar_op_helper does not flatten multi-dimensional tensors
- fix leaky_relu
Hey @Zha0q1, thanks for submitting the PR.
CI supported jobs: [sanity, windows-cpu, unix-cpu, centos-gpu, centos-cpu, windows-gpu, clang, website, unix-gpu, edge, miscellaneous]
```diff
@@ -822,7 +822,7 @@ def convert_leakyrelu(node, **kwargs):
                                   inputs=input_nodes,
                                   outputs=[name],
                                   name=name)
-    elif act_type in ('gelu'):
+    elif act_type in ('gelu',):
```
Why do we add the `,` here?
Otherwise this is not a tuple but just the string `'gelu'`, and `'elu' in 'gelu'` is `True` (Python does a substring check on strings), so we would never reach the `else` branch when `act_type` is `elu`.
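A quick sketch (not from the PR, just standard Python semantics) illustrating why the trailing comma changes the membership test:

```python
# Membership against a bare string does substring matching,
# so 'elu' would incorrectly take the 'gelu' branch.
act_type = 'elu'

print(act_type in ('gelu'))    # True  -- ('gelu') is just the string 'gelu'
print(act_type in ('gelu',))   # False -- ('gelu',) is a one-element tuple
```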
Should we use a list for this instead? Why a tuple?
`elif act_type in ['gelu']:`
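For reference, a single-element list behaves the same as a single-element tuple in a membership test, so either form would fix the bug; the choice is purely stylistic (sketch, not from the PR):

```python
act_type = 'elu'

print(act_type in ('gelu',))   # False -- one-element tuple, needs the trailing comma
print(act_type in ['gelu'])    # False -- one-element list, no trailing comma required
```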
LGTM, thanks!