Bidirectional LSTM, activation functions are not converted #2211
Labels: bug (An unexpected problem or unintended behavior)

quentin-baduel-numalis added the bug label on Jul 25, 2023
MaximeChurin pushed commits to MaximeChurin/tensorflow-onnx that referenced this issue on Sep 11, 2023.

Thanks for the lead, I had the same issue. I have a PR ongoing that should fix it: #2234

MaximeChurin pushed further commits to MaximeChurin/tensorflow-onnx that referenced this issue on Oct 11 and Oct 12, 2023.
Describe the bug
Hello, thank you for developing this tool!
When converting a Keras model containing a Bidirectional LSTM layer, the activation function attribute (activations) is not set on the corresponding ONNX LSTM operator.
The bug does not seem to appear outside the Bidirectional context.
System information
To Reproduce
1: Create the Keras network with the activation functions set to "relu"
2: Convert the model
3: Check the converted model
Inspecting the resulting ONNX model shows that the "activations" attribute of the LSTM node is not set, so it defaults to "sigmoid/tanh" rather than the expected "relu" functions.
Screenshots