Keep activations in bidirectional LSTM (fixes: https://github.com/onnx/tensorflow-onnx/issues/2211) #2234
Conversation
Force-pushed from 347d629 to fbba418
Thanks for your contributions. Yes, please follow the way in bigru_rewriter.py to read activations from those LSTMs.
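For illustration, here is a minimal sketch of that approach. It assumes tf2onnx-style nodes whose get_attr() returns an ONNX AttributeProto; the helper name and variables are hypothetical, not the rewriter's actual code:

```python
def merge_activations(lstm_fw, lstm_bw, attr):
    """Copy per-direction activations onto the merged bidirectional node.

    Hypothetical sketch: assumes get_attr() returns an ONNX
    AttributeProto whose .strings field holds the activation names
    as bytes.
    """
    fw_attr = lstm_fw.get_attr("activations")
    bw_attr = lstm_bw.get_attr("activations")
    # The ONNX LSTM spec expects the forward activation triple first,
    # then the backward one, when direction="bidirectional".
    attr["activations"] = [a.decode("utf-8") for a in fw_attr.strings] + \
                          [a.decode("utf-8") for a in bw_attr.strings]
```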
Force-pushed from 0dc27ce to 6e23f3f
Thanks @fatcat-z!
The CI failures are caused by tf 1.15. For compatibility, we might need to check if they are None before we call the decode() method.
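A sketch of that guard, reusing the hypothetical names from the snippet above; when either attribute is absent we simply leave it unset:

```python
fw_attr = lstm_fw.get_attr("activations")
bw_attr = lstm_bw.get_attr("activations")
# Under tf 1.15 the attribute may be missing, so get_attr() can return
# None; only call decode() when both directions carry activations.
if fw_attr is not None and bw_attr is not None:
    attr["activations"] = [a.decode("utf-8") for a in fw_attr.strings] + \
                          [a.decode("utf-8") for a in bw_attr.strings]
# Otherwise leave the attribute unset; ONNX then applies its defaults
# (Sigmoid, Tanh, Tanh per direction).
```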
Force-pushed from 881c3b3 to c1115d0
Good catch @fatcat-z. I made the change and the tests passed, but now it differs a bit from the bigru implementation. If you have any suggestions, please make them via a GitHub code review suggestion.
* Support ResizeArea op.
  Signed-off-by: Jay Zhang <jiz@microsoft.com>
  Signed-off-by: Me <me@example.com>
Force-pushed from 0f2d82b to 01ea598
LGTM, thanks for your contributions!
The forward and backward LSTMs have their correct activations, but as soon as we rewrite them into a bidirectional LSTM this attribute is skipped and we get the default activations.
I am not sure how to write it in the rewriter file; you have something similar in the bigru rewriter, so let me know if you prefer the syntax with decode() and a list comprehension.
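For reference, a hedged illustration with plain onnx.helper (not the rewriter's code) of how the merged node's activations are laid out; omitting the attribute is what yields the defaults described above. The input/output names and hidden size are placeholders:

```python
from onnx import helper

# Illustrative only: a bidirectional ONNX LSTM node takes 2 * 3
# activation names, forward triple first, then backward.
node = helper.make_node(
    "LSTM",
    inputs=["X", "W", "R"],
    outputs=["Y"],
    direction="bidirectional",
    hidden_size=128,
    activations=["HardSigmoid", "Tanh", "Tanh",
                 "HardSigmoid", "Tanh", "Tanh"],
)
# Without the activations kwarg, ONNX runtimes fall back to the default
# Sigmoid/Tanh/Tanh for both directions.
```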