
Keep activations in bidirectional LSTM (fixes: https://github.com/onnx/tensorflow-onnx/issues/2211) #2234

Merged

Conversation

MaximeChurin
Contributor

The forward and backward LSTMs have their correct activations, but as soon as we rewrite them into a bidirectional LSTM this attribute is skipped and we get the default activations.

I am not sure how to write it in the rewriter file; you have something similar in the bigru rewriter, so let me know if you prefer the syntax with decode and a list comprehension.
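The fix described above can be sketched roughly as follows. This is a minimal sketch, not the actual tf2onnx code: plain dicts stand in for node attributes, and the attribute access is a hypothetical helper. ONNX stores activation names as bytes in the attribute, hence the decode(), and a bidirectional LSTM lists the forward activations first, then the backward ones.

```python
# Sketch: carry per-direction activations over to the bidirectional LSTM.
# Plain dicts stand in for tf2onnx node attributes (an assumption, not the
# real rewriter API). ONNX keeps activation names as bytes, so each entry
# must be decoded; the bidirectional node expects forward activations
# followed by backward activations.

def merged_activations(fw_attrs, bw_attrs):
    """Combine forward and backward activation lists for a bidirectional LSTM."""
    fw = [a.decode("utf-8") for a in fw_attrs["activations"]]
    bw = [a.decode("utf-8") for a in bw_attrs["activations"]]
    return fw + bw

print(merged_activations(
    {"activations": [b"Sigmoid", b"Relu", b"Tanh"]},
    {"activations": [b"Sigmoid", b"Tanh", b"Tanh"]},
))
# ['Sigmoid', 'Relu', 'Tanh', 'Sigmoid', 'Tanh', 'Tanh']
```

Without this merge step, the rewritten bidirectional node falls back to the ONNX defaults (Sigmoid, Tanh, Tanh per direction), which is exactly the bug reported in issue 2211.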

@MaximeChurin
Contributor Author

@fatcat-z

@fatcat-z
Collaborator

Thanks for your contributions.

Yes, please follow the approach in bigru_rewriter.py to read the activations from those LSTMs.

@MaximeChurin MaximeChurin force-pushed the keep_activations_in_bidirectional_lstm branch from 0dc27ce to 6e23f3f on October 11, 2023 08:15
@MaximeChurin
Contributor Author

Thanks @fatcat-z.
It should be OK now, let me know.

@fatcat-z
Collaborator

The CI failures are caused by tf 1.15. For compatibility, we might need to check if they are None before we call the decode() method.
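The compatibility check suggested here might look something like this. Again a sketch under the same assumptions as above (dict-based attributes, hypothetical helper names): in graphs produced by tf 1.15 the activations attribute can be absent, so the code falls back to the ONNX LSTM defaults instead of calling decode() on None.

```python
# Sketch: guard the decode() call for tf 1.15 compatibility.
# If the activations attribute is missing (None), fall back to the ONNX
# LSTM defaults rather than crashing. Dict-based attribute access is an
# assumption standing in for the real tf2onnx node API.

DEFAULT_ACTIVATIONS = [b"Sigmoid", b"Tanh", b"Tanh"]

def activations_or_default(attrs):
    """Return decoded activation names, defaulting when the attribute is absent."""
    raw = attrs.get("activations")  # list of bytes, or None under tf 1.15
    if raw is None:
        raw = DEFAULT_ACTIVATIONS
    return [a.decode("utf-8") for a in raw]
```

This keeps the behaviour identical for graphs that do carry the attribute, while older graphs simply get the defaults they would have had anyway.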

@MaximeChurin MaximeChurin force-pushed the keep_activations_in_bidirectional_lstm branch 2 times, most recently from 881c3b3 to c1115d0 on October 12, 2023 08:29
@MaximeChurin
Contributor Author

Good catch @fatcat-z, I made the change and the tests passed, but now it differs a bit from the bigru implementation. If you have any suggestions, please make them via a GitHub code review suggestion.

tests/test_lstm.py (review comment, outdated, resolved)
Me and others added 6 commits October 12, 2023 13:58

* Support ResizeArea op.
  Signed-off-by: Jay Zhang <jiz@microsoft.com>
* fix
  Signed-off-by: Me <me@example.com>
@MaximeChurin MaximeChurin force-pushed the keep_activations_in_bidirectional_lstm branch from 0f2d82b to 01ea598 on October 12, 2023 11:58
@fatcat-z
Collaborator

LGTM, thanks for your contributions!

@fatcat-z fatcat-z enabled auto-merge (squash) October 12, 2023 12:23
@fatcat-z fatcat-z merged commit ca17b3c into onnx:main Oct 12, 2023
41 of 51 checks passed
@MaximeChurin MaximeChurin deleted the keep_activations_in_bidirectional_lstm branch October 12, 2023 13:01