Enhance prune method for RNN #8062
Labels
预测 (Inference) — formerly named "Inference"; covers C-API inference issues, etc.
This was referenced Feb 5, 2018
While implementing the inference example for test_rnn_encoder_decoder, the prune() method raises an error during save_inference_model.
This is because prune() currently does not take RNN ops into account.
See some discussions here
Will work on this together with #8059
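To illustrate the failure mode: an RNN op carries a sub-block whose inner ops read variables produced at the top level; a prune pass that inspects only top-level inputs never marks those producers as needed and drops them. The sketch below is a hypothetical, simplified model (plain dicts for ops, not PaddlePaddle's actual `Program`/`prune` API) showing how accounting for sub-block inputs keeps the dependency chain intact.

```python
def prune(ops, targets):
    """Keep only ops whose outputs (transitively) feed the target variables.

    Each op is a dict with "inputs" and "outputs" lists; an RNN-like op may
    additionally carry a "sub_block" list of inner ops. (Hypothetical data
    model for illustration, not PaddlePaddle's real op representation.)
    """
    needed = set(targets)
    kept = []
    for op in reversed(ops):  # walk backward from the targets
        if needed & set(op["outputs"]):
            kept.append(op)
            needed.update(op["inputs"])
            # Key fix: variables read inside an op's sub-block (as in an
            # RNN op) must also be marked as needed, otherwise their
            # top-level producers get pruned away.
            for inner in op.get("sub_block", []):
                needed.update(inner["inputs"])
    return list(reversed(kept))


# A minimal RNN-shaped program: the sub-block reads "emb", which is
# produced by a top-level op but never appears as a top-level input.
ops = [
    {"inputs": ["x"], "outputs": ["emb"]},                  # e.g. lookup_table
    {"inputs": [], "outputs": ["h"],                        # e.g. RNN op
     "sub_block": [{"inputs": ["emb"], "outputs": ["h_t"]}]},
    {"inputs": ["h"], "outputs": ["y"]},                    # e.g. fc
]
pruned = prune(ops, targets=["y"])
# Without the sub_block step, the "emb" producer would be dropped here.
```

With the sub-block inputs folded in, all three ops survive pruning; a top-level-only prune would silently drop the first op and break inference after save_inference_model.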