
Enhance prune method for RNN #8062

Closed
kexinzhao opened this issue Feb 2, 2018 · 0 comments · Fixed by #8176
Labels
预测 (Inference) — originally named "Inference"; covers C-API inference issues, etc.
Comments

kexinzhao commented Feb 2, 2018

While implementing the inference example for test_rnn_encoder_decoder, I found that the prune() method raises an error when save_inference_model is called. This is because prune() currently does not take RNN ops into account.

See some discussions here

Will work on this together with #8059
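For context, a minimal sketch of the call path that hits prune(): save_inference_model prunes the main program down to the ops needed to compute the given target variables, and that pruning step is where the error appears for programs containing RNN (while) ops. The package path, variable names, and output directory below are assumptions for illustration, not the exact code in test_rnn_encoder_decoder.

```python
# Sketch only: the model-building code is elided, and the variable names,
# output path, and package layout (paddle.v2.fluid, as used in early 2018)
# are assumptions for illustration.
import paddle.v2.fluid as fluid

# ... build an encoder-decoder network containing RNN (while) ops, producing
#     an input variable named 'src_word_id' and an output variable `prediction` ...

place = fluid.CPUPlace()
exe = fluid.Executor(place)
exe.run(fluid.default_startup_program())

# save_inference_model prunes the main program so that only the ops needed to
# compute target_vars are kept; with RNN ops in the program this step fails.
fluid.io.save_inference_model(
    dirname="rnn_encoder_decoder.inference.model",  # assumed output directory
    feeded_var_names=["src_word_id"],                # assumed feed variable name
    target_vars=[prediction],                        # assumed fetch variable
    executor=exe)
```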
