More and more products and projects now need to run deep learning model inference on mobile devices. If paddle offers enough advantages on mobile, it could replace the inference modules in some of our products that currently use other frameworks. Since compute resources on phones are very limited, the performance requirements can be quite strict. Can paddle meet the following requirements on mobile?
1. The packaged size of the whole framework on the phone is around 1MB.
2. During on-device inference with a 224x224 input image, memory consumption stays under 50MB (for mainstream models).
3. As for on-device inference speed, since compute resources vary widely across phones, the faster the better.
Finally, wishing PaddlePaddle continued success :)
The problem mentioned here has been solved, so I am closing this issue.