English | 简体中文
After training a model with PaddleSeg, you can also export it in ONNX format. This tutorial walks through an example.
For the complete guide to exporting ONNX models, please refer to Paddle2ONNX.
Refer to the model export document to export the inference model and save it to the output folder, which contains the following files.
```
./output
  ├── deploy.yaml            # deployment-related configuration file
  ├── model.pdmodel          # topology file of the inference model
  ├── model.pdiparams        # weight file of the inference model
  └── model.pdiparams.info   # additional information, usually can be ignored
```
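Before converting to ONNX, you can optionally confirm that the exported inference model loads and runs. The following is a minimal sketch using the Paddle Inference Python API; the input shape `(1, 3, 512, 512)` is an assumption and should be adjusted to your model.

```python
# Sanity-check the exported inference model with Paddle Inference.
# The input shape (1, 3, 512, 512) is an assumption; adjust it to your model.
import numpy as np
from paddle.inference import Config, create_predictor

config = Config("output/model.pdmodel", "output/model.pdiparams")
predictor = create_predictor(config)

# Feed a random input tensor.
input_name = predictor.get_input_names()[0]
input_handle = predictor.get_input_handle(input_name)
input_handle.copy_from_cpu(np.random.rand(1, 3, 512, 512).astype("float32"))

predictor.run()

# Fetch and inspect the output.
output_name = predictor.get_output_names()[0]
output = predictor.get_output_handle(output_name).copy_to_cpu()
print("output shape:", output.shape)
```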
Install Paddle2ONNX (version 0.6 or higher).
```
pip install paddle2onnx
```
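If you want to confirm that the installed version meets the 0.6 requirement, a quick check from Python (a sketch using the standard package metadata, not a Paddle2ONNX-specific API) is:

```python
# Print the installed Paddle2ONNX version; it should be 0.6 or higher.
from importlib.metadata import version

print(version("paddle2onnx"))
```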
Run the following command to convert the inference model in the output folder to an ONNX model with Paddle2ONNX.
```
paddle2onnx --model_dir output \
            --model_filename model.pdmodel \
            --params_filename model.pdiparams \
            --opset_version 11 \
            --save_file output.onnx
```
The exported ONNX model is saved as the output.onnx file.
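To verify that the conversion succeeded, you can validate output.onnx and run a dummy inference with ONNX Runtime. The sketch below assumes the `onnx` and `onnxruntime` packages are installed (e.g. `pip install onnx onnxruntime`); the input shape `(1, 3, 512, 512)` is an assumption and should match your model.

```python
# Validate the exported ONNX model and run a dummy inference with ONNX Runtime.
# The input shape (1, 3, 512, 512) is an assumption; adjust it to your model.
import numpy as np
import onnx
import onnxruntime as ort

# Structural check of the ONNX graph.
onnx_model = onnx.load("output.onnx")
onnx.checker.check_model(onnx_model)

# Run inference on random data with the CPU execution provider.
sess = ort.InferenceSession("output.onnx", providers=["CPUExecutionProvider"])
input_name = sess.get_inputs()[0].name
dummy_input = np.random.rand(1, 3, 512, 512).astype("float32")
outputs = sess.run(None, {input_name: dummy_input})
print("output shape:", outputs[0].shape)
```

A stronger check is to feed the same input to both the Paddle inference model and the ONNX model and compare their outputs.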