
Fix inconsistent padding scheme between onnx and pytorch #41

Open · wants to merge 1 commit into master
Conversation

maybeLee commented:

The padding scheme of torch.nn.functional.pad differs from ONNX's Pad operator when the pads parameter contains eight values (i.e. padding for a 4-D input).
Given an input of size (1, 3, 10, 10) and pads=(1, 1, 2, 2, 3, 3, 4, 4), F.pad produces an output of size (9, 9, 14, 12) (as asserted in your test scripts), whereas ONNX produces an output of size (5, 7, 16, 16), following its documentation.
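
For reference, the two conventions can be compared directly; the snippet below only restates the shapes above and is not part of the fix:

```python
import torch
import torch.nn.functional as F

x = torch.zeros(1, 3, 10, 10)
pads = (1, 1, 2, 2, 3, 3, 4, 4)

# torch.nn.functional.pad reads the tuple from the last dimension inward:
# (last_begin, last_end, second_to_last_begin, second_to_last_end, ...)
print(F.pad(x, pads).shape)  # torch.Size([9, 9, 14, 12])

# ONNX Pad reads the same list as
# (x1_begin, x2_begin, ..., xn_begin, x1_end, x2_end, ..., xn_end),
# which for this input yields shape (5, 7, 16, 16).
```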

Therefore, the pads parameter loaded from onnx_model.graph needs to be reordered into PyTorch's convention so that the padded output has the correct size, as sketched below.
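
One way to express that reordering (the helper below is an illustrative sketch, not the code in this PR):

```python
def onnx_pads_to_torch(pads):
    """Reorder an ONNX pads list [x1_begin, ..., xn_begin, x1_end, ..., xn_end]
    into the last-dimension-first tuple expected by torch.nn.functional.pad."""
    n = len(pads) // 2
    begins, ends = pads[:n], pads[n:]
    torch_pads = []
    for begin, end in zip(reversed(begins), reversed(ends)):
        torch_pads.extend([begin, end])
    return tuple(torch_pads)

# onnx_pads_to_torch([1, 1, 2, 2, 3, 3, 4, 4]) -> (2, 4, 2, 4, 1, 3, 1, 3);
# F.pad with that tuple reproduces ONNX's (5, 7, 16, 16) output shape.
```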

Unfortunately, I found that the pads parameter is stored in onnx_model.graph.initializer rather than on the Pad node itself, so a simple preprocessing of the Pad node's attributes is not feasible :(.

So I had to add an extra branch (which is ugly...) when loading initializer parameters: if the initializer feeds a Pad node, we check whether its pads parameter needs to be reordered.
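
A rough, hypothetical sketch of what that branch looks like (function and variable names here are illustrative; the actual loading code in onnx2pytorch is structured differently):

```python
from onnx import numpy_helper

def load_initializers(onnx_model):
    """Hypothetical sketch of the extra branch when loading initializers."""
    # Names of the tensors that feed the second ("pads") input of any Pad node.
    pad_param_names = {
        node.input[1]
        for node in onnx_model.graph.node
        if node.op_type == "Pad" and len(node.input) > 1
    }

    params = {}
    for init in onnx_model.graph.initializer:
        value = numpy_helper.to_array(init)
        if init.name in pad_param_names:
            # Reorder ONNX begin/end pairs into torch's last-dim-first layout
            # (see onnx_pads_to_torch above) before the value reaches F.pad.
            value = onnx_pads_to_torch(value.tolist())
        params[init.name] = value
    return params
```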

I wrote a program to exhibit this bug.
It shows that the correct output shape should be (batch_size, 226, 226, 3), but ONNX2PyTorch outputs (batch_size+1, 225, 225, 3).
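
The original snippet is not reproduced here; a stand-in along the following lines (the single-Pad model and the pads values are assumptions, not the original program) shows the underlying mismatch by comparing onnxruntime's output shape with a naive F.pad call that uses the same, unreordered pads list:

```python
import numpy as np
import onnxruntime as ort
import torch
import torch.nn.functional as F
from onnx import TensorProto, helper, numpy_helper

# Assumed pads: pad H and W of an NHWC input by 1 on each side.
pads = np.array([0, 1, 1, 0, 0, 1, 1, 0], dtype=np.int64)
x = np.zeros((1, 224, 224, 3), dtype=np.float32)

graph = helper.make_graph(
    [helper.make_node("Pad", inputs=["x", "pads"], outputs=["y"], mode="constant")],
    "pad_repro",
    inputs=[helper.make_tensor_value_info("x", TensorProto.FLOAT, list(x.shape))],
    outputs=[helper.make_tensor_value_info("y", TensorProto.FLOAT, [1, 226, 226, 3])],
    initializer=[numpy_helper.from_array(pads, name="pads")],
)
model = helper.make_model(graph, opset_imports=[helper.make_opsetid("", 13)])

session = ort.InferenceSession(model.SerializeToString())
print(session.run(None, {"x": x})[0].shape)  # (1, 226, 226, 3) -- ONNX semantics

# Feeding the raw pads list straight into F.pad, as the unfixed converter
# effectively does, gives a different and wrong shape.
print(F.pad(torch.from_numpy(x), tuple(pads.tolist())).shape)
```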

The current fix passes all existing tests.
