
Add padding for convTranspose2d ops of fast style transfer example #208

Merged
1 commit merged on Mar 27, 2024

Conversation

huningxin
Contributor

According to the ONNX ConvTranspose op spec [1], the output shape can be set explicitly, which causes the pads values to be auto-generated. The previous code missed this padding, which caused a WebNN output size validation error.

/cc @miaobin

[1]: https://onnx.ai/onnx/operators/onnx__ConvTranspose.html
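For context, the pads computation the fix relies on can be sketched as below. This is a minimal illustration of the ONNX rule for a single spatial axis when `output_shape` is given; the function name is made up for this sketch, the default (non-`SAME_UPPER`) padding split is assumed, and this is not the sample's actual code:

```javascript
// Sketch of the ONNX ConvTranspose rule: when output_shape is specified,
// the total padding per spatial axis is derived from the other attributes
// and split between the begin and end sides. Illustrative code only.
function computeConvTransposePads(
    inputSize, kernelSize, stride, dilation, outputPadding, outputShape) {
  const totalPadding =
      stride * (inputSize - 1) + outputPadding +
      ((kernelSize - 1) * dilation + 1) - outputShape;
  // Default split (auto_pad != SAME_UPPER): the extra unit, if any,
  // goes to the end side.
  const padBegin = Math.floor(totalPadding / 2);
  const padEnd = totalPadding - padBegin;
  return [padBegin, padEnd];
}

// Example: upsampling 112 -> 224 with a 3x3 kernel, stride 2,
// dilation 1, output_padding 1:
// total = 2*111 + 1 + 3 - 224 = 2, so pads are [1, 1].
```

Without these pads being passed through to `convTranspose2d`, WebNN computes a larger output than the explicitly requested shape, which is what triggered the validation error.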
@huningxin huningxin requested a review from Honry March 27, 2024 02:54
@miaobin
Contributor

miaobin commented Mar 27, 2024

LGTM, thanks for the fix!

Collaborator

@Honry Honry left a comment


LGTM, thanks!

@Honry Honry merged commit abbdf25 into webmachinelearning:master Mar 27, 2024
3 checks passed
Honry pushed a commit to Honry/webnn-samples that referenced this pull request May 15, 2024
3 participants