If model uses initializer, fuse_bn_into_conv is broken from v0.3.9 #133

Open
sa-kei728 opened this issue Apr 18, 2023 · 5 comments

Comments

@sa-kei728

I tried to optimize a model with "fuse_bn_into_conv", and from v0.3.9 the optimized model is generated incorrectly.
I would say 807cff7 is the cause: that commit works fine when the weights come from a Constant node, but it does not handle initializers...
Why were these changes made?
I'd like initializers to be supported as well, since onnxoptimizer provides "extract_constant_to_initializer", which I often use to avoid errors related to topological sort (a sketch of how I chain the passes is below). At the very least, I'd rather not have this change.
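
For reference, a minimal sketch of how I chain the two passes (the model path is just a placeholder, not my real script). After extract_constant_to_initializer runs, the weights that used to be Constant nodes end up in graph.initializer, which is exactly the case fuse_bn_into_conv now misses:

import onnx
from onnxoptimizer import optimize

model = onnx.load("some_model.onnx")  # placeholder path

# extract_constant_to_initializer converts Constant nodes into initializers,
# so any later pass (such as fuse_bn_into_conv) has to be able to read initializers.
passes = ["extract_constant_to_initializer", "fuse_bn_into_conv"]
optimized = optimize(model, passes)
onnx.save(optimized, "some_model_opt.onnx")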

Confirmed Environment (part of pyproject.toml)

[tool.poetry.dependencies]
python = "3.11"
onnx = "1.13.1"
onnxoptimizer = "0.3.9"

Reproduction Steps

  1. Download tiny-yolov3.onnx (this model uses initializers)
    https://github.com/onnx/models/blob/main/vision/object_detection_segmentation/tiny-yolov3/model/tiny-yolov3-11.onnx

  2. Install onnx and onnxoptimizer (versions as above)

  3. Run the reproduction code below:

import onnx
from onnxoptimizer import optimize

# load model
model_path = "tiny-yolov3-11.onnx"
model = onnx.load(model_path)

# setting optimize passes
optimization_passes = ["fuse_bn_into_conv"]

# execute optimization
optimized_model = optimize(model, optimization_passes)

# save model
optimized_model_path = "tiny-yolov3-11-optimized.onnx"
onnx.save(optimized_model, optimized_model_path)
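
One way to check whether the pass actually fused anything is to count the BatchNormalization nodes before and after (a small addition on top of the script above):

# Sanity check appended to the reproduction script: a successful fuse_bn_into_conv
# run should remove the BatchNormalization nodes that follow Conv nodes.
def count_bn(m):
    return sum(1 for n in m.graph.node if n.op_type == "BatchNormalization")

print("BatchNormalization before:", count_bn(model))
print("BatchNormalization after: ", count_bn(optimized_model))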

In case of v0.3.9:

[screenshot: optimized graph with v0.3.9]

In case of v0.3.8:

[screenshot: optimized graph with v0.3.8]

@daquexian
Member

Thanks for your issue! We'll consider how to fix it. Could you please try using onnxsim for now?
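
Something along these lines (just a sketch; depending on your onnxsim version you may need to provide input shapes for this model):

import onnx
from onnxsim import simplify

# onnxsim performs whole-graph simplification (constant folding and related optimizations).
model = onnx.load("tiny-yolov3-11.onnx")
model_simp, check_ok = simplify(model)
assert check_ok, "simplified model failed onnxsim's validation check"
onnx.save(model_simp, "tiny-yolov3-11-simplified.onnx")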

@sa-kei728
Author

Thank you for the quick response!
I tried onnxsim on both the original and the optimized model. The correctly fused model is now created!

@sa-kei728
Author

I found another case.
If the Conv has a "bias" term, it seems that fuse_bn_into_conv fails from v0.3.9.
This case is not resolved by using onnxsim.

In general there is no such model, but the tool I was using generates an ONNX model that temporarily includes the bias term in the Conv and then runs fuse_bn_into_conv, which fails in the optimizer from v0.3.9 onward.
(I'd like to share this model and tool with you, but I cannot do so for some reasons. Sorry...)

There is a workaround for this case, so it's not a problem for now.
I'm commenting to share the information; a minimal sketch that reconstructs the Conv-with-bias pattern is below.
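
For reference, here is a minimal self-contained sketch of the pattern I mean (shapes, names, and opset are arbitrary choices, so it may not match my tool's model exactly): a Conv with a bias input followed by BatchNormalization, with all weights stored as initializers.

import numpy as np
import onnx
from onnx import TensorProto, helper, numpy_helper
from onnxoptimizer import optimize

C = 4  # arbitrary channel count for the sketch

def init(name, arr):
    return numpy_helper.from_array(arr.astype(np.float32), name)

initializers = [
    init("W", np.random.randn(C, C, 3, 3)),
    init("B", np.random.randn(C)),                 # the Conv bias term
    init("scale", np.random.randn(C)),
    init("bn_bias", np.random.randn(C)),
    init("mean", np.random.randn(C)),
    init("var", np.abs(np.random.randn(C)) + 1.0),
]

conv = helper.make_node("Conv", ["X", "W", "B"], ["conv_out"], pads=[1, 1, 1, 1])
bn = helper.make_node("BatchNormalization",
                      ["conv_out", "scale", "bn_bias", "mean", "var"], ["Y"])

graph = helper.make_graph(
    [conv, bn], "conv_bias_bn",
    [helper.make_tensor_value_info("X", TensorProto.FLOAT, [1, C, 8, 8])],
    [helper.make_tensor_value_info("Y", TensorProto.FLOAT, [1, C, 8, 8])],
    initializer=initializers,
)
model = helper.make_model(graph, opset_imports=[helper.make_opsetid("", 13)])
onnx.checker.check_model(model)

fused = optimize(model, ["fuse_bn_into_conv"])
print([n.op_type for n in fused.graph.node])  # a correct fusion leaves only ["Conv"]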

@AndreyOrb

@sa-kei728 What is the workaround you used?

@sa-kei728
Author

@AndreyOrb
In my case, I pinned onnxoptimizer to v0.3.8.

I would say onnx2json can also resolve this issue.
You may be able to work around it by removing the bias term from the target Conv node and appending an Add node after the Conv (a rough sketch is below).
(However, I haven't tried it, so I can't guarantee it works.)
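
For reference, this is roughly the graph surgery I mean (untried, as noted; it assumes the bias is stored as an initializer and the tensors are NCHW, and the file names are placeholders):

import onnx
from onnx import helper, numpy_helper

model = onnx.load("model_with_conv_bias.onnx")  # placeholder file name
graph = model.graph

inserted = 0
for idx, node in enumerate(list(graph.node)):
    if node.op_type != "Conv" or len(node.input) < 3:
        continue

    bias_name = node.input[2]
    final_out = node.output[0]

    # The Conv loses its bias input and writes to an intermediate tensor instead.
    del node.input[2]
    node.output[0] = final_out + "_nobias"

    # Reshape the 1-D bias to (1, C, 1, 1) so the Add broadcasts over an NCHW tensor.
    bias_init = next(i for i in graph.initializer if i.name == bias_name)
    bias = numpy_helper.to_array(bias_init).reshape(1, -1, 1, 1)
    graph.initializer.remove(bias_init)
    graph.initializer.append(numpy_helper.from_array(bias, bias_name))

    # Insert the Add right after the Conv so the node list stays topologically sorted.
    add = helper.make_node("Add", [node.output[0], bias_name], [final_out])
    graph.node.insert(idx + inserted + 1, add)
    inserted += 1

onnx.save(model, "model_bias_as_add.onnx")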
