
ONNX importer ModelParams bug #4521

Closed

anwang2009 opened this issue Dec 14, 2019 · 4 comments

Comments

@anwang2009
Contributor

Using the ONNX importer on mobilenetv2-1.0 with opset 7, relay_params contains an extra parameter that does not appear among the parameters of relay_model's main function.

Here's a repro of the bug:

import tempfile
import urllib.request

import onnx
from tvm.relay.frontend import from_onnx

def onnx_workload():
    # Download mobilenetv2-1.0 from the ONNX model zoo into a temp file.
    tmpfile = tempfile.NamedTemporaryFile('w+b')
    mobilenet_url = 'https://s3.amazonaws.com/onnx-model-zoo/mobilenet/mobilenetv2-1.0/mobilenetv2-1.0.onnx'
    urllib.request.urlretrieve(mobilenet_url, tmpfile.name)
    tmpfile.seek(0)
    return tmpfile

onnx_model = onnx.load(onnx_workload())
input_shape = {"data": [1, 3, 224, 224]}
relay_model, relay_params = from_onnx(onnx_model, shape=input_shape)

# Collect the free variables of the module's main function by name.
var_map = {}
for fn_param in relay_model['main'].params:
    var_map[fn_param.name_hint] = fn_param

# reshape_attr_tensor421 is in params but not in the module
print("reshape_attr_tensor421 is in relay_params:", "reshape_attr_tensor421" in relay_params)
print("reshape_attr_tensor421 is in relay_model:", var_map.get("reshape_attr_tensor421") is not None)
@anwang2009
Contributor Author

anwang2009 commented Dec 14, 2019

Poking around a little, it looks like reshape_attr_tensor421 persists in self._nodes and self._params for the entire duration of from_onnx.

@jroesch @jwfromm

@jwfromm
Contributor

jwfromm commented Dec 14, 2019

This is strange and I'll try to figure out where that parameter is coming from. Just curious though, does this have any negative effect besides having to keep one extra (size 2) parameter around?

@jwfromm
Contributor

jwfromm commented Dec 14, 2019

OK, so here's what's happening: ONNX treats the target shape of a Reshape operator as an input tensor, which naturally gets added to the dictionary of global parameters during conversion. In Relay, however, the output shape of reshape is fixed, so the parameter isn't needed. Instead of just peeking at the shape and leaving it in the parameter dictionary, we can pop it out, like so (line 466 in onnx.py):

shape = tuple(params.pop(inputs[1].name_hint).asnumpy())

If we think this is a bug worth fixing (or even technically a bug at all), one of us can open a one-line fix PR.
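To see why the pop matters, here is a minimal sketch of the two behaviors, using plain Python dicts to stand in for the importer's parameter table (the `convert_reshape_*` helpers are hypothetical illustrations, not TVM API):

```python
def convert_reshape_leaky(inputs, params):
    """Old behavior: read the shape but leave it in `params`.

    The shape constant stays behind as an orphan parameter that no
    variable in the converted function ever references.
    """
    shape = tuple(params[inputs[1]])
    return shape

def convert_reshape_fixed(inputs, params):
    """Fixed behavior: pop the shape so it is consumed during conversion."""
    shape = tuple(params.pop(inputs[1]))
    return shape

# Stand-in for the importer's global parameter dict after reading the graph.
params = {
    "conv_weight": [0.0, 0.0, 0.0],
    "reshape_attr_tensor421": [1, -1],  # Reshape's target shape, size 2
}

# Leaky version: the shape tensor lingers in the parameter dict.
leaky = dict(params)
convert_reshape_leaky(["x", "reshape_attr_tensor421"], leaky)
assert "reshape_attr_tensor421" in leaky

# Fixed version: the shape tensor is removed once its value is used.
fixed = dict(params)
shape = convert_reshape_fixed(["x", "reshape_attr_tensor421"], fixed)
assert "reshape_attr_tensor421" not in fixed
assert shape == (1, -1)
```

With the fix, `relay_params` ends up containing exactly the parameters that appear as free variables of the converted function, which is what the repro above checks for.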

@jwfromm
Contributor

jwfromm commented Dec 15, 2019

Fixed in PR #4524

@tqchen tqchen closed this as completed Dec 15, 2019