[Relay][Frontend] Keras Support #2336
Conversation
test_forward_multi_outputs()
test_forward_reuse_layers()
test_forward_rnn()
Can we remove this line?
Jenkinsfile
@@ -236,6 +236,17 @@ stage('Integration Test') {
      }
    }
  },
  'frontend: GPU': {
@tqchen please check the Jenkinsfile change to add relay frontend tests. Do we have to merge it so that it can take effect?
Should we put all relay frontend tests in tests/python/frontend? I see mxnet tests in both tests/python/frontend and tests/python/relay/frontend.
if pad_t == pad_b and pad_l == pad_r:
    params['padding'] = (pad_t, pad_l)
else:
    inexpr = _op.nn.pad(data=inexpr, pad_width=(
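For context, the truncated fallback above pads the input explicitly before the convolution. A minimal sketch of that pattern, assuming NCHW layout (the helper wrapper and its name are illustrative, not the PR's exact code):

```python
from tvm.relay import op as _op

def _explicit_pad(inexpr, pad_t, pad_l, pad_b, pad_r):
    # Assumed NCHW layout: pad only the two spatial axes and
    # leave the batch and channel axes untouched.
    return _op.nn.pad(data=inexpr, pad_width=(
        (0, 0),           # batch
        (0, 0),           # channel
        (pad_t, pad_b),   # height: top / bottom
        (pad_l, pad_r)))  # width: left / right
```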
Can we extend convolution's padding param into 4D [top, left, bottom, right] so that we can avoid inserting pad? We have done it for pooling; see line 315 of this file.
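A rough sketch of that proposal (the 4-element padding attribute shown here is hypothetical, not the current conv2d API):

```python
def _set_conv_padding_4d(params, pad_t, pad_l, pad_b, pad_r):
    # Hypothetical: pass asymmetric padding straight through as
    # [top, left, bottom, right], instead of inserting an explicit
    # nn.pad in front of the convolution.
    params['padding'] = (pad_t, pad_l, pad_b, pad_r)
    return params
```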
Sure, this is doable. It would be great if someone interested could contribute it.
I am worried about the compatibility too. NNVM and our Relay frontend use 2D padding.
Moreover, the x86 backend assumes padding is 2D too, i.e. only pad_h and pad_w. So this is great, but it seems there are many places where we should be careful.
Looks good to me, great work!
Thanks to @Huyuwei @kazum @jroesch @FrozenGene, this is merged. @FrozenGene let us discuss the additional padding option support as a separate thread.
@Huyuwei there is a test case error in master, which could be caused by our recent change of convention to const: http://ci.tvm.ai:8080/blue/organizations/jenkins/tvm/detail/master/577/pipeline #2349
The Keras -> Relay frontend is migrated from NNVM, with one main change: SymbolTable is replaced by ExprTable.
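A minimal sketch of the idea (class layout and method signatures here are illustrative, not the exact implementation):

```python
from tvm import relay

class ExprTable:
    """Map layer output names to Relay expressions, mirroring what
    NNVM's SymbolTable did with symbols."""
    def __init__(self):
        self.exprs = {}    # name -> relay.Expr
        self.params = {}   # name -> weight value
        self.const_ctr = 0

    def new_const(self, value, shape=None, dtype="float32"):
        # Register a weight as a named variable plus its value.
        name = "_param_%d" % self.const_ctr
        self.const_ctr += 1
        self.params[name] = value
        self.exprs[name] = relay.var(name, shape=shape, dtype=dtype)
        return self.exprs[name]

    def get_expr(self, name):
        return self.exprs[name]

    def set_expr(self, name, expr):
        self.exprs[name] = expr
```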
@kazum @thefiddler welcome to review.