[External codegen] Add test cases for fused ops with manual annotation #4741
Conversation
};
...
Output ret;
if (auto conv_call = DetectFusedConv2DBiasReLU(call)) {
I am not sure we really want to handle fused ops from Relay in external codegen. This looks quite ad hoc to me; there could be countless combinations.
The idea is for it to serve as an example of handling fused ops inside external codegen. I assume the DNNL backend itself is not meant to be used in production; the purpose is to be a more realistic example than CodegenC, so I thought we should add an example of how to handle fused ops. I never intended to cover other fusion cases.
Since we are trying to be nice to new backend implementers (who might not be familiar with TVM internals) by adding convenient op-level annotation, a semi-automatic fusion mechanism, etc. for them, I don't think it is reasonable to expect them to figure out how to handle more complicated but common cases (like fusion) and everything else on their own. I hope this makes sense.
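For context, the op-level annotation mentioned here is just a per-op predicate on the Python side. A minimal sketch, assuming the tvm.ir.register_op_attr helper (the exact registration helper and predicate signature differ between TVM versions, and the names below are illustrative):

import tvm.ir

def _register_external_op_helper(op_name, supported=True):
    # Mark op_name as offloadable to the "dnnl" external codegen.
    @tvm.ir.register_op_attr(op_name, "target.dnnl")
    def _func_wrapper(expr):  # predicate over the call expression
        return supported
    return _func_wrapper

_register_external_op_helper("nn.conv2d")
_register_external_op_helper("nn.relu")
_register_external_op_helper("add")

Annotation at this granularity only marks single ops; expressing a fused chain such as conv2d+bias+relu still needs graph-level pattern handling, which is what this PR is about.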
Another usage scenario that I think is going to be common is translation from quantized Relay models. It would be great to add an example of translating QNN subgraphs to a backend implementation. Without one, it is not obvious how to go about it.
Since DNNL has quantization support and everyone can use it, it would serve as a good example and test case.
While I agree with you that it's fine to handle fusion in this DNNL codegen, I also agree with @zhiics that the current implementation is a bit too ad hoc, even if it's only used for demo purposes for now. As you have implemented, MKL-DNN uses set_post_ops to attach ops to be fused. I think this part could be more general. For example:
if call == "relu":
    visit(arg)
    if this->curr_layer == "conv2d":
        generate_post_ops(call)
    else:
        generate_a_layer(call)
In this way, the codegen is able to deal with all of the conv2d fusions MKL-DNN supports (conv2d, conv2d+add, conv2d+add+relu). We could still put heuristic pattern annotations in the annotator and improve it gradually; I like the one you made for conv2d+bias+relu in this PR, for instance.
Yeah, this is my minimal-effort way to detect only the pattern I care about. I will think about how to make it more general.
I can go ahead and implement this, but that would duplicate pattern-matching logic that I already have in my Python annotator. That sounds bad, and it would become a perfect example of the anti-pattern mentioned in the RFC below :)
I think I should close this one and wait for a better solution to be ready. I will wait for your input for now @comaniac @zhiics
https://discuss.tvm.ai/t/rfc-external-codegen-defining-composite-relay-operators/5470/
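(For concreteness, the pattern-matching logic referred to above amounts to something like the following on the Python side. The helper names are illustrative rather than the exact annotator code in this PR, and the C++ DetectFusedConv2DBiasReLU presumably repeats the same kind of check.)

import tvm
from tvm import relay

def _is_op(expr, name):
    # True if expr is a call to the Relay operator `name`.
    return (isinstance(expr, relay.Call)
            and isinstance(expr.op, tvm.ir.Op)
            and expr.op.name == name)

def is_conv2d_bias_relu(call):
    # Match the chain relu(add(conv2d(x, w), bias)) rooted at `call`.
    return (_is_op(call, "nn.relu")
            and _is_op(call.args[0], "add")
            and _is_op(call.args[0].args[0], "nn.conv2d"))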
Yeah, I had a brief discussion with @u99127 before. I will read the discussion more carefully, and we can probably continue from there and try to reach consensus on a design/implementation. Sorry for being late/slow; I am on vacation.
I can also leave the current dumb implementation as it is, with the understanding that
- This is a temporary solution
- It will serve as a concrete motivation and test case for validating a more general mechanism to be introduced
Trying to be a bit more clever by duplicating an entire state machine here does not seem worth it to me anymore. Either way I'm fine.
@zhiics I'm not trying to make the DNNL backend more feature complete. I want to add examples and test cases of typical usage scenarios that most backend implementers are likely to encounter. We discussed on the forum that fusion is already possible with manual annotation, but there is no example that demonstrates it. This PR fills that gap.
I added a link below where I clarified my intention. Hopefully this clears up some confusion.
Thanks for the PR. Overall it looks good to me, just some minor points. Please see the comments for details.
As #4771 has been merged, we can revisit this PR for DNNL fuse patterns.
Yes, I want to update this PR, but we don't have a way to hook
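For reference, with the MergeComposite pass from #4771, the conv2d+bias+relu chain could be declared as a composite pattern on the Python side instead of being re-detected inside the codegen. A rough sketch, following the pattern-table format as #4771 was merged (later TVM versions moved to dataflow patterns; names here are illustrative):

from tvm import relay

def make_conv_bias_relu_pattern():
    # Relay expression describing relu(add(conv2d(data, weight), bias)).
    data = relay.var("data")
    weight = relay.var("weight")
    bias = relay.var("bias")
    return relay.nn.relu(relay.add(relay.nn.conv2d(data, weight), bias))

# Matched subgraphs become composite functions tagged with this name, which
# the DNNL codegen could dispatch on instead of pattern matching by itself.
pattern_table = [("dnnl.conv_bias_relu", make_conv_bias_relu_pattern())]
merge_composite = relay.transform.MergeComposite(pattern_table)
# merge_composite is applied to an IRModule before annotation and
# partitioning, e.g. mod = merge_composite(mod).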
This PR contains
The result of partitioning mobilenet is dumped here.
Please review @zhiics @comaniac.