[oneDNN] Pool softmax and LRN access to cache optimized #32922
Conversation
LRN added - lint
Thanks for your contribution!
class LRNMKLDNNHandler : public platform::MKLDNNHandlerT<T, mkldnn::lrn_forward,
                                                          mkldnn::lrn_backward> {
 public:
  LRNMKLDNNHandler(const paddle::framework::ExecutionContext& ctx,
Suggested change:
-  LRNMKLDNNHandler(const paddle::framework::ExecutionContext& ctx,
+  LRNMKLDNNHandler(const framework::ExecutionContext& ctx,
ok
  const float k = ctx.Attr<float>("k");
  bool is_test = ctx.Attr<bool>("is_test");

  auto dims = paddle::framework::vectorize(input->dims());
Suggested change:
-  auto dims = paddle::framework::vectorize(input->dims());
+  auto dims = framework::vectorize(input->dims());
ok
  }
  }

  LRNMKLDNNHandler(const paddle::framework::ExecutionContext& ctx,
Suggested change:
-  LRNMKLDNNHandler(const paddle::framework::ExecutionContext& ctx,
+  LRNMKLDNNHandler(const framework::ExecutionContext& ctx,
ok
  }

  LRNMKLDNNHandler(const paddle::framework::ExecutionContext& ctx,
                   const platform::MKLDNNDeviceContext& dev_ctx,
Suggested change:
-                   const platform::MKLDNNDeviceContext& dev_ctx,
+                   const MKLDNNDeviceContext& dev_ctx,
Because you already have "using paddle::platform::MKLDNNDeviceContext;" on line 30.
ok
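As a minimal, self-contained illustration of the reviewer's point: with the using-declaration in scope, the shorter type name is enough. The stand-in namespace below only mimics the real Paddle layout, since the actual header is not reproduced here.

#include <cstdio>

// Stand-in for the real Paddle class, so the example compiles on its own.
namespace paddle { namespace platform { class MKLDNNDeviceContext {}; } }

// The handler file already contains this using-declaration (its "line 30"),
using paddle::platform::MKLDNNDeviceContext;

// so a parameter can be declared with the short name instead of the
// fully qualified paddle::platform::MKLDNNDeviceContext.
void Example(const MKLDNNDeviceContext& dev_ctx) { (void)dev_ctx; }

int main() {
  MKLDNNDeviceContext ctx;
  Example(ctx);
  std::puts("compiles with the short type name");
  return 0;
}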
  const float beta = ctx.Attr<float>("beta");
  const float k = ctx.Attr<float>("k");

  auto dims = paddle::framework::vectorize<int64_t>(in_x->dims());
Suggested change:
-  auto dims = paddle::framework::vectorize<int64_t>(in_x->dims());
+  auto dims = framework::vectorize<int64_t>(in_x->dims());
ok
LGTM
@luotao1 Could you please start your review?
PR types: Function optimization
PR changes: OPs
Describe:
This PR reduces some mutex locking and oneDNN cache lookups for three operators: pooling, softmax and LRN (more to come). The general idea is that, instead of relying on the cache to fetch the forward primitive descriptor (FWD_PD) when the backward primitive descriptor (BWD_PD) is being created, we recreate the FWD_PD from scratch inside the BWD_PD construction if it is not already in the cache.
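For illustration only, here is a minimal sketch of that idea written against the plain oneDNN 2.x C++ API rather than Paddle's handler classes; the function name make_lrn_bwd_pd and the parameter values in main are invented for the example. The point is that the backward primitive descriptor rebuilds its forward hint from the operator attributes instead of fetching a cached FWD_PD under a lock.

#include "dnnl.hpp"

using namespace dnnl;

// Build the LRN backward primitive descriptor without consulting any cache:
// the forward primitive descriptor (the "hint") is recreated from the same
// attributes, so no lock-protected cache lookup is required.
lrn_backward::primitive_desc make_lrn_bwd_pd(const engine& eng,
                                             const memory::desc& src_md,
                                             const memory::desc& diff_md,
                                             memory::dim local_size,
                                             float alpha, float beta, float k) {
  // Recreate FWD_PD from scratch (previously it would be fetched from the
  // device-context blob cache).
  auto fwd_desc = lrn_forward::desc(prop_kind::forward_training,
                                    algorithm::lrn_across_channels, src_md,
                                    local_size, alpha, beta, k);
  auto fwd_pd = lrn_forward::primitive_desc(fwd_desc, eng);

  // BWD_PD takes the freshly built FWD_PD as its hint.
  auto bwd_desc = lrn_backward::desc(algorithm::lrn_across_channels, src_md,
                                     diff_md, local_size, alpha, beta, k);
  return lrn_backward::primitive_desc(bwd_desc, eng, fwd_pd);
}

int main() {
  engine eng(engine::kind::cpu, 0);
  memory::desc md({2, 3, 8, 8}, memory::data_type::f32,
                  memory::format_tag::nchw);
  auto bwd_pd = make_lrn_bwd_pd(eng, md, md, /*local_size=*/5,
                                /*alpha=*/1e-4f, /*beta=*/0.75f, /*k=*/1.0f);
  (void)bwd_pd;
  return 0;
}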