add gradient kernel of det op and slogdet op #36013
Conversation
Thanks for your contribution!
" input tensor's, but here differ %d", | ||
input_dims_size - grad->dims().size())); | ||
} else { | ||
// checked in forward, pass |
DO NOT change this comment in this PR, since we are chasing a cherry-pick; change it in the future.
An empty block is not encouraged. You can put a PADDLE_ENFORCE here.
LGTM
LGTM for op benchmark ci
* add gradient kernel of det op and slogdet op
* fix CI APPROVAL problem
PR types
New features
PR changes
OPs
Describe
Adds the backward (gradient) implementation for paddle.linalg.det.
For invertible matrices, the formulas follow this paper: https://people.maths.ox.ac.uk/gilesm/files/NA-08-01.pdf
For non-invertible matrices, the det gradient is all zeros and the slogdet gradient is nan.
The main implementations for invertible matrices are:
det_grad: det_grad = (grad * det).unsqueeze(-1).unsqueeze(-2) * x.inverse().transpose(-2, -1)
slogdet_grad: slogdet_grad = grad.unsqueeze(-1).unsqueeze(-2) * x.inverse().conj().transpose(-2, -1)
All of these functions have corresponding C++ interfaces in Paddle, so the implementation calls those interfaces directly.
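As a sanity check (not the PR's C++ code), the two gradient formulas can be sketched in NumPy, which here stands in for Paddle's C++ tensor interfaces; the batched unsqueeze/transpose handling is omitted and a single square matrix is assumed:

```python
import numpy as np

def det_grad(grad, x):
    # d det(x) / dx = det(x) * inv(x)^T, scaled by the upstream grad
    return (grad * np.linalg.det(x)) * np.linalg.inv(x).T

def slogdet_grad(grad, x):
    # d log|det(x)| / dx = conj(inv(x))^T, scaled by the upstream grad
    return grad * np.linalg.inv(x).conj().T

# Finite-difference check of det_grad on a well-conditioned matrix.
x = np.array([[3.0, 1.0, 0.0],
              [1.0, 4.0, 1.0],
              [0.0, 1.0, 5.0]])
eps = 1e-6
num = np.zeros_like(x)
for i in range(3):
    for j in range(3):
        xp, xm = x.copy(), x.copy()
        xp[i, j] += eps
        xm[i, j] -= eps
        # central difference of det with respect to entry (i, j)
        num[i, j] = (np.linalg.det(xp) - np.linalg.det(xm)) / (2 * eps)
assert np.allclose(det_grad(1.0, x), num, atol=1e-4)
```

The finite-difference loop confirms that the analytic formula matches a numerical derivative of det entry by entry.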
Note: the backward values of slogdet can be very large (because of the log, the smaller the input, the larger the gradient). For example, for the input x = [[0.01, 0.02], [0.03, 0.04]], the gradient is x_grad = [[-200., 150.], [100., -50.]]; therefore the unit test adds a max_relative_error = 0.1 bound for the slogdet backward check.
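To make the note concrete, the example can be reproduced in NumPy (again a stand-in for the C++ kernel, assuming an upstream grad of 1):

```python
import numpy as np

# slogdet backward for the example input with upstream grad = 1:
# x_grad = inv(x)^T; small entries make the inverse, and hence the
# gradient, very large, which is why the test tolerance is relaxed.
x = np.array([[0.01, 0.02], [0.03, 0.04]])
x_grad = np.linalg.inv(x).T
print(x_grad)  # approximately [[-200., 150.], [100., -50.]]
```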