Add jacobian and hessian #53331
Conversation
Commit summary:
- …ograd.functional.Jacobian/Hessian
- support non-inplace math operations via magic method overriding
- increase TIMEOUT to 100
- refine docstring
Your PR has been submitted successfully. Thank you for your contribution to this open-source project!
❌ The PR was not created using the PR template. You can refer to this Demo.
Mostly some formatting and alignment issues.
python/paddle/autograd/autograd.py
Outdated
The ``xs`` tuples are identical in one-to-one correspondence.

- When ``batch_axis=None``, only 0-dimensional Tensor or 1-dimensional Tensor is
  supported, assuming the shape of ``xs`` is ``[N, ]``, the shape of ``ys`` is …
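As a quick illustration of that shape contract, here is a minimal sketch (assumptions: the conventional ``[M, N]`` output layout for a vector-valued ``ys``, and that the returned Jacobian object exposes ``shape`` the way the incubate version does):

```python
import paddle

x = paddle.randn([4])          # xs with shape [N], N = 4
x.stop_gradient = False
y = paddle.tanh(x)             # ys with shape [M], M = 4

J = paddle.autograd.jacobian(y, x)
print(J.shape)                 # assumed: [4, 4], i.e. [M, N]
```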
python/paddle/autograd/autograd.py
Outdated
Returns:

    Union[Tuple[Tuple[Jacobian, ...], ...], Tuple[Jacobian, ...], Jacobian]: Jacobian(s) of ys
    derived from xs.
python/paddle/autograd/autograd.py
Outdated
``(([M1, M1], [M1, M2]), ([M2, M1], [M2, M2]))``

- When ``batch_axis=None``, only 0-dimensional Tensor or 1-dimensional Tensor is
  supported, assuming that the shape of ``xs`` is ``[N, ]`` and the shape of ``ys`` is
  ``[ ]`` (0-dimensional Tensor), the final output is a single Hessian matrix whose
  shape is ``[N, N]``.
The start of the line should be aligned with ``When``, not with ``-``; same below.
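For reference, a minimal sketch of the ``batch_axis=None`` case described in this excerpt (assuming full indexing materializes the matrix, as in the incubate implementation):

```python
import paddle

x = paddle.randn([3])          # xs with shape [N], N = 3
x.stop_gradient = False
y = paddle.sum(x * x)          # ys is a 0-dimensional Tensor

H = paddle.autograd.hessian(y, x)
print(H[:])                    # assumed shape [N, N] = [3, 3]; equals 2 * I here
```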
python/paddle/autograd/autograd.py
Outdated
Returns:

    Union[Tuple[Tuple[Hessian, ...], ...], Tuple[Hessian, ...], Hessian]: Hessian(s) of ys
    derived from xs.
Align with the beginning of the previous line.
LGTM for set_tests_properties(test_autograd_dynamic PROPERTIES TIMEOUT 100)
@HydrogenSulfate The documentation issues will be fixed in a follow-up PR.
python/paddle/autograd/autograd.py
Outdated
```python
):
    xs_grad = xs_grad[0]
else:
    xs_grad = paddle.incubate.autograd.grad(ys, xs, v)
```
The code of the official API should not depend on code under incubate. Even if all the code in incubate were deleted or modified, the official API code must still run correctly.
Done
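For context, one incubate-free shape the helper could take, sketched under the assumption that it dispatches on graph mode like the quoted snippet (the name _compute_xs_grad and the exact branching are hypothetical, not taken from the PR):

```python
import paddle

def _compute_xs_grad(ys, xs, v=None):  # hypothetical helper name
    if paddle.in_dynamic_mode():
        # Stable dygraph API; create_graph=True keeps the graph so
        # higher-order derivatives (e.g. the Hessian) stay available.
        xs_grad = paddle.grad(
            ys, xs, grad_outputs=v, create_graph=True, allow_unused=True
        )
        if isinstance(xs, paddle.Tensor):
            xs_grad = xs_grad[0]  # unwrap the single-input case
    else:
        # Stable static-graph API, replacing paddle.incubate.autograd.grad.
        xs_grad = paddle.static.gradients(ys, xs, v)
    return xs_grad
```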
python/paddle/autograd/autograd.py
Outdated
```python
    return xs


def _grad(ys, xs, v=None):
```
This function name is quite short. Could the name reflect how it differs from grad?
I'm worried it will be taken for a basic API and get called by other code.
Done
5ee5a7d
LGTM
The documentation issues will be fixed later.
PR types
New features
PR changes
APIs
Description
Pcard-66961
Add two high-order differentiation APIs, paddle.autograd.jacobian and paddle.autograd.hessian, together with the related unit test file. Move paddle.incubate.autograd.Jacobian/Hessian to paddle.incubate.autograd.functional.Jacobian/Hessian.
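For illustration, a minimal end-to-end sketch of the two new APIs (signatures follow the docstrings quoted above; the assumption that the returned Jacobian/Hessian objects are materialized by indexing mirrors the incubate classes and may differ in detail):

```python
import paddle

x = paddle.randn([3])
x.stop_gradient = False

# Jacobian of a vector-valued ys (shape [M]) w.r.t. xs (shape [N]) -> [M, N].
y = paddle.tanh(x)
J = paddle.autograd.jacobian(y, x)
print(J[:])   # indexing is assumed to materialize the lazily built matrix

# Hessian of a scalar ys (shape []) w.r.t. xs (shape [N]) -> [N, N].
z = paddle.sum(x * x)
H = paddle.autograd.hessian(z, x)
print(H[:])   # expected 2 * I for z = sum(x ** 2)
```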