Add GaussianNLLLoss API. #50843
Conversation
Your PR has been submitted successfully. Thank you for contributing to the open-source project!
@GGBond8488 Hi, I have resubmitted the PR.
python/paddle/nn/functional/loss.py
Outdated
# Entries of var must be non-negative
# print(paddle.any(var < 0))
# if paddle.any(var < 0):
When checking var here in static graph mode, it is returned as a LoDTensor.
LoDTensor is a fairly old concept, but it should not affect this check.
Maybe it's my limited understanding of static graph mode, but could paddle.any(var < 0) output node information under static graph? When I test the static graph path, this check enters the inner branch of the conditional and returns an Error. The same code passes the tests in dynamic graph mode.
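As a toy illustration of what is being discussed here (this is not Paddle code, just a minimal stand-in model): in static graph mode, `paddle.any(var < 0)` returns a graph node, not data, so a plain Python `if` has nothing to test at graph-construction time and simply takes the branch:

```python
# Toy model of a symbolic graph variable: it records a name in a graph
# instead of holding data, so a Python `if` cannot inspect its value.
class SymbolicVar:
    def __init__(self, name):
        self.name = name

    def __bool__(self):
        # At graph-construction time there is no data to test; here we
        # mimic treating the node as truthy, so the branch body always runs.
        return True

cond = SymbolicVar("any_1.tmp_0")  # stands in for paddle.any(var < 0)

entered = False
if cond:  # always taken at "build" time, regardless of the actual data
    entered = True

print(entered)  # → True
```

This matches the behavior reported below: the `raise ValueError` branch runs during graph construction even though the printed node data is `[0]`.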
The error output from the test. The check code:

if paddle.any(var < 0):
    print('var', var)
    print(paddle.any(var < 0))
    raise ValueError("var has negative entry/entries")

The output:

var var Var : LOD_TENSOR.shape(10, 2).dtype(float32).stop_gradient(True)
E.E
var any_1.tmp_0 : LOD_TENSOR.shape(1,).dtype(bool).stop_gradient(False)
var var Var : LOD_TENSOR.shape(10, 2).dtype(float32).stop_gradient(True)
var any_3.tmp_0 : LOD_TENSOR.shape(1,).dtype(bool).stop_gradient(False)
> LoDTensor is a fairly old concept, but it should not affect this check.

I tried the cond() function and found that during graph construction both the true_fn() and false_fn() designed inside cond() are called, and the error inside the function is raised.
Then I checked whether anyone else uses raise ValueError in a loss function: in python\paddle\nn\functional\loss.py, triplet_margin_with_distance_loss at line 3526 checks the value of a parameter inside a node. As you said, that should not affect the check.
However, inspecting with Print() shows that the data inside the node is correct.

print(paddle.any(var < 0))
var_res = paddle.static.Print(paddle.any(var < 0))
# if paddle.any(var < 0):
#     raise ValueError("var has negative entry/entries")
================================================
Variable: any_1.tmp_0
- lod: {}
- place: Place(cpu)
- shape: [1]
- layout: NCHW
- dtype: bool
- data: [0]
Variable: any_3.tmp_0
- lod: {}
- place: Place(cpu)
- shape: [1]
- layout: NCHW
- dtype: bool
- data: [0]
Process finished with exit code 0
var any_0.tmp_0 : LOD_TENSOR.shape(1,).dtype(bool).stop_gradient(False)
var any_2.tmp_0 : LOD_TENSOR.shape(1,).dtype(bool).stop_gradient(False)
I0227 16:33:39.522938 20040 interpretercore.cc:273] New Executor is Running.
Ran 1 test in 0.204s
OK
If the check code is not added, the tests pass normally.
If the check code is added, execution still enters the conditional and returns an error.
Error
Traceback (most recent call last):
File "D:\PyWorkspace\Paddle\python\paddle\fluid\tests\unittests\test_gaussian_nll_loss.py", line 130, in test_static_case
out1,var_res = F.gaussian_nll_loss(
File "D:\Anaconda\envs\paddle_devcpu\lib\site-packages\paddle\nn\functional\loss.py", line 4003, in gaussian_nll_loss
raise ValueError("var has negative entry/entries")
ValueError: var has negative entry/entries
If the statement is changed to

if not paddle.all(var > 0):
    raise ValueError("var has negative entry/entries")

the tests also pass. But would this still be checking the node rather than the data?
Your judgment is correct. In static graph mode, the data of var is not available at graph-construction time, so this check errors out under static graph. There are two possible solutions:
- Add a C++ kernel that implements the computation and the data check at the kernel level; kernels run at computation time and can access the actual data
- Use the Assert OP from https://github.com/PaddlePaddle/Paddle/blob/develop/python/paddle/static/nn/control_flow.py#L43 to perform the numerical check and emit the message
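The key property of the Assert-OP approach is that the check is recorded as an op at build time and only evaluated at run time, when data exists. A minimal pure-Python sketch of that build/run split (illustrative names only, not Paddle's actual API):

```python
# Toy sketch of an op-style deferred check: at build time we only record
# the check; the condition is evaluated later, at run time, when data
# is available. Names here are illustrative, not Paddle's API.
def build_program(data):
    ops = []

    def assert_op(cond_fn, message):
        # Graph construction: record the check, evaluate nothing.
        ops.append((cond_fn, message))

    assert_op(lambda: all(v > 0 for v in data["var"]),
              "var has negative entry/entries")
    return ops

def run_program(ops):
    # Execution: the recorded checks are actually evaluated now.
    for cond_fn, message in ops:
        if not cond_fn():
            raise ValueError(message)
    return "OK"

ops = build_program({"var": [0.5, 1.2]})
print(run_program(ops))  # → OK
```

With a negative entry in `var`, `build_program` still succeeds; only `run_program` raises, which is exactly the behavior the eager `if paddle.any(...)` check could not provide under static graph.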
With cond here, the control flow builds the graph for every branch, which is why both true_fn and false_fn raise the exception.
Print() is actually also an op: it too takes part in graph construction, and only executes and prints at computation time.
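A toy sketch of why a control-flow `cond` surfaces errors from both branches (illustrative only, not Paddle's `cond`): tracing has to run each branch function once to record its subgraph, so an exception raised inside either branch escapes at build time, regardless of what the predicate would evaluate to at run time.

```python
# Toy tracer: both branch functions are executed once at graph-construction
# time so each branch's subgraph can be recorded.
def trace_cond(pred_node, true_fn, false_fn):
    traced = []
    traced.append(("true_fn", true_fn()))    # runs even if pred is False at run time
    traced.append(("false_fn", false_fn()))  # runs even if pred is True at run time
    return traced

log = trace_cond("any_1.tmp_0",
                 lambda: "built true branch",
                 lambda: "built false branch")
print(log)
```

If `true_fn` contains a `raise ValueError(...)`, tracing fails immediately, which matches the behavior described above.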
python/paddle/nn/functional/loss.py
Outdated
    'gaussian_nll_loss',
)
condition = paddle.all(var > 0)
Assert(condition)
Fill in the Assert arguments: pass in var's name and data so the error message is friendlier.
Done.
python/paddle/nn/layer/loss.py
Outdated
loss = F.multi_label_soft_margin_loss(input, target, var, reduction='none')
print(loss)

loss = F.multi_label_soft_margin_loss(input, target, var, reduction='mean')
The example code is incorrect.
python/paddle/nn/functional/loss.py
Outdated
loss = F.multi_label_soft_margin_loss(input, target, var, reduction='none')
print(loss)

loss = F.multi_label_soft_margin_loss(input, target, var, reduction='mean')
The example code is incorrect.
Sorry, I'm still not familiar with the git workflow and the earlier examples were lost. I'll add them back now.
Done.
    return [np.mean(loss)]


class TestGaussianNLLLossAPI(unittest.TestCase):
Everything else looks good. For this unit test, please split the different scenarios into separate test cases (put the cases in separate classes) so that a failing case can be located directly later.
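The suggested layout can be sketched as below. Here `gaussian_nll_loss` is a NumPy stand-in (not Paddle's API) so the structure is self-contained; the point is one scenario per test case class, so an `E`/`F` in the test report names the failing scenario directly:

```python
import unittest
import numpy as np

# Hypothetical NumPy stand-in for the loss under test (illustration only).
def gaussian_nll_loss(input, label, var, eps=1e-6, reduction='mean'):
    var = np.maximum(var, eps)
    loss = 0.5 * (np.log(var) + (input - label) ** 2 / var)
    return loss.mean() if reduction == 'mean' else loss

class TestGaussianNLLLossMean(unittest.TestCase):
    def test_mean_reduction(self):
        x = np.ones((4, 2)); y = np.zeros((4, 2)); v = np.ones((4, 2))
        out = gaussian_nll_loss(x, y, v, reduction='mean')
        # 0.5 * (log(1) + 1/1) = 0.5 for every element, so the mean is 0.5
        self.assertAlmostEqual(float(out), 0.5, places=6)

class TestGaussianNLLLossNone(unittest.TestCase):
    def test_none_reduction(self):
        x = np.ones((4, 2)); y = np.zeros((4, 2)); v = np.ones((4, 2))
        out = gaussian_nll_loss(x, y, v, reduction='none')
        self.assertEqual(out.shape, (4, 2))  # unreduced loss keeps input shape

suite = unittest.TestSuite()
loader = unittest.defaultTestLoader
suite.addTests(loader.loadTestsFromTestCase(TestGaussianNLLLossMean))
suite.addTests(loader.loadTestsFromTestCase(TestGaussianNLLLossNone))
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.wasSuccessful())  # → True
```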
Done.
LGTM
The code is fine, but the RFC needs to be modified in chapter 5.
Yes, sorry for my carelessness.
@Atlantisming You could open a PR to update the RFC.
output (Tensor): If ``reduction`` is ``'none'``, the shape of output is same as ``input``, else the shape of output is [1].

Examples::
    .. code-block:: python
A callable object of GaussianNLLLoss.

Examples::
    .. code-block:: python
Same here: add a blank line below to stay consistent.
2cb2432
LGTM
python/paddle/nn/layer/loss.py
Outdated
class GaussianNLLLoss(Layer):
    r"""Gaussian negative log likelihood loss.

    The targets are treated as samples from Gaussian distributions with
For the English documentation here, look at how other losses are described and reorganize the wording; BCELoss is a good reference. The same applies to the functional version above.
python/paddle/nn/functional/loss.py
Outdated
The targets are treated as samples from Gaussian distributions with
expectations and variance predicted by the neural network. For a
The ``label`` is treated as samples from Gaussian distributions with
expectations ``input`` and ``variance`` predicted by the neural network. For a
``label`` tensor modelled as having Gaussian distribution with a tensor
of expectations ``input`` and a tensor of positive ``variance`` the loss is:
Please describe this passage in your own words rather than borrowing it directly.
python/paddle/nn/functional/loss.py
Outdated
Gaussian negative log likelihood loss among ``input``, ``variance`` and
``label``. Note that the ``label`` is treated as samples from Gaussian distributions.
One of the interpretations is this class is used to train a neural network predicts
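For reference, the per-element loss described in the docstring, 0.5 * (log(max(var, eps)) + (input - label)^2 / max(var, eps)), can be sketched in NumPy. The eps clamp and the optional constant term follow the common definition of Gaussian NLL; treat this as an assumption, not Paddle's exact implementation:

```python
import numpy as np

def gaussian_nll_loss_ref(input, label, var, full=False, eps=1e-6,
                          reduction='mean'):
    """NumPy sketch of Gaussian negative log likelihood loss."""
    var = np.maximum(var, eps)  # clamp variance for numerical stability
    loss = 0.5 * (np.log(var) + (input - label) ** 2 / var)
    if full:  # optionally include the constant 0.5 * log(2 * pi) term
        loss += 0.5 * np.log(2 * np.pi)
    if reduction == 'mean':
        return loss.mean()
    if reduction == 'sum':
        return loss.sum()
    return loss

x = np.array([1.0, 2.0])
y = np.array([1.5, 2.0])
v = np.array([1.0, 0.5])
print(gaussian_nll_loss_ref(x, y, v, reduction='none'))
```

Note that with var < 1 and a perfect prediction, individual loss entries can be negative (log of a value below one), which is expected for an NLL of a continuous density.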
This should say function, not class. Also, wouldn't it read better to drop "One of the interpretations"? @sunzhongkai588 please take another look.
Also, please fix the PR-CI-Codestyle-Check failure.
@@ -17,7 +17,7 @@
import numpy as np

import paddle
import paddle.fluid.core as core
import paddle.fluid as core
variance (Tensor): tensor of positive variance(s), :math:`(N, *)` or :math:`(*)`, same shape as the input, or same shape as the input but
    with one dimension equal to 1, or same shape as the input but with one fewer
    dimension (to allow for broadcasting). One for each of the expectations
    in the input (heteroscedastic), or a single one (homoscedastic), available dtype is float32, float64.
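The allowed variance shapes can be illustrated with NumPy broadcasting standing in for Paddle's (an assumption for illustration; in particular, the "one fewer dimension" case only broadcasts after a trailing axis is added, which is how this convention is usually handled):

```python
import numpy as np

# Variance shapes the docstring allows, for an input of shape (N, D) = (10, 2):
inp = np.random.rand(10, 2)

het = np.random.rand(10, 2) + 0.1   # heteroscedastic: same shape as input
dim1 = np.random.rand(10, 1) + 0.1  # same shape but one dimension equal to 1
fewer = np.random.rand(10) + 0.1    # one fewer dimension: needs a trailing
                                    # axis added before it can broadcast

print(np.broadcast_shapes(inp.shape, het.shape))             # (10, 2)
print(np.broadcast_shapes(inp.shape, dim1.shape))            # (10, 2)
print(np.broadcast_shapes(inp.shape, fewer[:, None].shape))  # (10, 2)
```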
python/paddle/nn/layer/loss.py
Outdated
class GaussianNLLLoss(Layer):
    r"""Create a callable object of 'GaussianNLLLoss' to calculate Gaussian negative log likelihood loss.

    This class create a callable object of Gaussian negative log likelihood loss among ``input``,``variance`` and
Add a space before ``variance``.
LGTM
PR types
New features
PR changes
APIs
Describe
RFC document link: PaddlePaddle/community#372
Chinese documentation link: PaddlePaddle/docs#5623