
paddle.norm and torch.norm behave inconsistently #62240

Closed
JunnYu opened this issue Feb 29, 2024 · 1 comment

@JunnYu
Member

JunnYu commented Feb 29, 2024

Describe the Bug

The change in this PR is problematic: #60070
In the example below, torch computes a vector norm. Behavior was consistent up to paddle 2.6.0, but on the develop branch it diverges.
torch 2.1.0

import torch, paddle
torch.cuda.manual_seed(42)
a = torch.randn([2, 24, 24, 4], device='cuda:0')
# torch.norm computes the vector norm here
vector_o1 = torch.norm(a, p=2, dim=(1, 2), keepdim=True)
b = paddle.to_tensor(a.cpu().numpy())
# paddle.norm computes the matrix norm here
o2 = paddle.norm(b, p=2, axis=(1, 2), keepdim=True)
print(vector_o1, o2)
# tensor([[[[24.9556, 24.9024, 24.0531, 22.3612]]],


#         [[[22.2263, 22.8545, 23.2422, 24.4033]]]], device='cuda:0') 
# Tensor(shape=[2, 1, 1, 4], dtype=float32, place=Place(gpu:0), stop_gradient=True,
#        [[[[9.78923035 , 10.33647346, 9.06337070 , 8.42760468 ]]],


#         [[[9.00046349 , 9.06209278 , 8.59618092 , 9.77013969 ]]]])
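As a framework-neutral illustration of the two quantities being compared (a sketch using NumPy rather than torch or paddle; the shapes mirror the repro above), the vector 2-norm over axes (1, 2) is the square root of the sum of squares, while the matrix 2-norm is the largest singular value of each 24x24 slice:

```python
import numpy as np

rng = np.random.default_rng(0)
a = rng.standard_normal((2, 24, 24, 4)).astype(np.float32)

# Vector 2-norm over axes (1, 2): sqrt of the sum of squares
# (the behavior torch.norm shows in the report above).
vec = np.sqrt((a ** 2).sum(axis=(1, 2), keepdims=True))

# Matrix 2-norm over axes (1, 2): the largest singular value of each
# 24x24 slice (the behavior reported for paddle.norm on develop).
mat = np.linalg.norm(a, ord=2, axis=(1, 2), keepdims=True)

# Same output shape, different values; the matrix (spectral) norm is
# always bounded above by the vector (Frobenius) norm.
print(vec.shape, mat.shape)          # (2, 1, 1, 4) (2, 1, 1, 4)
print(np.allclose(vec, mat))         # False
print(np.all(mat <= vec + 1e-4))     # True
```

This makes the discrepancy in the printed tensors above unsurprising: for a random 24x24 slice the two norms differ substantially.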

Additional Supplementary Information

No response

@zhwesky2010
Contributor

zhwesky2010 commented Mar 1, 2024

@JunnYu Hi, paddle.linalg.norm/paddle.norm is aligned with torch.linalg.norm, not with torch.norm. torch.linalg.norm also computes the matrix norm here, so the two are consistent.

torch.norm is an early, irregularly designed torch API. torch plans to deprecate it, does not recommend using it, and does not guarantee its behavioral correctness: https://pytorch.org/docs/stable/generated/torch.norm.html#torch-norm

The correspondence between these APIs is as follows (the last one is not recommended):

paddle.norm == paddle.linalg.norm == torch.linalg.norm != torch.norm
