
[Typing][B-18] Add type annotations for python/paddle/distribution/laplace.py #65784

Merged
4 commits merged into PaddlePaddle:develop on Jul 13, 2024

Conversation

ooooo-create (Contributor)

PR Category

User Experience

PR Types

Improvements

Description

Type annotations (sketched below) for:
- python/paddle/distribution/laplace.py
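For context, a rough sketch of the annotations as first pushed, reconstructed from the review comments below (the alias definition and the final merged signatures may differ):

```python
from __future__ import annotations

from typing import TYPE_CHECKING, Union

import numpy as np
from typing_extensions import TypeAlias

if TYPE_CHECKING:
    from paddle import Tensor

# Alias quoted in the review thread below.
Numberic: TypeAlias = Union[int, float, complex, np.number, "Tensor"]


class Laplace:
    # As first pushed; the review below narrows the attributes to `Tensor`
    # and the parameters to `float | Tensor`.
    loc: Numberic
    scale: Numberic

    def __init__(self, loc: Numberic, scale: Numberic) -> None: ...
```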

Related links

@SigureMo @megemini


paddle-bot bot commented Jul 8, 2024

Your PR has been submitted. Thanks for your contribution!
Please wait for the CI results first. See the Paddle CI Manual for details.

@paddle-bot added the contributor (External developers) label on Jul 8, 2024
@luotao1 added the HappyOpenSource (Happy Open Source campaign issues and PRs) label on Jul 8, 2024

- def __init__(self, loc, scale):
+ def __init__(self, loc: Numberic, scale: Numberic) -> None:
Review comment (Contributor):

float | Tensor

It seems that `complex` in `Numberic: TypeAlias = Union[int, float, complex, np.number, "Tensor"]` is not actually supported?

In [4]:             >>> import paddle
   ...:             >>> paddle.seed(2023)
   ...:             >>> m = paddle.distribution.Laplace(1+1j, 2+1j)
---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
Cell In[4], line 3
      1 import paddle
      2 paddle.seed(2023)
----> 3 m = paddle.distribution.Laplace(1+1j, 2+1j)

File ~/venv38dev/lib/python3.8/site-packages/paddle/distribution/laplace.py:60, in Laplace.__init__(self, loc, scale)
     56 def __init__(self, loc, scale):
     57     if not isinstance(
     58         loc, (numbers.Real, framework.Variable, paddle.pir.Value)
     59     ):
---> 60         raise TypeError(
     61             f"Expected type of loc is Real|Variable, but got {type(loc)}"
     62         )
     64     if not isinstance(
     65         scale, (numbers.Real, framework.Variable, paddle.pir.Value)
     66     ):
     67         raise TypeError(
     68             f"Expected type of scale is Real|Variable, but got {type(scale)}"
     69         )

TypeError: Expected type of loc is Real|Variable, but got <class 'complex'>
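For reference, a minimal sketch of the narrowing being suggested here, assuming the hint should simply mirror the runtime isinstance check above (the exact signature that was merged may differ):

```python
from __future__ import annotations

from typing import TYPE_CHECKING

if TYPE_CHECKING:
    from paddle import Tensor


class Laplace:
    def __init__(self, loc: float | Tensor, scale: float | Tensor) -> None:
        # `complex` is rejected by the runtime check (numbers.Real / Variable),
        # so it is left out of the annotation.
        ...
```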

ooooo-create (Contributor, Author) replied:

Yes, this has been fixed~

Comment on lines 61 to 62
loc: Numberic
scale: Numberic
Review comment (Contributor):

Suggested change:
- loc: Numberic
- scale: Numberic
+ loc: Tensor
+ scale: Tensor

After instantiation, both loc and scale have been converted to Tensor.
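A quick way to confirm that, assuming a recent Paddle build in dynamic graph mode (output is illustrative):

```python
>>> import paddle
>>> m = paddle.distribution.Laplace(0.0, 1.0)
>>> isinstance(m.loc, paddle.Tensor), isinstance(m.scale, paddle.Tensor)
(True, True)
```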

Six additional review threads on python/paddle/distribution/laplace.py were marked as outdated and resolved.
@SigureMo (Member) left a comment:

LGTMeow 🐾

@SigureMo SigureMo merged commit fe24254 into PaddlePaddle:develop Jul 13, 2024
30 of 32 checks passed
lixcli pushed a commit to lixcli/Paddle that referenced this pull request Jul 22, 2024
Labels: contributor (External developers), HappyOpenSource (Happy Open Source campaign issues and PRs)
4 participants