[Typing][C-74] Add type annotations for python/paddle/incubate/nn/functional/fused_matmul_bias.py
#66656
Conversation
Your PR was submitted successfully. Thank you for your contribution to this open source project!
```python
    bias: Tensor,
    trans_x: bool = False,
    trans_y: bool = False,
    activation: str | None = None,
```
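For context, the fragment above could slot into a full signature roughly like the sketch below. Only the four annotated lines appear in the diff; the function name `fused_linear_activation` and the leading `x`/`y` parameters are assumptions about the rest of the file.

```python
# Sketch only: fills in the parts of the signature not shown in the diff.
# The function name and the x/y parameters are assumed, not confirmed by
# the fragment above.
from __future__ import annotations

from paddle import Tensor


def fused_linear_activation(
    x: Tensor,
    y: Tensor,
    bias: Tensor,
    trans_x: bool = False,
    trans_y: bool = False,
    activation: str | None = None,
) -> Tensor: ...
```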
activation (str, optional): The activation function. Currently, the available activation functions are limited to "gelu" (Gaussian Error Linear Unit) and "relu" (Rectified Linear Unit). These activation functions are applied to the output of the bias add. Default: None.
You could use `Literal` to constrain `activation`.
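A minimal sketch of that suggestion, assuming the two values documented in the docstring quoted above:

```python
from __future__ import annotations

from typing import Literal

# Would replace `activation: str | None = None` in the signature sketch
# above; Literal narrows the accepted strings to the two documented values.
activation: Literal["gelu", "relu"] | None = None
```

With this annotation, a static type checker rejects a call passing, say, `activation="tanh"`, while still accepting `None`.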
Thanks for the reminder; I've pushed a commit with the change.
python/paddle/incubate/nn/functional/fused_matmul_bias.py
…nctional/fused_matmul_bias.py` (PaddlePaddle#66656)
PR Category
User Experience
PR Types
Improvements
Description
Type annotations for `python/paddle/incubate/nn/functional/fused_matmul_bias.py`.
Related links
@SigureMo @megemini