
[Typing][B-94] Add type annotations for python/paddle/hapi/dynamic_flops.py #67204

Merged
merged 6 commits into from
Aug 12, 2024

Conversation

enkilee
Contributor

@enkilee enkilee commented Aug 8, 2024

PR Category

User Experience

PR Types

Improvements

Description


paddle-bot bot commented Aug 8, 2024

Your PR has been submitted. Thanks for your contribution!
Please wait for the CI result first. See the Paddle CI Manual for details.

@luotao1 luotao1 added contributor External developers HappyOpenSource 快乐开源活动issue与PR labels Aug 8, 2024
python/paddle/hapi/dynamic_flops.py — three review threads (outdated, resolved)
Contributor

@megemini megemini left a comment


Also, after making the changes, check the example-code run results in CI ~

The current output mismatch in the example code may be caused by a different numpy version; consider SKIP-ping the table printing ~ Original example:

            >>> FLOPs = paddle.flops(lenet,
            ...                      [1, 1, 28, 28],
            ...                      custom_ops= {nn.LeakyReLU: count_leaky_relu},
            ...                      print_detail=True)
            >>> print(FLOPs)
            <class 'paddle.nn.layer.conv.Conv2D'>'s flops has been counted
            <class 'paddle.nn.layer.activation.ReLU'>'s flops has been counted
            Cannot find suitable count function for <class 'paddle.nn.layer.pooling.MaxPool2D'>. Treat it as zero FLOPs.
            <class 'paddle.nn.layer.common.Linear'>'s flops has been counted
            +--------------+-----------------+-----------------+--------+--------+
            |  Layer Name  |   Input Shape   |   Output Shape  | Params | Flops  |
            +--------------+-----------------+-----------------+--------+--------+
            |   conv2d_0   |  [1, 1, 28, 28] |  [1, 6, 28, 28] |   60   | 47040  |
            |   re_lu_0    |  [1, 6, 28, 28] |  [1, 6, 28, 28] |   0    |   0    |
            | max_pool2d_0 |  [1, 6, 28, 28] |  [1, 6, 14, 14] |   0    |   0    |
            |   conv2d_1   |  [1, 6, 14, 14] | [1, 16, 10, 10] |  2416  | 241600 |
            |   re_lu_1    | [1, 16, 10, 10] | [1, 16, 10, 10] |   0    |   0    |
            | max_pool2d_1 | [1, 16, 10, 10] |  [1, 16, 5, 5]  |   0    |   0    |
            |   linear_0   |     [1, 400]    |     [1, 120]    | 48120  | 48000  |
            |   linear_1   |     [1, 120]    |     [1, 84]     | 10164  | 10080  |
            |   linear_2   |     [1, 84]     |     [1, 10]     |  850   |  840   |
            +--------------+-----------------+-----------------+--------+--------+
            Total Flops: 347560     Total Params: 61610
            347560

Change it to:

            >>> FLOPs = paddle.flops(lenet,
            ...                      [1, 1, 28, 28],
            ...                      custom_ops= {nn.LeakyReLU: count_leaky_relu},
            ...                      print_detail=True)
            >>> # doctest: +SKIP('numpy print with different version')
            <class 'paddle.nn.layer.conv.Conv2D'>'s flops has been counted
            <class 'paddle.nn.layer.activation.ReLU'>'s flops has been counted
            Cannot find suitable count function for <class 'paddle.nn.layer.pooling.MaxPool2D'>. Treat it as zero FLOPs.
            <class 'paddle.nn.layer.common.Linear'>'s flops has been counted
            +--------------+-----------------+-----------------+--------+--------+
            |  Layer Name  |   Input Shape   |   Output Shape  | Params | Flops  |
            +--------------+-----------------+-----------------+--------+--------+
            |   conv2d_0   |  [1, 1, 28, 28] |  [1, 6, 28, 28] |   60   | 47040  |
            |   re_lu_0    |  [1, 6, 28, 28] |  [1, 6, 28, 28] |   0    |   0    |
            | max_pool2d_0 |  [1, 6, 28, 28] |  [1, 6, 14, 14] |   0    |   0    |
            |   conv2d_1   |  [1, 6, 14, 14] | [1, 16, 10, 10] |  2416  | 241600 |
            |   re_lu_1    | [1, 16, 10, 10] | [1, 16, 10, 10] |   0    |   0    |
            | max_pool2d_1 | [1, 16, 10, 10] |  [1, 16, 5, 5]  |   0    |   0    |
            |   linear_0   |     [1, 400]    |     [1, 120]    | 48120  | 48000  |
            |   linear_1   |     [1, 120]    |     [1, 84]     | 10164  | 10080  |
            |   linear_2   |     [1, 84]     |     [1, 10]     |  850   |  840   |
            +--------------+-----------------+-----------------+--------+--------+
            Total Flops: 347560     Total Params: 61610
            >>> # doctest: -SKIP
            >>> print(FLOPs)
            347560

In other words, only check the numerical correctness of the FLOPs value ~
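The example above passes a `count_leaky_relu` hook via `custom_ops`, but its body is not shown in this thread. Below is a minimal, hypothetical sketch of what such a counter could look like; the hook signature `(layer, inputs, output)` and the one-op-per-element approximation are assumptions, and a plain stand-in object replaces the real `paddle.nn.LeakyReLU` layer so the sketch runs without paddle installed.

```python
from functools import reduce
from operator import mul


class FakeLayer:
    """Stand-in for a paddle.nn.LeakyReLU layer (hypothetical)."""

    def __init__(self) -> None:
        # Paddle's FLOPs counting accumulates into a per-layer counter;
        # the attribute name here is an assumption for illustration.
        self.total_ops = 0


def count_leaky_relu(m, x, y_shape) -> None:
    """Approximate LeakyReLU FLOPs as one op per output element."""
    num_elements = reduce(mul, y_shape, 1)
    m.total_ops += num_elements


layer = FakeLayer()
count_leaky_relu(layer, None, (1, 6, 28, 28))
print(layer.total_ops)  # 4704
```

The key point matches the review advice: the numeric result is deterministic and checkable, while table rendering may vary across environments.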

Also, the typing changes may need `# type: ignore` in a few places; let's wait for the CI results ~
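For context on the kind of annotations this PR adds: a hypothetical sketch of how a `custom_ops` mapping and the `flops` entry point might be typed. The alias name `CountHook` and the exact parameter types are illustrative assumptions, not paddle's actual API.

```python
from __future__ import annotations

from typing import Any, Callable, Dict, Optional, Type

# A count hook receives (layer, inputs, output) and returns nothing;
# this alias name is an assumption for illustration.
CountHook = Callable[[Any, Any, Any], None]


def flops(
    net: Any,
    input_size: list[int],
    custom_ops: Optional[Dict[Type[Any], CountHook]] = None,
    print_detail: bool = False,
) -> int:
    """Stub showing the annotated signature only (hypothetical)."""
    ...
```

With `from __future__ import annotations`, the annotations are evaluated lazily, which is the usual way such files stay compatible with older Python versions.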

python/paddle/hapi/dynamic_flops.py — two review threads (outdated, resolved)
Contributor

@megemini megemini left a comment


LGTM ~

Member

@SigureMo SigureMo left a comment


LGTMeow 🐾

@luotao1 luotao1 merged commit 17ec28c into PaddlePaddle:develop Aug 12, 2024
31 checks passed
Jeff114514 pushed a commit to Jeff114514/Paddle that referenced this pull request Aug 14, 2024
4 participants