
[Inference] add add_shadow_output_after_dead_parameter_pass #68476

Conversation

Contributor

@yuanlehome yuanlehome commented Sep 26, 2024

PR Category

Inference

PR Types

Others

Description

pcard-71500

Why is this pass needed?

It supports the functionality of paddle.nn.Layer.register_buffer. register_buffer produces a "dead" builtin.parameter_op in the model structure, meaning an op whose output has no users (dynamic-to-static model saving does not prune such ops, see PR#68442), so in the IR it is treated as dead code. Left as-is, it would be deleted by dead_code_elimination_pass during inference IR optimization, which is not the intended behavior. This PR therefore adds add_shadow_output_after_dead_parameter_pass, which inserts a shadow_output op after the dead op to mark the parameter_op's output as used, so dead_code_elimination_pass no longer removes it by mistake.
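The interaction between the dead parameter op, the shadow_output, and DCE can be illustrated with a toy IR. This is a minimal sketch in plain Python; the `Op` class and the simplified DCE loop are hypothetical stand-ins for illustration, not the actual PIR APIs:

```python
class Op:
    """A toy IR op; operands are the ops whose results it consumes."""

    def __init__(self, name, operands=()):
        self.name = name
        self.operands = list(operands)

    def use_empty(self, block):
        # An op is "dead" if no other op in the block consumes its result.
        return all(self not in other.operands for other in block)


def dead_code_elimination(block):
    # Repeatedly drop ops with no users, like dead_code_elimination_pass.
    # shadow_output ops represent block outputs, so they are never dead.
    changed = True
    while changed:
        changed = False
        for op in list(block):
            if op.name != "shadow_output" and op.use_empty(block):
                block.remove(op)
                changed = True


def add_shadow_output_after_dead_parameter(block):
    # For every parameter op with no users, append a shadow_output that
    # consumes its result, marking it as used so DCE keeps it.
    for op in list(block):
        if op.name == "parameter" and op.use_empty(block):
            block.append(Op("shadow_output", operands=[op]))


# A buffer saved via register_buffer: a parameter op that nobody reads.
buffer_param = Op("parameter")
block = [buffer_param]

add_shadow_output_after_dead_parameter(block)
dead_code_elimination(block)
assert buffer_param in block  # survives DCE thanks to its shadow_output
```

Without the pass, the lone parameter op has no users and the DCE loop removes it, which is exactly the failure mode described above.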

@yuanlehome yuanlehome force-pushed the add_shadow_output_after_dead_parameter_pass branch from f380c85 to bf44ca0 on September 26, 2024 08:46
Contributor

@vivienfanghuagood vivienfanghuagood left a comment


Is adding a shadow_output really the only way to avoid pruning by DCE?

@yuanlehome
Contributor Author

Is adding a shadow_output really the only way to avoid pruning by DCE?

I haven't come up with a better approach so far.

Comment on lines +23 to +37
class AddShadowOutputAfterDeadParameterPattern
    : public pir::OpRewritePattern<pir::ParameterOp> {
 public:
  using pir::OpRewritePattern<pir::ParameterOp>::OpRewritePattern;
  bool MatchAndRewrite(
      pir::ParameterOp op,
      pir::PatternRewriter& rewriter) const override {  // NOLINT
    // Only match "dead" parameter ops, i.e. those whose result has no users.
    if (!op->use_empty()) {
      return false;
    }
    // Append a shadow_output at the end of the block that consumes the
    // parameter's result, so dead_code_elimination_pass will not remove it.
    rewriter.SetInsertionPointToBlockEnd(op->GetParent());
    rewriter.Build<pir::ShadowOutputOp>(op->result(0), op.param_name());
    return true;
  }
};
Contributor

Doesn't this mean that no ParameterOp will ever be removed by DCE?

Contributor Author

No. On the inference side this pass runs before all other passes, so only the ParameterOps present in the original model are exempted from DCE.

@yuanlehome yuanlehome merged commit 9f80c7f into PaddlePaddle:develop Sep 27, 2024
26 of 27 checks passed