
[Phi] Move batch size like infershape into phi #40847

Conversation

@chenwhql (Contributor) commented Mar 23, 2022

PR types

Function optimization

PR changes

OPs

Describe

[Phi] Move batch size like infershape into phi
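For context, the usual pattern for this kind of migration is to implement the shape inference as a phi InferMeta function operating on MetaTensor and wire the existing fluid operator to it through the PD_INFER_META / DECLARE_INFER_SHAPE_FUNCTOR macros. Below is a minimal sketch of that wiring, not code taken verbatim from this PR: fill_constant_batch_size_like is just one example of a batch-size-like op, and the functor, op class, and phi::FullBatchSizeLikeInferMeta names are illustrative stand-ins for whatever is actually registered.

// Operator .cc file (sketch): route the op's InferShape to a phi InferMeta function.
DECLARE_INFER_SHAPE_FUNCTOR(fill_constant_batch_size_like,
                            FillConstantBatchSizeLikeInferShapeFunctor,
                            PD_INFER_META(phi::FullBatchSizeLikeInferMeta));

REGISTER_OPERATOR(fill_constant_batch_size_like,
                  ops::FillConstantBatchSizeLikeOp,
                  ops::FillConstantBatchSizeLikeOpMaker,
                  FillConstantBatchSizeLikeInferShapeFunctor);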

out_batch_size_dim));

output_dim[out_batch_size_dim] = x.dims()[x_batch_size_dim];
out->set_dims(output_dim);
Contributor:

There's no set dtype here?

@chenwhql (Contributor, Author) replied on Mar 24, 2022:

It isn't needed here; setting it caused errors, so it was removed. The dtype has to be set later, in each op's specific function.
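A minimal sketch of that split (assumed parameter lists; only the BatchSizeLikeInferMeta name appears in the diff below, and FullBatchSizeLikeInferMeta is a hypothetical op-specific function used for illustration):

// Shared helper: resolves only the output dims from x's batch dimension.
void BatchSizeLikeInferMeta(const MetaTensor& x,
                            const std::vector<int>& shape,
                            int x_batch_size_dim,
                            int out_batch_size_dim,
                            MetaTensor* out) {
  std::vector<int64_t> shape_int64(shape.begin(), shape.end());
  auto output_dim = phi::make_ddim(shape_int64);
  output_dim[out_batch_size_dim] = x.dims()[x_batch_size_dim];
  out->set_dims(output_dim);
  // Intentionally no out->set_dtype(...) here.
}

// Op-specific InferMeta (hypothetical): reuses the helper, then sets the dtype itself.
void FullBatchSizeLikeInferMeta(const MetaTensor& x,
                                const std::vector<int>& shape,
                                const Scalar& value,
                                DataType dtype,
                                int x_batch_size_dim,
                                int out_batch_size_dim,
                                MetaTensor* out) {
  BatchSizeLikeInferMeta(x, shape, x_batch_size_dim, out_batch_size_dim, out);
  out->set_dtype(dtype);
}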

@@ -138,6 +138,59 @@ void CastInferMeta(const MetaTensor& x, DataType out_dtype, MetaTensor* out) {
out->set_layout(x.layout());
}

void BatchSizeLikeInferMeta(const MetaTensor& x,
Contributor:

This function should be placed before CastInferMeta.

@chenwhql (Contributor, Author) replied:

done, thx

@chenwhql chenwhql merged commit 6d3db9c into PaddlePaddle:develop Mar 24, 2022