【PIR OpTest Fix No.28】 fix test_fused_adam_op #62770

Merged: 4 commits, Mar 20, 2024
Changes from all commits
1 change: 1 addition & 0 deletions paddle/fluid/pir/dialect/op_generator/ops_api_gen.py
@@ -143,6 +143,7 @@
 'dpsgd',
 'embedding_grad_sparse',
 'ftrl',
+'fused_adam_',
 'fused_batch_norm_act_',
 'fused_bn_add_activation_',
 'fused_elemwise_add_activation',
2 changes: 1 addition & 1 deletion paddle/fluid/pir/dialect/operator/ir/ops.yaml
@@ -749,7 +749,7 @@
 kernel :
 func : fused_adam
 data_type : params
-optional : skip_update, master_params
+optional : skip_update, master_params, master_params_out
 inplace : (params -> params_out), (moments1 -> moments1_out), (moments2 -> moments2_out), (beta1_pows -> beta1_pows_out), (beta2_pows -> beta2_pows_out), (master_params -> master_params_out)

 - op : fused_batch_norm_act
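For context: fused_adam applies the Adam update to a whole list of parameters in one kernel launch, and master_params / master_params_out are only used when multi-precision training keeps FP32 master copies of low-precision parameters, which is why the output has to be optional alongside the already-optional input. A rough NumPy sketch of the per-tensor update (standard bias-corrected Adam, not Paddle's exact kernel code):

import numpy as np

def adam_update(param, grad, m1, m2, beta1_pow, beta2_pow,
                lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One bias-corrected Adam step for a single tensor (reference sketch only)."""
    m1 = beta1 * m1 + (1 - beta1) * grad
    m2 = beta2 * m2 + (1 - beta2) * grad * grad
    m1_hat = m1 / (1 - beta1_pow)
    m2_hat = m2 / (1 - beta2_pow)
    param = param - lr * m1_hat / (np.sqrt(m2_hat) + eps)
    # The fused op also advances the beta power accumulators for the next step.
    return param, m1, m2, beta1_pow * beta1, beta2_pow * beta2

# fused_adam loops this update over the params/grads/moments lists in one launch.
# With multi_precision enabled, the FP32 master_params receive the update and the
# low-precision params are refreshed from them, producing master_params_out; in
# single-precision mode no master copy exists, hence master_params_out is optional.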
9 changes: 9 additions & 0 deletions paddle/phi/api/yaml/op_compat.yaml
@@ -1254,6 +1254,15 @@
 data_type : float
 support_tensor : true

+- op : fused_adam_(fused_adam)
+  inputs :
+    {params : Params, grads : Grads, learning_rate : LearningRate, moments1 : Moments1,
+     moments2 : Moments2, beta1_pows : Beta1Pows, beta2_pows : Beta2Pows, master_params : MasterParams,
+     skip_update : SkipUpdate}
+  outputs :
+    {params_out : ParamsOut, moments1_out : Moments1Out, moments2_out : Moments2Out,
+     beta1_pows_out : Beta1PowsOut, beta2_pows_out : Beta2PowsOut, master_params_out : MasterParamsOut}
+
 - op : fused_attention
 backward: fused_attention_grad
 inputs:
1 change: 1 addition & 0 deletions test/white_list/pir_op_test_white_list
@@ -109,6 +109,7 @@ test_fold_op
 test_frame_op
 test_ftrl_op
 test_full_like_op
+test_fused_adam_op
 test_fused_attention_op
 test_fused_attention_op_api
 test_fused_bias_dropout_residual_layer_norm_op
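Whitelisting the test lets the existing OpTest also run under PIR. A minimal sketch of a local check, assuming the flag name FLAGS_enable_pir_in_executor and the test location test/legacy_test/test_fused_adam_op.py (both are assumptions and may differ in your checkout):

import os
import subprocess

# Run the whitelisted OpTest with PIR enabled in the executor (assumed flag name).
env = dict(os.environ, FLAGS_enable_pir_in_executor="1")
subprocess.run(
    ["python", "test/legacy_test/test_fused_adam_op.py"],
    env=env,
    check=True,
)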