Commit 1d7ebff

nguyen599 and SunMarc authored
Fix - remove deprecated args checking in deepspeed integrations (#41282)
Remove deprecated args checking in deepspeed integrations Signed-off-by: nguyen599 <pnvmanh2123@gmail.com> Co-authored-by: Marc Sun <57196510+SunMarc@users.noreply.github.com>
1 parent 9d02602 commit 1d7ebff

File tree

1 file changed: +0 −5 lines changed

src/transformers/integrations/deepspeed.py

Lines changed: 0 additions & 5 deletions
Lines changed: 0 additions & 5 deletions

@@ -356,11 +356,6 @@ def deepspeed_optim_sched(trainer, hf_deepspeed_config, args, num_training_steps
 
     optimizer = None
     if "optimizer" in config:
-        if args.optim == "adafactor":
-            raise ValueError(
-                "--adafactor was passed, but also found `optimizer` configured in the DeepSpeed config. "
-                "Only one optimizer can be configured."
-            )
         optimizer = DummyOptim(params=model_parameters)
     else:
         if hf_deepspeed_config.is_offload():
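For context on the branch being edited: when the DeepSpeed config file supplies its own `optimizer` section, the Trainer hands DeepSpeed a `DummyOptim` placeholder instead of building an optimizer itself, and the five deleted lines used to raise early if `--adafactor` was also passed. A minimal, self-contained sketch of that selection logic (the `select_optimizer` and `build_hf_optimizer` names are hypothetical, and `DummyOptim` here is a stand-in for the real class used by `transformers`):

```python
class DummyOptim:
    """Stand-in for the placeholder optimizer the Trainer passes to DeepSpeed."""

    def __init__(self, params):
        self.params = params


def select_optimizer(ds_config, model_parameters, build_hf_optimizer):
    """Pick an optimizer the way the surrounding code does.

    If the DeepSpeed config defines its own ``optimizer`` section, DeepSpeed
    owns optimizer construction, so return a placeholder; otherwise fall back
    to building a regular Hugging Face optimizer.
    """
    if "optimizer" in ds_config:
        # DeepSpeed will instantiate the optimizer from its config section.
        return DummyOptim(params=model_parameters)
    # No optimizer in the DeepSpeed config: the Trainer builds its own.
    return build_hf_optimizer(model_parameters)


opt = select_optimizer({"optimizer": {"type": "AdamW"}}, [0.1, 0.2], lambda p: None)
print(type(opt).__name__)  # DummyOptim
```

After this commit, the `--adafactor` conflict check no longer runs before the `DummyOptim` branch; the config's `optimizer` section simply takes precedence.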
