Merge pull request huggingface#18 from microsoft/sukha/permit-ddp-on-ortmodule

Permit DDP wrapping for ORTModule
Suffian Khan authored Jul 9, 2021
2 parents 25ea1d2 + 55c79d3 commit c1b9595
Showing 1 changed file with 6 additions and 1 deletion.
7 changes: 6 additions & 1 deletion in src/transformers/trainer.py
@@ -921,7 +921,12 @@ def _wrap_model(self, model, training=True):

         # train/eval could be run multiple-times - if already wrapped, don't re-wrap it again
         if unwrap_model(model) is not model:
-            return model
+            if self.args.ort:
+                from torch_ort import ORTModule
+                if type(model) is not ORTModule:
+                    return model
+            else:
+                return model

         # Mixed precision training with apex (torch < 1.6)
         if self.use_apex and training:
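The effect of the change can be sketched in isolation. Before this commit, `_wrap_model` returned early whenever the model was already wrapped in anything; after it, with `--ort` enabled, a model whose outermost wrapper is `ORTModule` is still allowed through so it can additionally be wrapped in DDP. The sketch below is a hedged reconstruction, not the Trainer code: `ORTModule`, `DDP`, and `unwrap_model` are minimal stand-ins for `torch_ort.ORTModule`, `torch.nn.parallel.DistributedDataParallel`, and `transformers.modeling_utils.unwrap_model`, and `should_skip_wrapping` is a hypothetical helper that mirrors the new early-return guard.

```python
class ORTModule:
    """Stand-in for torch_ort.ORTModule (hypothetical, for illustration)."""
    def __init__(self, module):
        self.module = module


class DDP:
    """Stand-in for torch.nn.parallel.DistributedDataParallel."""
    def __init__(self, module):
        self.module = module


def unwrap_model(model):
    """Recursively strip wrapper layers, mirroring transformers' unwrap_model."""
    if hasattr(model, "module"):
        return unwrap_model(model.module)
    return model


def should_skip_wrapping(model, ort_enabled):
    """Return True when _wrap_model should return the model unchanged.

    Hypothetical helper that mirrors the guard introduced by this commit.
    """
    if unwrap_model(model) is not model:  # model is already wrapped in something
        if ort_enabled:
            # With ORT enabled, a bare ORTModule still needs DDP wrapping,
            # so only skip when the outer wrapper is NOT ORTModule.
            return type(model) is not ORTModule
        # Without ORT, any existing wrapper means: don't re-wrap.
        return True
    return False


class Inner:
    """Plain unwrapped model."""


# An ORTModule-wrapped model is no longer treated as "already wrapped",
# so DDP wrapping can still be applied on top of it.
assert should_skip_wrapping(Inner(), ort_enabled=True) is False
assert should_skip_wrapping(ORTModule(Inner()), ort_enabled=True) is False
assert should_skip_wrapping(DDP(ORTModule(Inner())), ort_enabled=True) is True
assert should_skip_wrapping(DDP(Inner()), ort_enabled=False) is True
```

The one-line deletion was replaced by this branch precisely so that DDP can wrap an `ORTModule` once, while everything already wrapped in DDP (or anything else) still short-circuits as before.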
