Fix fp16 (#6543) (#6544)
Signed-off-by: MaximumEntropy <sandeep.subramanian.1@umontreal.ca>
Co-authored-by: Sandeep Subramanian <sandeep.subramanian.1@umontreal.ca>
github-actions[bot] and MaximumEntropy authored May 4, 2023
1 parent 46bc357 · commit f495887
Showing 1 changed file with 2 additions and 0 deletions.
examples/nlp/language_modeling/megatron_gpt_eval.py (2 additions, 0 deletions)
@@ -196,6 +196,8 @@ def main(cfg) -> None:
             pretrained_cfg.activations_checkpoint_granularity = None
             pretrained_cfg.activations_checkpoint_method = None
             pretrained_cfg.precision = trainer.precision
+            if trainer.precision == "16":
+                pretrained_cfg.megatron_amp_O2 = False
         model = MegatronGPTModel.restore_from(
             restore_path=cfg.gpt_model_file,
             trainer=trainer,

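For context, a minimal sketch of the config-patching step this two-line fix lands in. The surrounding open_dict block mirrors how megatron_gpt_eval.py mutates the restored config before loading weights, but the helper name patch_pretrained_cfg is hypothetical and only the two "+" lines are taken verbatim from the commit.

# Minimal sketch (assumed context, not the verbatim megatron_gpt_eval.py).
# Only the two added lines come from this commit; everything else is illustrative.
from omegaconf import open_dict


def patch_pretrained_cfg(pretrained_cfg, trainer):
    """Apply eval-time overrides to the restored model config before loading weights."""
    with open_dict(pretrained_cfg):
        pretrained_cfg.activations_checkpoint_granularity = None
        pretrained_cfg.activations_checkpoint_method = None
        pretrained_cfg.precision = trainer.precision
        # The fix: when the trainer reports 16-bit precision as the string "16",
        # disable O2-style Megatron AMP for fp16 evaluation.
        if trainer.precision == "16":
            pretrained_cfg.megatron_amp_O2 = False
    return pretrained_cfg

The guard compares against the string "16" exactly as in the committed diff, matching how the trainer's precision value is exposed in this code path.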