Compile FFT FSDP (#1573)
Co-authored-by: Felipe Mello <felipemello@fb.com>
felipemello1 and Felipe Mello authored Sep 13, 2024
1 parent 4d3e48a commit ee343e6
Showing 1 changed file with 3 additions and 0 deletions.
3 changes: 3 additions & 0 deletions recipes/full_finetune_distributed.py
@@ -362,6 +362,9 @@ def _setup_model(
         with training.set_default_dtype(self._dtype), torch.device("meta"):
             model = config.instantiate(cfg_model)
 
+        if self._compile:
+            training.compile_model(model, verbose=self._is_rank_zero)
+
         # We currently have two versions of activation checkpointing in this recipe
         # for testing and BC purposes. ``enable_activation_checkpointing`` controls
         # the older version of AC and this behavior is unchanged

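For context, a minimal standalone sketch of the pattern this change introduces: gate model compilation behind a recipe-level flag and apply it right after the model is instantiated in _setup_model. The helper and class below are hypothetical stand-ins; they only assume that torchtune's training.compile_model ultimately applies torch.compile and that the flag comes from a recipe config key such as `compile: True`. The real utility's internals and config wiring may differ.

import torch
import torch.nn as nn


def compile_model_sketch(model: nn.Module, verbose: bool = False) -> None:
    # Hypothetical stand-in for torchtune's training.compile_model; the real
    # helper may compile layer by layer rather than the whole forward pass.
    if verbose:
        print("Compiling model with torch.compile...")
    model.forward = torch.compile(model.forward)


class RecipeSketch:
    # Mirrors the recipe's pattern: read the compile flag once, then use it
    # in _setup_model right after the model is instantiated.
    def __init__(self, compile_flag: bool, is_rank_zero: bool) -> None:
        self._compile = compile_flag        # e.g. `compile: True` in the YAML config (assumed key)
        self._is_rank_zero = is_rank_zero   # only rank zero logs in the distributed recipe

    def _setup_model(self, model: nn.Module) -> nn.Module:
        if self._compile:
            compile_model_sketch(model, verbose=self._is_rank_zero)
        return model


if __name__ == "__main__":
    model = nn.Sequential(nn.Linear(16, 16), nn.ReLU(), nn.Linear(16, 4))
    recipe = RecipeSketch(compile_flag=True, is_rank_zero=True)
    model = recipe._setup_model(model)
    print(model(torch.randn(2, 16)).shape)

Compiling before the activation-checkpointing and sharding setup keeps the change small: only the guarded two-line call is added, and the rest of _setup_model is untouched.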